CSV Data Dump for a 31-Day Model Run
The target company's hydraulic modelling package is Innovyze InfoWorks™. This product enables third-party integration through APIs and Ruby scripts when the ICM Exchange service is enabled. The research therefore examined opportunities to exploit scripting in order to run the chosen optimisation strategy.
The first approach investigated the use of a CS-Script tool that would export the results tables directly from the Innovyze InfoWorks™ environment into CSV-format workbooks. From there the data could be inspected, and mathematical tooling applied to optimise the pump start parameters before returning these to the model and rerunning it.
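The post-export step above can be sketched as follows. This is a minimal illustration only: the file content, column names, and the `load_result_matrix` helper are hypothetical, standing in for the real results tables produced by the CS-Script export tool.

```python
import csv
import io

def load_result_matrix(csv_text):
    """Parse an exported results table into a header plus float rows.

    Hypothetical helper: the real export's column layout is defined by
    the CS-Script tool, not by this sketch.
    """
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)                              # field names
    rows = [[float(v) for v in row] for row in reader] # numeric matrix
    return header, rows

# Stand-in two-column export for demonstration.
sample = "time_s,pump_flow\n0,0.0\n5,1.2\n10,1.1\n"
header, rows = load_result_matrix(sample)
print(header)     # ['time_s', 'pump_flow']
print(len(rows))  # 3
```

Once parsed into plain numeric arrays, the optimisation calculations could in principle be applied outside the Excel front end.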
Note that the computational resource obtained for the research to deploy the modelling and analysis tools comprised the following specification.
Hardware
- Dell PowerEdge R720
- Intel Xeon Processor E5-2600 v2
- 2x Processor Sockets
- 32 GB random access memory (RAM) – 1866 MT/s
Virtual Machine
- Hosted on VMware Hypervisor v6.0.
- Windows Server 2012R2.
- Microsoft Excel 64bit.
- 16 virtual central processing units (vCPUs).
- Full provision of 32 GB RAM – 1866 MT/s.
Issues

The first round of data exports highlighted problems: even with a dedicated server offering 16 vCPUs and the specification shown above, the Excel front-end environment was unable to process the very large data matrices being generated. The Excel executable failed regularly, leading to an overall inability to inspect the data, let alone run calculations on the matrices. For the five-second sample over 31 days, this resulted in matrices in the order of [44 × 535,682] per model run, with the calculations in (14–19) needing to be applied on a per-cell basis.