The City of El Paso Water Utilities maintenance team was looking for a way to streamline data collection for transient pressure events. The utility had a significant investment in legacy transient pressure sensors throughout its water distribution network. The system was comprehensive, but data collection required physically driving to each location and manually downloading the data from each sensor for analysis.
Manual data collection, while it had been the standard for many years, provided no real-time visibility into how transient pressure events were impacting pipeline operations. Because all data was collected after the fact, it was very labour-intensive to gather data from other departments and try to correlate the cause of a transient event. Drilling into the data was also limited to the manufacturer's proprietary software, which restricted what could be done with it.
As we went through this process we faced a number of challenges. The first was interfacing with the legacy sensor: we needed to mirror the commands being sent from the legacy mobile device to the sensor and download the data to our own data collection device.
Secondly, to get useful information out of the system, we needed to analyze the data format and develop a method to structure the data so it could be streamed to the cloud while preventing duplication. The data then needed to be presented in a web interface where each transient event could be viewed, and also sent to a notification system to alert the operations group that an event was occurring.
Finally, as many transient pressure events were the result of a loss of power, we needed to ensure that the sensor, the data collection unit, and data transmission were not impacted by a power outage.
Due to the high cost of replacing the transient pressure sensors, it was preferred that we use the existing sensors already in place. The existing system had no programming interface, so we had to develop our own interface to the sensor. We accomplished this by capturing the network traffic from the mobile device to the sensor and then mirroring those commands from our data collection device.
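In practice, mirroring the captured commands can amount to replaying the recorded bytes over the same transport the mobile device used and collecting the sensor's reply. The sketch below assumes a TCP-connected sensor; the command bytes and protocol shown are purely illustrative, since the actual vendor protocol is proprietary.

```python
import socket

# Hypothetical command bytes captured from the vendor app's traffic.
# The real proprietary protocol differs; these values are illustrative only.
CAPTURED_DOWNLOAD_CMD = bytes.fromhex("02 44 4c 03")

def download_sensor_data(host: str, port: int, timeout: float = 5.0) -> bytes:
    """Replay a captured download command to the sensor and read its reply
    until the sensor closes the connection."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(CAPTURED_DOWNLOAD_CMD)
        chunks = []
        while True:
            chunk = sock.recv(4096)
            if not chunk:  # sensor closed the connection: transfer complete
                break
            chunks.append(chunk)
        return b"".join(chunks)
```

Because the replayed bytes are opaque to the collection device, the approach depends on the sensor accepting identical commands from any client on the network, which held for this legacy system.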
The data was stored in a proprietary file format designed specifically for the manufacturer's own use. As part of the process we needed to reverse engineer that format and then restructure the data so it could be streamed to the cloud. We also needed to develop a system to correlate the data coming from the sensor with the data already streamed to the cloud to prevent duplication.
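One common way to prevent duplication in a pipeline like this is to fingerprint each parsed reading and skip anything already streamed. A minimal sketch follows; the record schema (`sensor_id`, `timestamp_ms`, `psi`) is a hypothetical illustration, not the manufacturer's actual format.

```python
import hashlib
import json

def reading_key(record: dict) -> str:
    """Stable fingerprint for one reading: sensor, timestamp, and value.
    The field names are a hypothetical schema for illustration."""
    payload = json.dumps(
        [record["sensor_id"], record["timestamp_ms"], record["psi"]],
        separators=(",", ":"),
    )
    return hashlib.sha256(payload.encode()).hexdigest()

def dedupe(records, seen=None):
    """Yield only records not previously streamed; `seen` is the set of
    fingerprints persisted across downloads from the same sensor."""
    if seen is None:
        seen = set()
    for record in records:
        key = reading_key(record)
        if key not in seen:
            seen.add(key)
            yield record
```

Persisting the `seen` set (or the fingerprint of the last streamed reading) on the collection device lets repeated downloads of overlapping sensor memory stream each reading to the cloud exactly once.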
The existing system software presented an entire month's worth of transient data at once, which required a lot of manual effort to drill in and review each transient event. To streamline this, we intercepted the data and added a transient pressure event ID to it during processing. This allowed us to create a simplified dashboard that presented each transient event individually, eliminating the manual effort. The challenge was ensuring data integrity was maintained throughout the process.
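Event IDs can be assigned by splitting the time-sorted readings at quiet gaps: consecutive readings separated by less than some threshold belong to the same transient event. A minimal sketch, assuming a hypothetical reading schema and an illustrative gap threshold:

```python
def tag_events(readings, gap_ms=2000):
    """Assign an event ID to each reading in a time-sorted stream.
    A new event begins after a quiet gap longer than `gap_ms`.
    The `timestamp_ms` field and the 2-second gap are illustrative."""
    event_id = 0
    last_ts = None
    for reading in readings:
        if last_ts is None or reading["timestamp_ms"] - last_ts > gap_ms:
            event_id += 1  # gap detected: start a new transient event
        last_ts = reading["timestamp_ms"]
        yield {**reading, "event_id": event_id}
```

With every reading carrying an event ID, the dashboard can group and list discrete events instead of forcing the operator to scan a month of raw samples.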
During regular operation the sensor samples the pressure at one-second intervals. When it detects that a transient event is occurring, the sensor can take up to one thousand readings per second. Some transient events could last up to a minute, which means over sixty thousand readings could be recorded in a single minute. Streaming this high volume of information while also actioning events exposed numerous bottlenecks in the process.
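At a thousand readings per second, sending each reading to the cloud individually is a typical bottleneck; grouping readings into fixed-size uploads is one simple mitigation. A minimal sketch (the batch size of 500 is illustrative, not a figure from the deployed system):

```python
def batch(readings, size=500):
    """Group an iterable of readings into fixed-size batches so each cloud
    upload carries many readings instead of one."""
    buf = []
    for reading in readings:
        buf.append(reading)
        if len(buf) >= size:
            yield buf
            buf = []
    if buf:  # flush any partial final batch
        yield buf
```

A burst of sixty thousand readings then becomes on the order of a hundred uploads rather than sixty thousand, which also keeps the notification path responsive while a transient is being streamed.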
While we were a bit skeptical that interfacing to the existing system could be done, Lexcom was able to deliver a prototype of this solution to us within a few months of it being conceptualized. Even after the solution was delivered they continued to come up with new ideas to add value to the system. - Ed Fierro