Why It’s Time to Stop Losing Data from Your Remote Sites

March 21, 2014

The following post provides an in-depth look at the problems facing the oil & gas industry with respect to data loss during remote data collection. What are the regulatory and financial implications? How do we start to solve this problem?

This post originally appeared on the Kepware Technologies Blog.

The Scenario

You’re the owner of a large compressor station in Longview, Texas. Your central office is in Houston, but the pipeline and compressor station are well outside of town.

It’s very important that you monitor operations at the site. You’re processing over 30 million cubic feet of gas per day, and running 1,000-plus horsepower engines to compress the lines. These engines are controlled and monitored by Programmable Logic Controllers (PLCs), which in turn are monitored by your SCADA system. Your lifeline to the site is a radio link.

The EPA requires you to collect emissions data from the engines every 15 minutes and report on it (40 CFR Part 60, Subpart JJJJ). You also monitor the equipment’s alarm conditions so you can address issues proactively and stay ahead of maintenance.

A large storm is headed your way and you get a bad feeling. Sure enough, the storm knocks out radio communications with the site for a few hours.

What happened?

You’ve just experienced data loss. It may not seem like a big deal until the EPA asks for their data and you don’t have it.

Data loss is a common problem in SCADA. Connection media like radio, satellite, 3G wireless, and even wired ISP lines are inherently unreliable. It’s not a question of if you will lose connectivity, but when.

How did we get here?

We had to choose between keeping boots on the ground and monitoring remotely. Easy decision: monitor remotely. The simplest design is to have the SCADA system poll the device directly, which is what we did. As a result, when connectivity is lost, data is lost.
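The failure mode above can be sketched in a few lines (the tag name, values, and outage pattern are hypothetical, purely for illustration): with direct polling, a poll that fails during an outage is never captured anywhere, so the gap in the history is permanent.

```python
import random

def poll_device(link_up):
    """Simulate a direct SCADA poll of a remote PLC.
    Returns a reading, or None if the link is down."""
    if not link_up:
        return None  # poll times out; nothing is stored at the site
    return {"tag": "engine1.emissions", "value": random.uniform(40.0, 60.0)}

# Simulate 8 poll cycles with a radio outage in the middle.
link_states = [True, True, False, False, False, True, True, True]
history = []
for cycle, up in enumerate(link_states):
    reading = poll_device(up)
    if reading is not None:
        history.append((cycle, reading["value"]))
    # When the link is down, the sample is never captured anywhere:
    # the gap in `history` is permanent.

print(f"collected {len(history)} of {len(link_states)} samples")
# Cycles 2-4 are missing and cannot be recovered after the fact.
```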

At first, data loss was tolerable: some data was better than none at all. But as automation has picked up and we strive to be more efficient, we have become more reliant on the data. Today, data loss is unacceptable, and in the EPA case above it is also a compliance violation, and therefore expensive.

Another, more recent factor is the use of PLCs at remote sites as automation there becomes more complex. PLCs are well suited to automation, but they were not designed for SCADA: their protocols and feature sets don’t handle data loss well.

So what do we do today?

There are solutions in the market for the data loss problem today, but none are ideal. For example, you can modify a PLC to store process data, but this is custom and hard to maintain. PLC memory is also limited.

Protocols like DNP3 support event buffering. This is a good solution, but it runs into trouble when you need to monitor a few hundred points on a device with limited memory. DNP3 is also not as widely adopted as other protocols, and some SCADA systems don’t support it.

Enterprise historian products have collectors at the site with store-and-forward capability. These tools are good at pushing data to the vendor’s historian, but that’s about it. From there, getting the data back out can be pricey.

What does the future look like?

I would argue that in the future, data collection and storage will be pushed out to the remote site, for two reasons. First, sites are growing rapidly in complexity and in the volume of data they collect; they are becoming mini-SCADA systems, or sub-systems, in their own right. Second, if we can’t rely on the connectivity, we won’t.

This mini-SCADA at the site will collect and store data. The central SCADA will then poll for real-time or historical data, or have this data pushed to it. Even better, the mini-SCADA will leverage open standards, allowing any central SCADA to get data from it.
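The collect-and-store side of this idea can be sketched with a simple store-and-forward buffer. This is a minimal, hypothetical illustration: the class name, tag names, and in-memory buffer are assumptions, and a real site collector would persist to disk and push over an open standard such as OPC UA or MQTT rather than call a local function.

```python
import time
from collections import deque

class SiteCollector:
    """Hypothetical sketch of a store-and-forward collector at the
    remote site: readings are buffered locally and forwarded to the
    central SCADA only when the link is up, so an outage delays data
    instead of losing it."""

    def __init__(self, send_func, max_buffer=100_000):
        # Bounded local store: the oldest samples drop only if an
        # outage outlasts the buffer capacity.
        self.buffer = deque(maxlen=max_buffer)
        self.send = send_func  # push one sample to the central SCADA

    def record(self, tag, value, timestamp=None):
        self.buffer.append((timestamp or time.time(), tag, value))

    def forward(self, link_up):
        """Drain the buffer if the link is up; otherwise keep buffering."""
        sent = 0
        while link_up and self.buffer:
            sample = self.buffer[0]    # peek first, so a failed send
            if not self.send(sample):  # leaves the sample buffered
                break
            self.buffer.popleft()
            sent += 1
        return sent

# Usage: the central store is just a list here for illustration.
central = []
collector = SiteCollector(send_func=lambda s: central.append(s) or True)

for i in range(6):
    collector.record("engine1.emissions", 50.0 + i)
    collector.forward(link_up=(i >= 3))  # link down for the first 3 cycles

print(len(central))  # all 6 samples eventually arrive, in order
```

The key design point is that the send is acknowledged before the sample is removed from the buffer, so a mid-transfer failure never loses data.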

Not only will this solve the data loss problem, it has other benefits as well. For example, pushing historical data to the office can be scheduled and optimized, reducing communication costs. The communications layer could use modern security, and the central SCADA would be simpler because it would no longer need to support multiple protocols (e.g., Modbus, DNP3, DF1). Push technology could also make remote sites more secure by reducing the attack surface exposed to remote attackers.

We rely on things that are reliable. Once you can guarantee data collection, the data can be relied upon to automate the business. This extends beyond oil & gas to other industries that rely on SCADA, like Water & Wastewater and Electric. Does it make sense for all remote sites to be treated like mini-SCADA systems? No, but for critical sites, guaranteed data collection will soon be the norm, not the exception.

Aron Semle is a former software engineer and now a product manager at Kepware Technologies in Portland, Maine. He’s passionate about using new technology to solve modern day problems in data collection. Follow the Kepware Technologies blog at info.kepware.com/blog.
