Liberating Data for Enterprise Access and Use

Increasingly, data usage is the differentiator — the factor critical to the productivity, reliability, and sustainability of heavy assets.

The challenges common to heavy industries, including aging infrastructure, market volatility, and a wave of retirements among experienced personnel, are compelling organizations to find ways to derive fuller value from their data.

Many companies struggle simply to leverage their operational technology (OT) data. By some estimates, only 5% of OT data is used today, and even less of that remains useful after wrangling, cleansing, and organizing. Gartner estimates that poor data quality alone contributes to an average 30% loss in revenue per business.

Across process-intensive industries, from chemicals and oil and gas to renewables, manufacturing, and mining, the primary barrier to unlocking the value of data is the outdated design of the data historian. Designed as on-premise collection systems, historians are now being stretched to support enterprise use of OT data. Making that data more accessible means paying excessive fees tied to tag counts and the number of users.

Scaling up requires more hardware, and scaling down when a site adjusts production strands capacity. In either case, enterprise demand for data introduces high latency that complicates data collection for plant staff.

Whether the eventual consumers are internal teams or third parties, the initial movement of OT data to the enterprise is slow and expensive.

And because data-backed decisions are now imperative, power users of on-premise automation and data historian systems no longer hold a monopoly on asset-produced information. Real-time and historical data streaming from smart sensors into collection systems matters to a growing range of business functions. All of them need context-rich OT data to make quicker, smarter business decisions, and they need the cloud to provide data-lake scalability to do so.

As on-premise collection systems are opened to company-wide use, organizations face both high data demand and elevated cyber-risk. Multi-vendor access to critical plant systems leaves SCADA and on-premise systems straining to securely process and store data, keeping IT in security limbo. This is a leading reason why a majority of industrial control systems experience a breach each year.

With the rise of cloud computing and industrial connectivity, a new solution has emerged to operationalize real-time data, historical data, and metadata in the cloud. Fusion leverages advances in the cybersecurity, scale, and interoperability of IT infrastructure to distribute access to OT data and analytics for high-value industrial intelligence across the enterprise, with no limits on tags or number of users.
