Liberating Data for Enterprise Access and Use

Increasingly, data usage is the differentiator — the factor critical to the productivity, reliability, and sustainability of heavy assets.

Challenges common to heavy industries, including aging infrastructure, market volatility, and a wave of retirements among experienced personnel, are compelling organizations to look for ways to derive fuller value from their data.

Many companies struggle simply to leverage their operational technology (OT) data. By some estimates, only 5% of OT data is used today, and even less of that remains useful after wrangling, cleansing, and organizing. Gartner estimates that poor data quality alone contributes to an average 30% loss in revenue per business.

In a variety of process-intensive industries, from chemicals and oil and gas to renewables, manufacturing, and mining, the primary barrier to uncovering the value of data is the outdated design of the data historian. Intended as on-premise collection systems, historians are now being stretched to enable enterprise use of OT data, and making that data more accessible means paying excessive fees based on tag counts and the number of users.

More hardware is needed to scale up, and less when a site scales production back. In either case, high latency driven by enterprise data demand complicates data collection for plant staff.

Whether internal teams or third parties eventually consume that data, the initial movement of OT data to the enterprise is slow and expensive.

And because data-backed decisions are now imperative, power users of on-premise automation and data historian systems no longer have a monopoly on asset-produced information. Real-time and historical data streaming from smart sensors into collection systems matter to a widening range of business functions, all of which need context-rich OT data to make quicker, smarter business decisions, and the scalability of a cloud data lake to do so.

With company-wide use of on-premise collection systems, organizations have run into challenges with high data demand and cyber risk. Multi-vendor access to critical plant systems leaves SCADA and on-premise systems struggling to process and store data securely, keeping IT in security limbo. It is a leading reason why a majority of industrial control systems experience a breach each year.

With the rise of cloud computing and industrial connectivity, a new solution has emerged to operationalize real-time data, historical data, and metadata in the cloud. Fusion leverages the advances in cybersecurity, scale, and interoperability of IT infrastructure to distribute access to OT data and analytics for high-value industrial intelligence across the enterprise, with no limits on tags or number of users.
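To make the pattern concrete, the minimal sketch below shows roughly what it can look like to forward a batch of tag readings from a site to a cloud ingestion endpoint, so that enterprise users query the cloud copy rather than the plant historian. The endpoint URL, token handling, and payload shape are illustrative assumptions for this sketch, not Fusion's actual interface.

```python
# Minimal sketch (illustrative only, not Fusion's actual API): forward a batch
# of OT tag readings from a site to a hypothetical cloud ingestion endpoint.
from datetime import datetime, timezone

import requests

# Hypothetical endpoint and credential; a real deployment would use the
# vendor's documented ingestion service and secure credential storage.
INGEST_URL = "https://example-cloud-hub.invalid/api/v1/ingest"
API_TOKEN = "replace-with-site-credential"


def publish_readings(site: str, readings: list[dict]) -> None:
    """Send time-stamped tag readings (tag, value, unit) to the cloud hub."""
    payload = {
        "site": site,
        "sent_at": datetime.now(timezone.utc).isoformat(),
        "readings": readings,
    }
    response = requests.post(
        INGEST_URL,
        json=payload,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    # Raise an error if the cloud hub rejected the batch.
    response.raise_for_status()


if __name__ == "__main__":
    publish_readings(
        site="plant-01",
        readings=[
            {"tag": "FIC-101.PV", "value": 42.7, "unit": "m3/h"},
            {"tag": "TI-204.PV", "value": 88.3, "unit": "degC"},
        ],
    )
```

Once readings land in the cloud, enterprise users and analytics tools can work against that scalable copy without adding query load, latency, or licensing cost to the on-premise historian.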
