If your OT data is bad, then your business data will be worse
Fujitsu / March 28, 2022
To meet new social demands, such as greater sustainability, manufacturers need better data about their operations, says Jouko Koskinen, Chief Technology Officer, Private Sector at Fujitsu Finland. However, for all the talk in recent years about IT/OT integration, there is still a long way to go before most factories can extract data from the shop floor that is good enough for real business insights.
Not as digital as we thought?
I visit a lot of factories. I recently went to one that was not the most modern, but it had benefited from substantial automation investment. As is often the case, the factory managers and owners believed their technology conformed to modern best practices.
However, the digitalization capabilities in scenarios like this are often quite basic compared to what is now possible. In that particular factory, no automation processes were visible to cloud applications, nor was the valuable data that could potentially flow from them. That was just one of many lost opportunities. There was plenty of real-time data, but it was all trapped at the shop-floor level: as soon as data emerged, it was overwritten by new data. There was no means of storing a data history, let alone leveraging one.
This wasn’t specific to that factory. It’s a situation I see again and again. Until very recently, factories were not typically built to handle real-time data, support use case-driven data models and data warehousing, or scale modularly for agile operations. On top of that, there are usually challenges in knowing precisely what assets are in the factory. Then there’s the lifecycle management of those assets, the consolidation and harmonization of data involving multiple protocols, and transferring data from the edge to the cloud. And then there’s cyber security, which underpins just about everything.
Digitalization capabilities in most scenarios are quite basic
Digitalization always needs a business case
But I am getting ahead of myself. Fixing a digitalization deficit is not the right starting point.
My anecdote describes a factory with missing capabilities. But there’s no point undergoing digitalization just to score points in your peer group: there has to be a business case.
In the example above, management was aware it had an incomplete picture of factory operations. Other than slowing or stopping the production lines, there was no way of responding to situations emerging in real time – because the data describing the issue disappeared as soon as it appeared.
There was no way of interrogating historical data to identify positive trends to be reinforced or patterns that could predict challenges. Any KPIs set by management would always be open to debate because they were modeled on suspect data and tracked against equally questionable data. And there was no way of integrating the data, either into other management tools and applications or into a broader ecosystem where partners, suppliers, and customers could propose higher-value opportunities.
Production-critical data itself cannot go to the cloud until the shop-floor actions that depend on it have been completed. After that, however, shop-floor data can be harmonized so that it is accessible to any application, then transferred to the cloud for visualization and historical scrutiny. That opens up new possibilities: a whole range of business goals becomes achievable, such as driving greater efficiency as part of sustainability targets.
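The buffer-then-release pattern described above can be sketched in a few lines. This is a hypothetical illustration, not a Fujitsu product or a real historian: the class, tag names, and asset identifiers are all invented. Readings are held at the edge until the dependent shop-floor action completes, then harmonized into one schema and batched for the cloud.

```python
import json
import time
from collections import deque

class EdgeBuffer:
    """Hypothetical edge-side buffer: hold readings locally until the
    shop-floor action that depends on them is done, then release a
    harmonized batch for upload to a cloud historian."""

    def __init__(self):
        self._pending = deque()   # readings still needed on the shop floor
        self._released = []       # readings cleared for cloud transfer

    def record(self, source, tag, value, unit):
        """Store a raw reading; nothing leaves the edge yet."""
        self._pending.append({
            "source": source, "tag": tag,
            "value": value, "unit": unit,
            "ts": time.time(),
        })

    def complete_action(self):
        """Shop-floor action done: harmonize and release buffered data."""
        while self._pending:
            raw = self._pending.popleft()
            self._released.append({
                # one shared schema, whatever the line supplier's format was
                "asset": f"{raw['source']}/{raw['tag']}",
                "value": raw["value"],
                "unit": raw["unit"],
                "timestamp": raw["ts"],
            })

    def export_batch(self):
        """Return a JSON batch ready for upload; clears the released list."""
        batch, self._released = self._released, []
        return json.dumps(batch)

buf = EdgeBuffer()
buf.record("line-3/plc-7", "valve_position", 42.5, "percent")
buf.complete_action()
payload = buf.export_batch()
```

In a real deployment the buffer would be persistent and the export would go to a message broker or cloud endpoint, but the ordering constraint is the same: the shop floor consumes first, the cloud sees the data afterwards.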
Fixing a digitalization deficit is not the right starting point
Does your architecture support the business use case?
The business use case is always the starting point. If there is one that you believe justifies the investment, choosing software and cloud services is the easy part. With AWS, Azure, and open-source libraries, plus applications from SAP, ServiceNow, and Microsoft, it’s a case of “take your pick”.
But what’s often less clear is whether the existing enterprise architecture will support that use case. In Fujitsu’s opinion, everything starts at the factory floor. If your OT data is bad, then the business data in your IT systems will be even worse.
Many data issues might need resolving. Is shared data required, and if so, is it available? What integration or sensors are going to be critical? Is the existing technology base still supported? And can the data it generates be harmonized for cloud migration or machine learning models, to take just two possibilities?
It’s also vital to check whether you have, or can obtain, the right resources: the capabilities and competencies to operate, support, and maintain the new architecture.
However, the most challenging part of this equation, in my experience, is reaching a consensus that the status quo needs to change. Budget is one factor: many people understandably prefer the approach of “if it isn’t broken, don’t fix it”. Culture is another. Factory OT and IT have operated in two very different worlds until now, with different skill sets, risk appetites, refresh cycles, even different generations.
That is also changing. For many reasons, the boundaries between OT and IT are dissolving. Convergence is driven by a long list of factors — heightened global competition, digitalization, disintermediation, generation change, mass-customization, to name a few. As a result, factories are opening up to the digital world.
The business use case is always the starting point
No cybersecurity, no digitalization
That comes with risks. Because most factories are built on the ISA-95 model, and in the past nobody needed to pay attention to cybersecurity or to dataflows within the factory and between equipment, there is now a significant security and data “debt”.
In terms of data flows, fixing the connectivity challenge is the low-hanging fruit. There are likely to be many data sources across multiple OT systems where the data is not harmonized. Typically, factories have several different automation line suppliers, and they all use different protocols. The solution is to harmonize via OPC UA (Open Platform Communications Unified Architecture). This makes it possible to see, for example, valve parameter values from China or Finland in one and the same format, leading to more reliable process reporting and meaningful KPIs.
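To make the harmonization idea concrete, here is a minimal sketch. It is not a real OPC UA stack; the two supplier message formats, the node-id strings, and the site data are invented for illustration. Two line suppliers report the same kind of valve reading in different shapes and units, and an adapter per supplier maps both into one canonical, OPC UA-style record.

```python
# Canonical engineering unit for all harmonized pressure readings.
CANONICAL_UNIT = "kPa"

def from_supplier_a(msg):
    """Adapter for hypothetical Supplier A: flat dict, pressure in bar."""
    return {
        "nodeId": f"ns=2;s=Plant.{msg['line']}.{msg['valve']}.Pressure",
        "value": msg["pressure_bar"] * 100.0,  # convert bar -> kPa
        "unit": CANONICAL_UNIT,
        "timestamp": msg["ts"],
    }

def from_supplier_b(msg):
    """Adapter for hypothetical Supplier B: nested dict, already in kPa."""
    return {
        "nodeId": f"ns=2;s=Plant.{msg['asset']['line']}.{msg['asset']['id']}.Pressure",
        "value": msg["reading"]["value"],
        "unit": CANONICAL_UNIT,
        "timestamp": msg["reading"]["ts"],
    }

# A valve in Finland (Supplier A format) and one in China (Supplier B format):
helsinki = from_supplier_a({"line": "FI-Line1", "valve": "V17",
                            "pressure_bar": 3.2, "ts": "2022-03-01T09:00:00Z"})
shanghai = from_supplier_b({"asset": {"line": "CN-Line4", "id": "V08"},
                            "reading": {"value": 305.0,
                                        "ts": "2022-03-01T09:00:00Z"}})

# Both sites now report in the same shape and the same unit.
assert helsinki["unit"] == shanghai["unit"] == "kPa"
```

The point is the design, not the code: one adapter per supplier protocol, one canonical schema downstream, so every application and KPI compares like with like.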
However, data connectivity raises cybersecurity issues, particularly for systems outside the factory. OT has traditionally been hermetically sealed off from IT networks and was therefore less exposed to cyberattacks such as ransomware. After several high-profile cases, such as Stuxnet and the Colonial Pipeline attack, the extent of that risk is now clear. Our approach at Fujitsu is to separate IT and OT in the cybersecurity context but combine them when it comes to dataflows.
The issue is real and must be addressed. If a bad actor gains access to your Industrial Control System (ICS) PERA level 1, they have an open door to all your equipment, and production can be stopped or slowed down. My colleague Graeme Wright has been blogging about OT security recently and goes into more depth about that here.
The bottom line: no cybersecurity means no digitalization. But 100% safety is a mirage. There is a trade-off between what digitalization enables and the risks it creates. This is one of the critical decisions factory management has to take in today’s world. Very few conclude that remaining isolated is still a viable option.
In my next blog, I’ll zoom in on what an enterprise architecture should look like in a modern factory environment, and how you can build one from where you are today rather than starting from scratch.
Bottom line is no cybersecurity means no digitalization
Jouko has consulted for customers on digital solutions and transformation, Digital Factory, CRM, ERP, and SCM. He is a trained business process consultant, certified in SAFe Agile, and his specialties in Smart Manufacturing include operational technology cybersecurity.