Angelo Hulshout has the ambition to bring the benefits of production agility to the market and has set up a new business around that. How does he get the input data needed to meet the needs of factory management?
In 1798, Samuel Taylor Coleridge wrote “The Rime of the Ancient Mariner.” It’s the story of a mariner stranded alone on a ship, doomed after he killed an albatross. At some point, he gets thirsty, and although he’s surrounded by water, none of it is drinkable because the sea is salt.
In making production facilities better and more intelligent, we face a similar challenge. Production lines, and the software and processes controlling them, are a potential ocean of data – data we can use to open up the path to improvement. However, while it’s certainly possible to obtain all that data and store it in a digital environment, having the ocean at our fingertips doesn’t necessarily make it useful.
Luckily, we don’t need to drink the whole ocean – provided we know what we’re looking for. If we know what the factory management is after, we can derive the information they need.
To identify the sources of data we’re going to work with, we need to have an understanding of how a factory is set up: we need to know the specifics of the production lines, the warehouses, the processes around manufacturing and logistics, and the software used to support all of it. This gives a mix of places where we may find the data we need – as determined by the goals we’ve set. Are we going to look at production line efficiency, factory bottlenecks or perhaps energy consumption?
Depending on the needs, we use different data sources. For planning and tracking purposes, it makes sense to use reporting data that can be obtained from manufacturing execution (MES) and enterprise resource planning (ERP) systems, for example; for information about the production line, we can use data coming from controllers (PLCs) and the manufacturing equipment itself.
Getting the data out of these systems involves multiple technical and (partly) non-technical solutions. Standardization in this area isn’t in place yet, although existing industry communication standards like OPC-UA and MQTT help, as do (somewhat older) production modeling standards like ISA-95 and ISA-88. For the data collection itself, the white paper “Big data challenges in smart manufacturing industry,” by the Big Data Value Association (BDV), discusses some standardizations that are pending or being started.
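To illustrate how these standards can complement each other: MQTT transports named messages over topics, while ISA-95 describes an equipment hierarchy that can be used to name those topics consistently. Below is a minimal Python sketch of that idea; the hierarchy levels and topic layout are my own simplified assumptions, and the enterprise and equipment names are invented for illustration.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class EquipmentNode:
    """One node in a simplified ISA-95 style equipment hierarchy
    (enterprise > site > area > line > work cell)."""
    enterprise: str
    site: str
    area: str
    line: str
    cell: str

    def topic(self, measurement: str) -> str:
        """Build an MQTT topic path for a measurement on this node."""
        return "/".join([self.enterprise, self.site, self.area,
                         self.line, self.cell, measurement])


# Hypothetical example: an oven on a production line.
oven = EquipmentNode("acme", "eindhoven", "bakery", "line1", "oven3")
print(oven.topic("temperature"))
# acme/eindhoven/bakery/line1/oven3/temperature
```

With a naming scheme like this, an analytics layer can subscribe to a whole area or line at once using MQTT topic wildcards, without knowing every individual device in advance.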
To explain what we can do, I rely on a simple factory layering model. It’s not fundamentally different from the reference architecture presented in the BDV white paper, just a bit simpler.
At the lowest level, we have the actual production equipment. If the equipment has built-in controllers, it can often be connected to a network to allow control and data extraction. Equipment without a built-in controller is often, but not always, controlled through external controllers or PLCs, which allow us to obtain data through their I/O interfaces. At the MES and ERP layers, where the production planning and logistics take place, we can use existing reporting interfaces.
Thus, data will find its way to the data and analytics layer we put on top of the factory. This layer may cover data analysis for one factory or multiple and can be either local or cloud based. It’s home to features like dashboards, digital twins (which can be fed with real production data to perform replays) and optionally also machine learning algorithms that can propose changes to the production process.
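To make the replay idea concrete: a digital twin replay can be as simple as feeding recorded, timestamped production events back to the twin in time order. A minimal sketch in Python, with invented field names; pacing (sleeping between events to mimic real time) is deliberately left out.

```python
from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class Event:
    timestamp: float  # seconds since the start of the recorded run
    source: str       # a PLC tag or sensor id (hypothetical names)
    value: float


def replay(events: Iterable[Event], apply: Callable[[Event], None]) -> None:
    """Feed recorded production events to a digital twin's input
    callback in time order, regardless of the order they were stored in."""
    for ev in sorted(events, key=lambda e: e.timestamp):
        apply(ev)
```

The same event stream can later be fed to a machine learning pipeline instead of a twin; only the `apply` callback changes.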
Standardization or not, in an existing factory, the data will often have to be obtained in a factory-specific way. For a lot of manufacturing SMEs, the fully ‘connected factory’ envisioned when talking about smart industry is non-existent. The equipment used in production is often not of the ‘connectable’ type and sometimes doesn’t even have the possibility of digital control.
So, although there are mechanisms for obtaining data, we usually need to go a step deeper to get it all out. This may involve making changes to the software running on embedded controllers or PLCs to make the data available. When this is not allowed, or not possible, we have to resort to other, non-intrusive solutions. We can think of having operators collect data, eg in a spreadsheet or on a notepad. This helps in collecting initial insights, but in the long run, it’s not a good approach to create a closed loop for production improvement.
The most reliable way to implement non-intrusive data collection is through additional sensors. These are positioned in places where they can do their job without disturbing the production flow, they’re powered through a separate power network and they communicate over a dedicated communication network. This sounds expensive, but taking into account the risks of changing existing software and equipment, the cost is often acceptable.
All the collected data, whether it comes from the sensors, directly from the equipment or from PLCs and other controllers, is bundled by an industrial internet of things (IIoT) gateway and sent to the data storage we find in the data and analytics layer.
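The bundling part of such a gateway can be as simple as a buffer that timestamps incoming readings from all sources and periodically uploads them as one payload. A minimal Python sketch, assuming JSON as the payload format; the source and field names are invented for illustration, and the actual transport (MQTT, HTTP, …) is out of scope here.

```python
import json
import time


class GatewayBuffer:
    """Collects readings from multiple sources (sensors, PLCs, equipment)
    and bundles them into one JSON payload for the analytics layer."""

    def __init__(self):
        self._readings = []

    def add(self, source: str, name: str, value: float) -> None:
        """Record one reading with a local timestamp."""
        self._readings.append({
            "source": source,
            "name": name,
            "value": value,
            "ts": time.time(),
        })

    def flush(self) -> str:
        """Return all buffered readings as one JSON payload and reset."""
        payload = json.dumps({"readings": self._readings})
        self._readings = []
        return payload
```

A real gateway would flush on a timer or when the buffer fills up, and would handle upload failures; the point here is only that heterogeneous sources end up in one uniform stream.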
The BDV white paper focuses on the collection of data from a so-called big-data perspective: big amounts of data are gathered from many sources and complex algorithms are used to get information from that data. This approach is often used in market analysis, behavioral analysis and other areas where correlating different events may lead to new insights.
It can also be applied in manufacturing. For bigger factories with a higher level of digitalization, it may indeed be the best approach. But for the SMEs we work with, it may be a bit too much to collect data from every single source in a factory. Every sensor, I/O connection, plan or report could be logged for that purpose. This would lead to a lot of data, given that most sensors and I/O connections provide new values once per second or even more frequently, and not all of it will directly correspond to the questions of the factory management.
We can filter out what’s necessary, but I would advise a more selective data acquisition approach: look at what data can be obtained, especially at the lower levels, with the underlying questions in mind. This leads to less data being gathered, making analysis easier. The risk that a specific data item isn’t available can be mitigated by following the non-intrusive approach to data collection: it’s always possible to have a dedicated sensor installed to get the missing item or have an operator (temporarily) collect the information by hand.
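One common way to make such selective acquisition concrete is deadband filtering: a value from a once-per-second signal is only forwarded when it differs meaningfully from the last forwarded value. A small Python sketch; the deadband threshold is an assumption that in practice would follow from the underlying questions.

```python
def on_change(samples, deadband=0.0):
    """Forward a sample only when it differs from the last forwarded
    value by more than the deadband, cutting a once-per-second stream
    down to the changes that matter for analysis."""
    last = None
    out = []
    for value in samples:
        if last is None or abs(value - last) > deadband:
            out.append(value)
            last = value
    return out


# A hypothetical temperature signal that mostly sits still:
print(on_change([180.0, 180.0, 180.1, 195.0], deadband=0.5))
# [180.0, 195.0]
```

Applied at the gateway or controller level, a filter like this keeps the data volume proportional to what actually happens on the line, rather than to how often the sensors report.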