by Dean Douglas | July 22, 2022 | 4 min read
When we talk to organisations about digital twins, especially those just setting out on their digital twin journey, we are often met with the same misconception: that they will need to attach a sensor of some sort to every asset and piece of equipment they own.
It is important to realise that this is not the case, for a number of reasons. The first is that it portrays the digital twin as an expensive and laborious endeavour for organisations to undertake. As soon as lots of new hardware, and the software to support it, is mentioned, you can already hear the groans about financial constraints. We should not create any more barriers to the adoption of digital twins than already exist.
The second reason we should not think of digital twin development as the mass deployment of sensors is that it feeds the rhetoric that we need to collect mountains of new data to develop and operate a digital twin – something we have spoken about previously. That is not the case! We already collect mountains of data about our built assets. Right now, however, we store that data in a variety of silos and realise only a fraction of its potential to yield insights about our assets. It is through the development and integration of digital twins that we can begin to draw these silos together and derive those insights.
But perhaps the most important reason this misconception is so challenging lies in how we have traditionally deployed sensors. Historically, we have placed a sensor on a piece of equipment and relayed its data to a dashboard – whether analogue or digital – where it is commonly presented as a single reading. This application of sensors relies heavily on the knowledge, experience and ability of the staff reading that data to draw understanding from it and take the necessary actions.
While there is no replacement for long-held industry knowledge and expertise, we must work out how to transfer at least a portion of that knowledge into our systems and our digital twins, making them smarter by building in an understanding of the relationships between the sensors and data sources, and the readings they produce. It is through developing this understanding within our digital twins that we can create dashboards and controls that validate data received through multiple streams, group multiple relevant readings together to assess a single issue, or show the consequences of a fault on other equipment.
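To make that idea concrete, here is a minimal Python sketch of what encoding such relationships might look like: cross-checking overlapping sensor streams and tracing which downstream equipment a fault would affect. All names here (SensorReading, Asset, the pump/heat-exchanger/chiller chain, the tolerance value) are hypothetical illustrations, not part of any specific BIM Academy system.

```python
from dataclasses import dataclass, field

@dataclass
class SensorReading:
    sensor_id: str
    value: float

@dataclass
class Asset:
    """A piece of equipment, its related sensors, and the assets downstream of it."""
    name: str
    sensor_ids: list
    downstream: list = field(default_factory=list)  # assets impacted by a fault here

def validate_readings(readings, tolerance):
    """Cross-check readings that should agree (e.g. two temperature streams
    on the same loop); flag disagreement beyond the given tolerance."""
    values = [r.value for r in readings]
    return max(values) - min(values) <= tolerance

def affected_assets(asset):
    """Walk the downstream relationships to list equipment a fault on this
    asset would impact, rather than leaving that inference to the operator."""
    seen = []
    stack = list(asset.downstream)
    while stack:
        current = stack.pop()
        if current.name not in [a.name for a in seen]:
            seen.append(current)
            stack.extend(current.downstream)
    return [a.name for a in seen]

# Example relationship graph: a pump feeds a heat exchanger, which feeds a chiller.
chiller = Asset("chiller", ["chiller_power"])
heat_exchanger = Asset("heat_exchanger", ["hx_temp"], downstream=[chiller])
pump = Asset("pump", ["pump_flow"], downstream=[heat_exchanger])
```

A dashboard built on a model like this can show an operator not just a single reading, but whether related streams agree and which equipment a detected fault will touch.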
Building these relationships into our systems enables them to better support our operations, alleviating some of the pressure on the experts in our businesses: they can spend less time interpreting information and more time tackling the harder issues our assets face. And by making our systems smarter and more representative of the physical systems they portray, we ease the initial burden of understanding placed on new staff members being introduced to them.
As you go forward on your journey towards a digital twin, remember that we should not perpetuate the traditional one-sensor, one-reading approach to deployment. We should strive to get more from each sensor by building an understanding of their relationships, assimilating long-held industry knowledge and expertise into our systems.
If you are interested in hearing more about the digital twin research we are doing at BIM Academy, or if you want to be involved, get in touch: [email protected].