Anto Budiardjo, CEO of Padi, and Doug Migliori, Global Field CTO of Cloud Blue, describe the digital twin system interoperability approach taken by the Digital Twin Consortium. The presenters are the Chairs of the DTC Interoperability Tiger Team.
By Anto Budiardjo - CEO, Padi, Inc. and Doug Migliori - Global Field CTO, CloudBlue
Interoperability is a core concept of computer systems and networks, denoting the ability to discover, connect, and interact with other entities within an application’s broader context. In today’s distributed computing paradigm, efficiently achieving interoperability at all levels of the technology stack is paramount to deriving the most benefit from a system of systems.
For decades, interoperability has focused on making discrete components work in conjunction with one another. The Internet is perhaps the best example of billions of devices interoperating at technical and syntactic levels in a truly distributed fashion. At a smaller scale, the dynamic discoverability and capabilities matching of a simple USB device is another example of the value created through a common interoperability mechanism. The ability to instantly use a device connected via USB with our laptops is an impressive feat of technology that we frequently take for granted.
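The USB analogy can be made concrete with a toy sketch of discovery and capability matching. All class and function names below are illustrative assumptions, not a real USB stack: a host inspects the capabilities a newly attached device advertises and binds the first driver whose requirements it satisfies.

```python
# Toy illustration of discovery and capability matching, loosely
# modeled on how a USB host binds drivers to attached devices.
# Every name here is hypothetical, not a real USB API.

class Device:
    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = set(capabilities)

class Driver:
    def __init__(self, name, required):
        self.name = name
        self.required = set(required)

    def matches(self, device):
        # A driver binds when the device advertises every
        # capability the driver needs.
        return self.required <= device.capabilities

def bind(device, drivers):
    """Return the name of the first driver the device satisfies."""
    for driver in drivers:
        if driver.matches(device):
            return driver.name
    return None

drivers = [
    Driver("mass-storage", {"bulk-transfer", "block-io"}),
    Driver("keyboard", {"interrupt-transfer", "hid"}),
]
flash_drive = Device("flash-drive", {"bulk-transfer", "block-io", "removable"})
print(bind(flash_drive, drivers))  # mass-storage
```

The point of the sketch is that neither side needs prior knowledge of the other: agreement on a common capability vocabulary is what makes plug-and-play interoperability possible.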
As we discover new applications of digital twin systems for the betterment of business and society, we become increasingly aware of the importance of interoperability. Ensuring that these systems' discrete components and the broader system of systems are interoperable is essential to unlocking their potential with less implementation cost, less risk of failure, and less complexity at scale. In many ways, we strive to create a framework that would enable USB-type compatibility and ease for all systems connected to the Internet and private networks. Developing such a framework is daunting as most systems perform specific tasks and do not inherently interoperate with outside entities. System integrators typically handle such tasks.
The labor-intensive work performed by the $400B+ global system integration industry is often unnecessary. We argue that this burden can be eased by designing systems around a common framework and utilizing common mechanisms to interoperate, much as USB devices do. Doing so empowers those working in system integration to direct their effort where it adds the most value: designing applications that perform as intended, rather than wiring up point-to-point integrations.
The Digital Twin System Interoperability Framework white paper provides the framework for such activity, delivering on the authors’ aim to characterize the multiple facets of system interoperability. Our descriptions have been distilled into seven key concepts framing the design considerations necessary to make systems interoperate at scale. While the authors may not have contemplated all permutations of system interoperability, evaluating a digital twin perspective within the Digital Twin Consortium has provided the breadth and depth of scope necessary to address this paper’s objectives.
We have created a framework capable of unlocking significant value in complex distributed computing systems such as digital twins. As we invite you to review, challenge, refine, and adopt this framework, we hope it proves helpful in designing computing systems that improve our lives.
For more information about the framework, download it here or watch the webinar on-demand: How System Interoperability Empowers Digital Twins.
In this short video, the Co-Chairs of the Digital Twin Consortium's Infrastructure Working Group explain the design and intent of the recently published first draft of their Infrastructure Maturity Model.
The term smart building has always seemed a little odd to me. How can “a structure with a roof and walls,” the dictionary definition of a building made of steel and concrete, be smart? Surely, the word brainless, an antonym for smart, would be a better word to describe such structures!
In 1883, Warren Johnson, a Wisconsin teacher and later founder of Johnson Controls, patented an invention where a thermostat rang a bell when the temperature in a building was low, informing janitors to shovel coal into a furnace to increase the building temperature. A hundred years later, electronics took the reins with the introduction of Direct Digital Controls (DDCs). More recently, the mighty microprocessor and digital networking have taken charge in regard to the control of comfort in buildings.
The reality is that smart buildings have focused on the comfort, safety, and, after COVID-19, the health of the humans inside them, not on the building itself. While this may sound innocuous, it explains why comfort has been the primary target of automation and control. Thus, the term smart is typically used to mean a more intelligent way to automate control systems for the benefit of humans. This is akin to treating the symptoms of an illness rather than addressing its root cause.
What has been left aside is making the building itself truly smart across the entirety of its lifecycle: design, construction, operation, maintenance, and occupation. For years, many in the smart building industry have been trying to address this challenge on a simple premise: if a well-organized and consistent set of information exists about the building, one that captures its design intent, a rich history of why it was built, and its operational data, then with modern digital technology the building can truly be smart, performing more efficiently and adapting to unanticipated circumstances. This goes far beyond keeping the environment inside the building comfortable.
With the plethora of technology available, why has the field of smart buildings not broadened to encompass the full lifecycle mentioned above? Why has building automation and control remained mostly a comfort management system, aligned far more with the boiler room than with the board room's business need to facilitate commerce?
To uncover the root cause, we need to recognize that, while a building is physically one of the most permanent entities in our daily lives, the people, industries, organizations, and economic drivers behind it are anything but harmonious. Responsibility for a specific building is handed from one entity to another throughout its life.
A building’s life typically begins as a vacant lot, owned by an individual or organization and governed by the relevant jurisdiction with its own objectives and legal constraints. When the owner decides to turn the lot into a building, they have financial objectives, either for their own use or in hopes of leasing or selling the building once it is built. The design phase involves architects and engineers whose specific task is to design the building. Once complete, the design is handed to contractors, who construct the building according to the design and economic specifications. Because buildings are complex structures, dozens of different trades are involved, each with their own concerns. Once built, the building may be sold, change hands, be leased, or otherwise be occupied by an entity with its own requirements and objectives. Often this occupation becomes a cycle punctuated by refurbishments, during which the building is operated and maintained by several organizations one after another, each with their individual concerns, and this process can go on for decades!
Because buildings are inherently large, each of the phases above involves significant amounts of money, expectations, risk, and returns. In the information era, each phase also involves significant amounts of data: financial, engineering, and operational. Unfortunately, the information generated during each phase is neither retained nor used by anyone other than those directly involved in that phase.
When an entity buys or leases a property, the essential transaction is when they “get the keys to the door.” Typically, no data is handed over regarding the design, engineering drawings, construction, or history of past performance. Instead, the new owner is simply given the keys so that they can enter and use the building, which starts another isolated phase.
You can compare this process to buying a business, for example, a retail establishment. In this case, the buyer is likely to be given a book of accounts, spreadsheet(s) of budgets, lists of customers and vendors, personnel performance, and much more. Without such data, it would be hard for the buyer to actualize the expected return from their acquisition.
There is very little, if any, motivation for all stakeholders involved with the phases of a building’s lifetime to volunteer information that would be useful for the follow-on phase of the building. This is because there is no expectation to provide information and no agreed-upon mechanism to do so. So, each phase starts and ends with a transaction of handing over the keys.
The Digital Twin is the first, and so far only, mechanism created that could change these transactions as well as stakeholder behavior.
As industries start to view the Digital Twin as a consistent category of information equivalent to what a book of accounts is to a business, it will become the natural way for professionals to manage the phase of a building they are involved in. The Digital Twin will also allow professionals to make building information available to others, such as the acquirer or lessee of the building.
Similar to a spreadsheet, a Digital Twin of a physical asset allows users to see the information about that asset. Digital Twins also provide the ability to model future behavior and anticipate how a building will react to changes, with predictions based on design intent and the history of gathered information. Furthermore, with AI, this modeling can happen continuously, making both the modeling and the resulting actions autonomous.
To achieve this automation, the information in a Digital Twin has to accompany the building from its initial intent through design, construction, commissioning, and multiple cycles of occupation. Digital Twin information must also be in a form that can be reused for different and unexpected purposes. Lastly, the information contained in the Digital Twin must be easily accessible and compatible between systems. Implementing a Digital Twin will also require stringent controls over access, privacy, and security, which are critical to the building's information flow.
For buildings, the Digital Twin is a new way to look at a building by focusing on the information associated with it throughout its lifecycle. Only if the concept of a Digital Twin is understood and widely adopted will we actually have buildings that can be defined as smart.
By John Turner, VP of Innovative Solutions, Gafcon
You are a building or infrastructure owner. You have made the decision. You have decided that a Digital Twin would be beneficial to your business. How do you now implement it successfully? How do you balance the undoubted benefits against the cost of implementation? To start, let's consider the definition of a Digital Twin that has been adopted by the Digital Twin Consortium:
A digital twin is a virtual representation of real-world entities and processes, synchronized at a specified frequency and fidelity.
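Read literally, the definition implies a minimal data shape: a link to the real-world entity, a virtual state, and a synchronization policy specifying frequency and fidelity. The sketch below is a hypothetical illustration of those three elements; the field names are assumptions, not a DTC-prescribed model.

```python
# Minimal sketch of the DTC definition: a virtual representation
# synchronized with a real-world entity at a specified frequency
# and fidelity. All field names are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class SyncPolicy:
    frequency_seconds: float   # how often the state is refreshed
    fidelity: str              # e.g. "sensor-level", "room-level"

@dataclass
class DigitalTwin:
    entity_id: str             # stable real-world identifier
    policy: SyncPolicy
    state: dict = field(default_factory=dict)

    def synchronize(self, observations: dict):
        """Fold the latest observations of the physical entity
        into the virtual state."""
        self.state.update(observations)

twin = DigitalTwin("building-42/ahu-1", SyncPolicy(60.0, "sensor-level"))
twin.synchronize({"supply_air_temp_c": 14.2, "fan_speed_pct": 65})
print(twin.state["fan_speed_pct"])  # 65
```

Making frequency and fidelity explicit, rather than implicit in the integration code, is what lets an owner reason about whether a given twin is fit for a given use case.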
Unless the digital twin you have chosen to implement is totally self-contained, and that would be unusual in the world of buildings and infrastructure, it will have to:
Given the above definition, there are key elements of the Digital Twin that are enabled externally. It is likely that your organization will have multiple digital twins, or an evolving digital twin that will eventually stabilize into an operational model. These may be provided by different and competing software vendors. The question is how to link the data flows between the different digital twins into a holistic and beneficial system for an owner. Your environment may look something like the figure below.
Digital Twin Consortium defines the digital thread as a mechanism for correlating information across multiple dimensions of the virtual representation, where the dimensions include (but are not limited to) time or lifecycle stage (including design intent), kind-of-model, and configuration history; the mechanism generally relies on stable, consistent real-world identifiers. Why should you be so interested in it?
The business value will be derived from the use cases contained within and supported by the digital twin. However, that business value will only be enabled in a cost-effective manner if the digital twins can communicate via the digital thread rather than having to be manually populated. It has been estimated that the cost of manually populating an IoT digital twin can be as high as 1% of the cost of construction. Anything that can reduce that cost would be beneficial. So how can this be done?
It is suggested that the following five steps can increase the value of your digital twin by containing the cost of implementation.
Digital twins have the power to transform an organization through the insights and use cases that they deliver. Do not underestimate the power of the digital thread to also deliver transformational and positive improvements.
Object Management Group (OMG) is an international, open membership, not-for-profit technology standards consortium.