The Rise of the Digital Twin in the AEC Industry
Ali Nicholl, PBC Today, May 7 2019
Digital continues to disrupt and excite the construction industry in equal measure. The cavalcade of terms and technologies - Big Data, data warehousing, ubiquitous connectivity, artificial intelligence, augmented reality, and so on (and on) - continues unabated, promising increased efficiencies, greater insights and improved service delivery. However, the value of this technology remains only a fraction of what it could be.
As Stefan Webb from Future Cities Catapult (now part of the Connected Places Catapult) highlighted in his article for PBC Today back in 2017, data, especially within cities and infrastructure, remains a largely untapped resource.
The greatest barrier to liberating value is that data and controls are traditionally locked within platforms, systems, and silos. New technologies promising greater flexibility, agility, and transformation are too often isolated from incumbent platforms, systems, and databases. The existing silos of data and controls are unmanageable. Legacy systems, which should be a competitive advantage, exist in parallel to the new developments rather than informing and complementing them. Traditional IT solutions that present single-vendor solutions are seen as inflexible and expensive. According to a recent survey by the SAP user group DSAG, 62% of companies described their progress on digital transformation projects, seen as critical to business, as 'Not Far' owing to ever-expanding scope and cost.
Are Digital Twins a Solution?
As frustrations with the challenge of siloed data and hidden value continue, we have seen the launch of a number of initiatives to tackle the systemic barriers to further development.
The 2017 National Infrastructure Commission (NIC) report, 'New Technology Study: Data for the Public Good', looked at the opportunities that innovation enables and how open data sharing can support them. It recommended the creation of "a digital twin of Britain's infrastructure". One outcome of that recommendation has been the Gemini Principles, a set of principles created by the Centre for Digital Built Britain and its partners to underpin an ecosystem of connected Digital Twins.
Elsewhere, we have seen Greater Cambridgeshire Partnership and Telensa announce the creation of a city-wide digital twin, with the aim of designing better city infrastructure and delivering more efficient city services.
Digital Twins individually and federated together are increasingly being seen as a solution to the need for interoperability between data sets, silos and services, but there remains a lot of confusion as to what a Digital Twin is and, as importantly, what it isn't!
What is a Digital Twin?
A significant element of the ambiguity around Digital Twins and their value is owing to the emphasis on visualisation rather than virtualisation. In many sectors, Digital Twins have come to be seen as 3D models, reporting tools and, at their best, constructs for simulation and emulation. They tend to be representations of a specific geography, asset, system, or process: distinct and separate. While many are static, little more than CAD/CAM models, increasingly we have seen improved visualisations that use live data alongside historical data from interoperating platforms and sources. Once again in Cambridge, we've seen the innovative mapping start-up SenSat demonstrate, in collaboration with Mott MacDonald and Safehouse Sensors, what can be achieved with new technology to map, visualise, and interrelate multiple data sources and sets.
We can, however, go further. At Iotic, we define a Twin as a comprehensive, interoperable version of anything, all its data, all its controls, through its whole life. The addition of controls, as well as data, is vital. The great power of Digital Twins comes not from what they can show us, but from how they can securely and meaningfully interact with each other. The ecosystem of connected Digital Twins highlighted in the Gemini Principles can create, as I presented at a TechUK event on this subject earlier in the year, not a National Digital Twin but a nation of digital twins.
A real twin is one that can interact, interrelate and behave in a digital environment as its twinned counterpart does in the real world. It is a semantically defined virtual counterpart to anything which can securely interact and be federated together across organisations and supply chains, enabling a single source of truth (we aren't copying data into platforms, or creating new data lakes), enhanced monitoring, prognostics, new services and solutions that would not otherwise be possible.
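As a purely illustrative sketch, such a twin can be pictured as an object that carries semantic metadata, data feeds and controls together through its life. All class and field names below are invented for illustration; this is not Iotic's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    label: str                                      # what this twin is a counterpart of
    semantics: dict = field(default_factory=dict)   # machine-readable description of the thing
    feeds: dict = field(default_factory=dict)       # data the twin publishes
    controls: dict = field(default_factory=dict)    # actions the twin accepts

    def invoke(self, control: str, *args):
        """Interact with the real counterpart through one of its controls."""
        return self.controls[control](*args)

# A hypothetical twin of a piece of site equipment: data and controls in one place.
pump = DigitalTwin(
    label="Pump 7",
    semantics={"type": "centrifugal-pump", "site": "Cambridge"},
    feeds={"temperature": lambda: 21.5},
    controls={"stop": lambda: "stopping pump 7"},
)
```

The point of the sketch is that the twin is defined semantically, not by the database or platform the real thing lives in, so twins from different organisations can be federated and interact without copying data into a shared lake.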
By creating Digital Twins of anything and everything - every source and consumer of data and controls - what we are really doing is creating a machine-readable world: a world that can harness the power of Machine Learning, Artificial Intelligence, algorithms, business process rules, and security and access control profiles and platforms.
These twins provide security to our vital assets and infrastructure not only through the insight, information and management they enable, but through their very existence. Twins are, by definition, not the thing itself. This abstraction from the real to the virtual digital space enables interoperability between twins: brokered interactions between sources and consumers of data across companies, buildings, supply chains, and the wider world, without the need to provide direct access to any of the 'real' asset platforms, processes or systems.
There remains of course a role for representations and renders of interoperable twins. 3D models can be powerful tools presenting insights and helping partners to see opportunities to collaborate, highlight bottlenecks and enable others to visualise assets, systems, and whole processes powerfully.
But they are just one manifestation of a twin: a view of the data and controls that a twin has. In our experience, different users, organisations within a supply chain and stakeholders can, and will increasingly, have their own visualisation, dashboards, and models of the same twins, interacting with what is important to them and interoperating with the other twins in their world. A Site Manager may want a twin of an individual construction site, showing live deliveries. Meanwhile, an Area Manager may want to access the performance of equipment across sites, making his or her twins the individual pieces of equipment.
From Disruptor to Disrupted
As organisations start to adopt real Digital Twins, they are liberated from viewing data and controls based on technology, which is traditionally a view of the world determined by an organisation's internal processes, databases, tech providers and silos of information. Instead, they can take the perspective of a customer's view of the world, federating twins around what matters to them - their site, building, office, city, network, service agreement, or entire organisation.
Redefining around what matters to you, or more pertinently to customers or stakeholders, additionally creates the flexibility to adopt new technologies and approaches while maintaining downstream services, leveraging existing process, and benefitting from technologies that have previously been confined to proof of concept or innovation activities.
Adoption and transformation are coming: BAM Nuttall, the civil engineering contractor, has been working with Iotic to start creating twins of its assets, supply chain, and partner information by interrelating point solutions, third-party information and innovation programmes to develop an interoperable ecosystem that delivers benefits within projects, across organisations and to their partners and stakeholders.
The power of Digital Twins is in their ability to interact meaningfully with each other and to create ecosystems where data and controls from anything, owned by anyone, can safely interoperate. The ecosystems will grow and evolve over time, powering multiple use cases, solutions and services, and working across technology providers and legacy systems. With real Digital Twins, we can start anywhere and grow exponentially, flexibly evolving the federated and composite twins of what we need now and what we will need in the future, adapting to new technology and changing requirements, and delivering outcomes we cannot yet even envisage.
Davey Winder, The Times, December 19 2018
Iotic helps make dumb machines smart by creating intelligent digital twins of connected IoT devices and the wider data estate using a cloud-hosted middleware space. Last year, Gartner heralded digital twins as a top-ten strategic technology trend. What they deliver is something often regarded as impossible: IoT security coupled with open interoperability. Imagine different platforms, services, networks and devices securely interrelating with public and private third-party sources. Robin Brattel, Iotic’s chief executive, explains that this patented technology “enables secure programmatic interoperability of data and controls for interactions across organisations, supply chains and silos”.
Unsurprisingly, it is garnering support in the high value manufacturing and construction sectors.
“It is the digital twins that interrelate, with actual devices, data sources and equipment never exposed,” says Mr Brattel. “These interactions are securely brokered with granular access control; the source or control is always in charge, adaptively choosing when and to whom they are visible.” What this means is that, by using an intelligent abstraction layer, Iotic can overcome the well-documented challenges of IoT security that have led to the creation of data silos and vertical technology stacks that previously limited return on investment.
“Our technology is being adopted by market-leading global enterprises to achieve the impossible,” Mr Brattel concludes. These abstracted digital twins become a single source of truth, enabling solutions from simulation models to reality and minimum viable product to scale.
Data, Data Everywhere
The data-powered IoT market in 2020 is estimated at anywhere in the $300 billion (Gartner) to $7.1 trillion (IDC) range, with a substantial rise in the storage and analysis of data held in Big Data silos, forecast to exceed 40 zettabytes by 2020 (GP Bullhound). In reality, the market is an order of magnitude greater, as these estimates do not include derived things. Connecting what we encounter daily further multiplies this figure many fold, before even greater scaling driven by derived data addressing business, government and societal needs.
Because no business is an island, the narrative needs shifting from big data storage, and associated communication networks, to collaborative use and reuse of information internally and externally. The ability to share and interrelate data and things is at the heart of the Internet of Things. When things connect to share data and silos are opened, we can create communities of unrelated things that can interact, generate solutions and products with automated genres of service, operations and logistics, disrupt business models and transform consumer expectations.
When can we start?
We don’t need to agree on one platform in order to progress. We don’t need to wait for 5G, bigger databases, new sensors, or better analytical techniques. The traditional structured inter-relationships between citizens, SMEs, enterprises, infrastructures, utilities, healthcare, transportation, cities, regions and governments are increasingly at risk of becoming redundant; we need to act.
There is no tipping point. We can today improve efficiency, effectiveness, and quality of life through greater collaboration, sharing, and innovation.
What are the emergent trends in data sharing?
The data and information that we need already exists, with more being created all the time. That information has value currently, but increasingly we will see value creation through its reuse and recycling. A simple example is a temperature gauge. The data from your temperature gauge is only of value to you when you look at it, but what if that information, and its accuracy, were continuously available and shared? How could servicing, repairs, and reliability be improved?
Upcycling in the IoT is an additive process. Combining things in previously unimagined ways will generate benefits for the data owners, end users, and the innovators who create the transformative relationships.
Interactions not interfaces
As the number of connected things, and their data – direct and derived – rapidly outstrips the global population, it is clear that interactions between things cannot rely on individuals and interfaces.
Programmatic creation of relationships between data, things and systems – supported by machine learning, artificial intelligence, inspired industrial design and a host of additional tools – will be needed to bring us closer to David Rose’s vision of “technology that atomizes, combining itself with the objects that make up the very fabric of daily living.”
Communities not consumers
As enterprises realise that the value of data comes not from hoarding but from sharing, we will see this reflected in innovative services. Users will enhance their experience by sharing information with self-selected communities – just like the internet – allowing data- and information-rich communities to self-organise and propagate.
Flexibility and expandability will become core features of successful technology. Expectations are changing; solutions will fail where overly prescriptive uses and interactions prevent users from curating their own experiences and sharing that curation with others, who can, in turn, expand, modify, combine and personalise their experience.
Data sharing enables collaborative, creative communities of individuals and organisations to not do things differently but do different things. We have the technology now to break down our data silos and work together to develop new business and creative models, service lines, delivery methodologies, and transform our experiences. Why not make yours an enchanted life?
NOTE: A VERSION OF THIS BLOG WAS ORIGINALLY PUBLISHED ON TECHUK'S WEBSITE (www.techuk.org) AS PART OF DATA DRIVEN ECONOMY WEEK.
Creating a model of the world
The first Emperor of China, Qin Shi Huang, ordered an elaborate burial chamber constructed on a scale almost unimaginable today. Around 700,000 men spent almost four decades building life-size models of warriors, farmers, officials, carts, horses, roads, farms, towers and palaces, as well as mercury simulations of the Yangtze and Yellow Rivers. These flowed mechanically to a mercury sea under a vaulted ceiling resplendent with depictions of heavenly constellations. The Emperor believed that if his model of the world were accurate enough, he would be able to continue to rule the empire in the afterlife. Understanding and seeing all as it happened in the real world, he would have omnipotence over every corner of the empire and realise his claim on universal and eternal rule as the “First August Thearch” (or god-ruler). His dream was one of total and enduring control through a complete understanding of everyone and everything within the empire.
Many Emperors shared the same dream of immortality through the construction of an enduring dynamic model of the world that was interchangeable with the real world: a facsimile so precise that to have mastery of the model was to have dominion over the world.
World building today
Were these emperors transported to the present, they would surely see progress in their dream of modelling the world, but have we managed to achieve the ultimate objective of creating a true facsimile of the world as it is? Some might say that Big Data has made available so much data that we must now be able to know the world, but the aim of Big Data is primarily to mine history, not to reveal the now. Big Data doesn’t provide dynamic behavioural responses. The Emperors’ desire to model the world as it is happening is proving more intractable.
The contemporary analogues of Emperors are decision-makers in business. They want to know what assets their companies possess - where assets may well be digital as well as physical, now that sources of data are of at least equal value to real things. They want to be able to see the state of things as they are so they can make decisions. They want a single source of truth. They want to run “what-if…?” scenarios to see which decisions make the biggest positive or the smallest negative impact. In short, they need a dynamic model of the real world. The contemporary analogues of the Emperor’s 700,000-strong army of builders and scores of overseers and bureaucrats are the data scientists, analysts and apps developers who must build these dynamic models.
Could “Digital Twins” be the solution to ancient Emperors’ and their modern-day counterparts’ dreams? Digital Twin is a buzz-phrase that is currently climbing the hype cycle, but what actually is a true Digital Twin? Many people apply the phrase to CAD/CAM 3D models of buildings, engines, components, etc., but these are static models of things that don’t have any dynamic data and don’t exhibit the behaviour of the thing they claim to model. The model of the building doesn’t show how the building interacts with people waiting to use the lifts; the model of the engine doesn’t show how it works when some components are worn. These models don’t meaningfully interact with the world or each other.
Hacking is a profession; the people attacking our systems from cyberspace are doing it for a living. Their return on investment is high and their risk of being caught is low. This makes them highly motivated. They are well educated and understand security technology; even the least capable adversary can cause significant damage to a business and its reputation owing to the high availability of hacking tools.
Meanwhile, the significant business opportunity represented by the IoT has the potential to transform business models by moving from one-time product transactions to ongoing product-as-a-service relationships, optimising utilisation of physical and financial assets, and creating new forms of customer engagement. So as businesses increase their focus on the potential of IoT, the need for knowledgeable C-level executives to navigate the emergent sector securely and confidently will lead to the creation of a Chief IoT Officer at board level (Machina Research).
Security concerns have played a role in tempering enthusiasm for the long-hyped opportunity of IoT. In 2015, Forrester predicted that 82% of businesses would be using IoT applications in 2017. By Q1 2017, those figures had been adjusted to reflect that, as IoT is a “business-led trend,” only 23% of enterprises use the IoT, with another 29% planning to do so within 12 months. The concern is well founded: IDC reported that by 2018, 66% of networks will have had an IoT security breach and that by 2020, 10% of all attacks will target IoT systems. It is already happening in M2M, with the 2013 hack in the US of Target’s heating, ventilation and air conditioning systems in its stores leading to the theft of 40 million credit card numbers.
If we can meet the security challenges that threaten the IoT, then we can unlock potential for businesses to take advantage and emerge as significant players in the 21st century.
Securing connected devices
The RSA conference of 2016 was blunt: “IoT will crash and burn if Security doesn’t come first”; and, because of the legacy of businesses’ data and digital landscapes, attack surfaces are vast.
As Capgemini’s Digital Transformation team noted, most objects or systems that are now connected to the Internet were not designed to be secured in a connected environment, while the products that were designed to be connected have been only reasonably secured for that purpose or use. Applications connected through wireless plug-in devices – which may not even require authentication – have left devices vulnerable to cyber threats and, in particular, Distributed Denial of Service (DDoS) attacks. The 2016 DDoS attack on Dyn, the Internet infrastructure company, caused massive disruption to services across Europe and the US, with reports that the attack was part of a genre aimed at IoT devices using the Mirai botnet. These attacks are at the forefront of security concerns for the emergent IoT and are one of the reasons why Iotic Space is built the way it is to rebuff this threat.
DDoS attacks are not possible in Iotic Space. As outlined above, these attacks are targeted at things that are on the public Internet, often a business’s first brush with the IoT as projects and functions look to exploit ever-expanding internet-enabled products and consumables. The connected things are often attacked using “swamping” and password dictionary attacks.
Because of the double virtualisation built into Iotic Space and the “ioticising” that occurs when a real thing (sensor or device) becomes an Iotic Thing, there is no analogue for this kind of attack within Iotic Space. The governance layers within Iotic Space, formed by the Registrar overseeing its Iotic Containers, communicate over HTTPS and perform an introductory, brokered public/private key interchange before any communication can occur. Consequently, any DDoS attack is blocked at the Transport Layer.
Any Thing must come into Iotic Space through the agent API. This only communicates with an AMQP broker, not the Container itself. In order to talk to the AMQP broker, the Thing seeking a connection has to present a user ID and password which are assigned to it by the system.
We have built throttling protection into AMQP, so that even if a DDoS-style attack were not blocked at the Transport Layer (which it would be), the mass outages and denial of service described above are still prevented.
Any message from a Thing, a Container or the Registrar is cryptographically signed with a token (so can be traced). Each message has a sequence number to prevent replay attacks. All messages are encrypted using TLS 1.2.
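The signing and sequence-number protections just described can be illustrated with a short Python sketch. The key handling, wire format and field names below are invented for illustration only; they do not reflect Iotic's actual implementation, and a real deployment would use per-thing asymmetric keys over TLS rather than a shared secret.

```python
import hmac
import hashlib

SECRET = b"per-thing shared secret"  # assumed key material, for illustration only

def sign_message(payload: bytes, seq: int) -> dict:
    """Attach a sequence number and an HMAC token so the message is traceable."""
    body = seq.to_bytes(8, "big") + payload
    token = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return {"seq": seq, "payload": payload, "token": token}

class Receiver:
    """Rejects tampered messages and replays of already-seen sequence numbers."""
    def __init__(self):
        self.last_seq = -1

    def accept(self, msg: dict) -> bool:
        body = msg["seq"].to_bytes(8, "big") + msg["payload"]
        expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, msg["token"]):
            return False   # token mismatch: the message was altered in transit
        if msg["seq"] <= self.last_seq:
            return False   # stale sequence number: a replayed message
        self.last_seq = msg["seq"]
        return True
```

The two checks are independent: the token proves who signed the message and that it was not altered, while the strictly increasing sequence number ensures that capturing and re-sending a valid message achieves nothing.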
Apart from the AMQP brokers, there are publicly available sites that form part of the extended Iotic Space system, namely the Container and Registrar web applications. Both are WordPress sites with added security plugins. They have no relation to any Iotic or real Thing, do not store any of the security information above, and store only users’ names and email addresses. Any stored user passwords are encrypted.
DDoS attacks on connected devices are of course not the only threat. Capgemini advised in 2016 that greater collaboration across integrated teams of competencies to model future IoT services and processes was vital. But how can we model for a future that has not yet arrived?
IoT standards remain, at best forecast, over the horizon (see Figure 1), with the adolescence of the IoT highlighted in Forrester’s TechRadar™: Internet Of Things Security, Q1 2017 report, which flagged that Internet of Things (IoT) technologies are immature.
Most of the 19 IoT technologies identified in the report are in the “Survival and Growth phase today”. Standards, long identified as the security solution, are “nascent, as vendors are only a couple of years into the process of creating general-purpose interoperability standards.”
But we don’t need to wait for standards.
The virtualisation and the use of digital twins and digital avatars will continue to proliferate (Machina Research). However, to increase their usefulness we need to blend both the real and the virtual abstraction to allow for experimentation, exploration and investigation. “Just start” modelling means that Iotic Space works without the need for “baselining”, and can be made available to your data and digital landscape quickly and efficiently. Even complex environments can be modelled, tested and protected in timescales measured in days not months.
On Iotic Space, digital twins of systems and processes can be created, with “real” systems virtualised to sit alongside and be substituted in and out for their digital twins with no costly or time-consuming redesign. Attack vectors can be analysed on virtualised architectures and counter measures, specialist software and analyses tested and refined.
In Wind River’s much-touted 2015 briefing, “Security in the Internet of Things”, challenges were identified around secure booting, access control, device authentication, firewalling and IPS, and updates and patches, all of which are valid. However, the solution is not, as the briefing suggests, an end-to-end Operating System with security built in, but a true Internet of Things without the command and control hierarchies that lead to systemic weaknesses.
Decentralising through an abstraction layer, as advocated by Iotic Labs, enables scalable solutions to the constantly changing complexity of the IT environment in the IoT age. As Geoff Webb outlined in his article “Adapting Security to the Internet of Things”, a disaggregated approach allows for the “Las Vegas” model of security, which focuses on tracking specific signals that indicate suspicious behaviour at an individual actor or thing level, rather than trying to keep track of an entire ecosystem. Metadata allows us to record, understand and infer information about the identity of a device, thing or feed connected to Iotic Space; our understanding is based upon ontologies, behaviours, catalogues and the semantic context of how it should behave. We can therefore set simple triggers for warning and response if something unusual or suspicious occurs. This ability to track and identify helps security professionals keep the IoT secure and, in turn, enables secure interaction with products and services.
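As a hedged illustration of such metadata-driven triggers, the sketch below watches each thing against its own declared behaviour rather than the whole ecosystem. The feed names, units and thresholds are invented; in practice they would come from a twin's semantic metadata.

```python
EXPECTED = {
    # feed id -> (unit, sane minimum, sane maximum), as declared in the thing's metadata
    "pump-7/temperature": ("celsius", -10.0, 85.0),
    "pump-7/flow": ("litres_per_min", 0.0, 400.0),
}

def check_reading(feed_id: str, value: float) -> str:
    """Return 'ok', or a warning when a reading falls outside declared behaviour."""
    if feed_id not in EXPECTED:
        return "warn: unknown feed"   # a thing we hold no metadata for is itself suspicious
    unit, lo, hi = EXPECTED[feed_id]
    if not lo <= value <= hi:
        return f"warn: {feed_id} outside {lo}-{hi} {unit}"
    return "ok"
```

Because each trigger is scoped to one feed, adding a new device means adding one metadata record, not redesigning ecosystem-wide monitoring.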
Use the best
As the IoT landscape evolves and changes, and as business needs, foci and priorities shift, it is vital that any IoT security can adapt to meet the challenges of an as-yet unwritten future. To meet these challenges, organisations will need to be able to select best-in-class products, services, systems, platforms, and analytical and AI tools, combining them to match best-practice principles and push for competitive advantage.
Iotic Space is an environment enabling interoperability between anything and any Thing. The double virtualisation and abstraction that enables any Thing to interact with any other Thing, subject to correct access controls and brokerage, allows IT professionals and system architects to evolve their digital and data landscapes iteratively over time, and without being locked to a single service provider or technology.
Capital expenditure can be mediated through the creation of hybrid systems that are updated only as needed and in line with business objectives.
Currently, the fractured nature of the landscape is inhibiting adoption and, in turn, limiting the ability of security and IT professionals to develop and implement robust defensive positions and capabilities.
Toptal identified five elements that need to be borne in mind when considering IoT security.
The use of controls as well as feeds enables firmware and software patches and updates to be controlled and managed. Meanwhile, flexible interactions enable systems architects to adapt and update systems and processes within the environment with best-in-class analytics, software, hardware and platforms.
It is core to the environment that both the Provider and the Consumer in a data-sharing relationship are known entities within Iotic Space. This enhances trust and security, as only account holders with known credentials can access data.
No data is stored in Iotic Space – this is the antithesis of a Big Data solution – and a less attractive proposition for malicious targeting. Metadata and Data are separated. This enhances security, as the data stream is meaningless without knowledge of the metadata.
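A minimal sketch of that metadata/data split (all field names invented for illustration): the stream carries bare value tuples that are opaque until combined with the separately-held metadata record.

```python
# Separately-held metadata: the only place the stream's meaning is recorded.
metadata = {
    "feed": "building-3/air-quality",
    "fields": [("co2", "ppm"), ("humidity", "percent"), ("temp", "celsius")],
}

# The data stream itself: anonymous tuples, meaningless if intercepted alone.
stream = [(612.0, 41.2, 20.5), (640.0, 40.9, 20.7)]

def decode(sample, meta):
    """Combine one raw sample with its metadata to recover labelled readings."""
    return {name: (value, unit) for (name, unit), value in zip(meta["fields"], sample)}
```

An attacker who captures only the stream learns nothing about what the numbers measure; an attacker who captures only the metadata learns no readings.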