Data, Data Everywhere
Estimates of the data-powered IoT market in 2020 range from $300 billion (Gartner) to $7.1 trillion (IDC), with the storage and analysis of data held in Big Data silos forecast to exceed 40 zettabytes by 2020 (GP Bullhound). In reality the market is an order of magnitude greater, as these estimates do not include derived things. Connecting what we encounter daily further multiplies this figure many fold, before even greater scaling driven by derived data addressing business, government and societal needs.
Because no business is an island, the narrative needs to shift from big data storage, and its associated communication networks, to the collaborative use and reuse of information, both internally and externally. The ability to share and interrelate data and things is at the heart of the Internet of Things. When things connect to share data and silos are opened, we can create communities of unrelated things that interact, generate solutions and products with automated genres of service, operations and logistics, disrupt business models and transform consumer expectations.
When can we start?
We don’t need to agree on one platform in order to progress. We don’t need to wait for 5G, bigger databases, new sensors, or better analytical techniques. The traditional structured inter-relationships between citizens, SMEs, enterprises, infrastructures, utilities, healthcare, transportation, cities, regions and governments are increasingly at risk of becoming redundant; we need to act.
There is no tipping point. We can today improve efficiency, effectiveness, and quality of life through greater collaboration, sharing, and innovation.
What are the emergent trends in data sharing?
The data and information that we need already exist, with more being created all the time. That information has value today, but increasingly we will see value created through its reuse and recycling. A simple example is a temperature gauge. The data from your temperature gauge is only of value to you when you look at it, but what if that information, and its accuracy, were continuously available and shared? How could servicing, repairs, and reliability be improved?
Upcycling in the IoT is an additive process. Combining things in previously unimagined ways will generate benefits for the data owners, end users, and the innovators who create the transformative relationships.
Interactions not interfaces
As the number of connected things, and their data (direct and derived), rapidly outstrips the global population, it is clear that interactions between things cannot rely on individuals and interfaces.
Programmatic creation of relationships between data, things and systems, together with machine learning, artificial intelligence, inspired industrial design and a host of additional tools, will be needed to bring us closer to David Rose’s vision of “technology that atomizes, combining itself with the objects that make up the very fabric of daily living.”
Communities not consumers
As enterprises realise that the value of data comes not from hoarding but from sharing, we will see this reflected in innovative services. Users will enhance their experience by sharing information with self-selected communities, just as they do on the internet, allowing data- and information-rich communities to self-organise and propagate.
Flexibility and expandability will become core features of successful technology. Expectations are changing: solutions will fail where overly prescriptive uses and interactions prevent users from curating their own experiences and sharing that curation with others, who can, in turn, expand, modify, combine and personalise their experience.
Data sharing enables collaborative, creative communities of individuals and organisations to not do things differently but do different things. We have the technology now to break down our data silos and work together to develop new business and creative models, service lines, delivery methodologies, and transform our experiences. Why not make yours an enchanted life?
Note: a version of this blog was originally published on techUK’s website (www.techuk.org) as part of Data Driven Economy Week.
Creating a model of the world
The first Emperor of China, Qin Shi Huang, ordered an elaborate burial chamber constructed on a scale almost unimaginable today. Around 700,000 men spent almost four decades building life-size models of warriors, farmers, officials, carts, horses, roads, farms, towers and palaces, as well as mercury simulations of the Yangtze and Yellow Rivers. These flowed mechanically to a mercury sea under a vaulted ceiling resplendent with depictions of heavenly constellations. The Emperor believed that if his model of the world were accurate enough, he would be able to continue to rule the empire in the afterlife. Understanding and seeing all as it happened in the real world, he would have omnipotence over every corner of the empire and realise his claim on universal and eternal rule as the “First August Thearch” (or god-ruler). His dream was one of total and enduring control through a complete understanding of everyone and everything within the empire.
Many Emperors shared the same dream of immortality through the construction of an enduring dynamic model of the world that was interchangeable with the real world: a facsimile so precise that to have mastery of the model was to have dominion over the world.
World building today
Transporting these emperors to the present, they would surely see progress in their dream of modelling the world, but have we managed to achieve the ultimate objective of creating a true facsimile of the world as it is? Some might say that Big Data has made available so much data that we must now be able to know the world, but the aim of Big Data is primarily to mine history, not to reveal the now. Big Data doesn’t provide dynamic behavioural responses. The Emperors’ desire to model the world as it is happening is proving more intractable.
The contemporary analogues of Emperors are decision-makers in business. They want to know what assets their companies possess, where assets may well be digital as well as physical now that sources of data are of at least equal value to real things. They want to be able to see the state of things as they are so they can make decisions. They want a single source of truth. They want to run “what-if…?” scenarios to see which decisions make the biggest positive or the smallest negative impact. In short, they need a dynamic model of the real world. The contemporary analogues of the Emperor’s 700,000-strong army of builders and scores of overseers and bureaucrats are the data scientists, analysts and app developers who must build these dynamic models.
Could “Digital Twins” be the solution to ancient Emperors’ and their modern-day counterparts’ dreams? Digital Twin is a buzz-phrase currently climbing the hype cycle, but what actually is a true Digital Twin? Many people apply the phrase to CAD/CAM 3D models of buildings, engines, components and the like, but these are static models of things: they hold no dynamic data and do not exhibit the behaviour of the thing they claim to model. The model of the building doesn’t show how the building interacts with people waiting to use the lifts; the model of the engine doesn’t show how it works when some components are worn. These models don’t meaningfully interact with the world or each other.
Hacking is a profession; the people attacking our systems from cyberspace are doing it for a living. Their return on investment is high and their risk of being caught is low. This makes them highly motivated. They are well educated and understand security technology; even the least capable adversary can cause significant damage to a business and its reputation owing to the high availability of hacking tools.
Meanwhile, the significant business opportunity represented by the IoT has the potential to transform business models by moving from one-time product transactions to ongoing product-as-a-service relationships, optimise utilisation of physical and financial assets, and create new forms of customer engagement. So as businesses increase their focus on the potential of IoT, the need for knowledgeable C-level executives to navigate the emergent sector securely and confidently will lead to the creation of a Chief IoT Officer at board level (Machina Research).
Security concerns have played a role in tempering enthusiasm for the long-hyped opportunity of IoT. In 2015, Forrester predicted that 82% of businesses would be using IoT applications in 2017. By Q1 2017, those figures had been adjusted to reflect that, as IoT is a “business-led trend,” only 23% of enterprises use the IoT, with another 29% planning to do so within 12 months. The concern is well founded: IDC reported that by 2018, 66% of networks will have had an IoT security breach and that by 2020, 10% of all attacks will target IoT systems. It is already happening in M2M: the 2013 hack of US retailer Target’s in-store heating, ventilation and air conditioning systems led to the theft of 40 million credit card numbers.
If we can meet the security challenges that threaten the IoT, then we can unlock potential for businesses to take advantage and emerge as significant players in the 21st century.
Securing connected devices
The RSA Conference of 2016 was blunt: “IoT will crash and burn if security doesn’t come first”; and, because of the legacy of businesses’ data and digital landscapes, attack surfaces are vast.
As Capgemini’s Digital Transformation team noted, most objects and systems now connected to the Internet were not designed to be secured in a connected environment, while products that were designed to be connected have generally been secured only for their original purpose or use. Applications connected through wireless plug-in devices, for example, may not even require authentication, leaving devices vulnerable to cyber threats and, in particular, Distributed Denial of Service (DDoS) attacks. The 2016 DDoS attack on Dyn, the Internet infrastructure company, caused massive disruption to services across Europe and the US, with reports that the attack was part of a genre aimed at IoT devices using the Mirai botnet. These attacks are at the forefront of security concerns for the emergent IoT and are one of the reasons why Iotic Space is built the way it is to rebuff this threat.
DDoS attacks are not possible in Iotic Space. As outlined above, these attacks are targeted at things that are on the public Internet, often a business’s first brush with the IoT as projects and functions look to exploit ever-expanding internet-enabled products and consumables. The connected things are often attacked using “swamping” and password dictionary attacks.
Because of the double virtualisation built into Iotic Space and the “ioticising” that occurs when a real thing (sensor or device) becomes an Iotic Thing, there is no analogue for this kind of attack within Iotic Space. The governance layers within Iotic Space, formed by the Registrar overseeing its Iotic Containers, communicate over HTTPS and require an introductory, brokered public/private key interchange before any communication can occur. Consequently, any DDoS attack is blocked at the transport layer.
Any Thing must come into Iotic Space through the agent API, which communicates only with an AMQP broker, not with the Container itself. To talk to the AMQP broker, the Thing seeking a connection has to present a user ID and password assigned to it by the system.
We have built throttling protection into our AMQP layer, so that even if a DDoS-style attack were not blocked at the transport layer (which it would be), the mass outages and denial of service described in the reports above would still be prevented.
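Iotic Space’s internal throttling mechanism isn’t detailed here, so the following is a generic token-bucket sketch of the idea rather than the actual implementation: each client gets a replenishing budget of messages, and anything beyond that budget is dropped before it can reach the broker.

```python
import time

class TokenBucket:
    """Generic token-bucket throttle: a hypothetical sketch of broker-side
    rate limiting, not Iotic Space's actual mechanism."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate              # tokens replenished per second
        self.capacity = capacity      # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Replenish tokens for the time elapsed since the last check.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False                  # over budget: the message is dropped

# A flood of 1000 messages arrives at once; only the initial burst survives.
bucket = TokenBucket(rate=10, capacity=5)
accepted = sum(bucket.allow() for _ in range(1000))
```

The point of the design is that a flooding client exhausts only its own budget: the broker and every other client are unaffected, which is exactly the property a DDoS attack tries to destroy.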
Any message from a Thing, a Container or the Registrar is cryptographically signed with a token, so it can be traced. Each message carries a sequence number to prevent replay attacks, and all messages are encrypted using TLS 1.2.
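The exact token and signature format is not specified above, so here is a minimal sketch of the two properties described, signing for traceability and sequence numbers against replay, using a plain HMAC over the message body. The key, field names and payload are all illustrative.

```python
import hashlib
import hmac
import json

SECRET = b"shared-signing-key"   # hypothetical per-Thing signing key

def sign(seq: int, payload: dict) -> dict:
    """Sign a message body (sequence number + payload) with an HMAC token."""
    body = json.dumps({"seq": seq, "payload": payload}, sort_keys=True).encode()
    tag = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return {"seq": seq, "payload": payload, "sig": tag}

class Receiver:
    """Accepts a message only if its signature verifies and its sequence
    number is strictly greater than anything seen before."""

    def __init__(self):
        self.last_seq = -1

    def accept(self, msg: dict) -> bool:
        body = json.dumps({"seq": msg["seq"], "payload": msg["payload"]},
                          sort_keys=True).encode()
        expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, msg["sig"]):
            return False             # forged or tampered message
        if msg["seq"] <= self.last_seq:
            return False             # replayed message: sequence number reused
        self.last_seq = msg["seq"]
        return True

rx = Receiver()
msg = sign(1, {"temp": 21.5})
first = rx.accept(msg)    # genuine first delivery is accepted
replay = rx.accept(msg)   # byte-identical replay is rejected
```

In a real deployment this signing would sit inside the TLS 1.2 channel mentioned above, so an attacker cannot even observe the messages, let alone replay them.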
Apart from the AMQP brokers, there are publicly available sites that form part of the extended Iotic Space system, namely the Container and Registrar web applications. Both are WordPress sites with added security plugins. They have no relation to any Iotic or real Thing, do not store any of the security information described above, and hold only users’ names and email addresses. Any stored user passwords are encrypted.
DDoS attacks on connected devices are of course not the only threat. Capgemini advised in 2016 that greater collaboration across integrated teams of competencies to model future IoT services and processes was vital. But how can we model for a future that has not yet arrived?
IoT standards remain, at best, over the horizon (see Figure 1), with the adolescence of the IoT highlighted in Forrester’s TechRadar™: Internet Of Things Security, Q1 2017 report, which flagged that Internet of Things (IoT) technologies are immature.
Most of the 19 IoT technologies identified in the report are in the “Survival and Growth phase today”. Standards, long identified as the security solution, are “nascent, as vendors are only a couple of years into the process of creating general-purpose interoperability standards.”
But we don’t need to wait for standards.
Virtualisation and the use of digital twins and digital avatars will continue to proliferate (Machina Research). However, to increase their usefulness we need to blend the real and the virtual abstraction to allow for experimentation, exploration and investigation. “Just start” modelling means that Iotic Space works without the need for “baselining” and can be made available to your data and digital landscape quickly and efficiently. Even complex environments can be modelled, tested and protected in timescales measured in days, not months.
In Iotic Space, digital twins of systems and processes can be created, with “real” systems virtualised to sit alongside their digital twins and be substituted in and out with no costly or time-consuming redesign. Attack vectors can be analysed on virtualised architectures, and countermeasures, specialist software and analyses tested and refined.
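The substitution idea can be sketched in a few lines: if the real thing and its twin expose the same interface, everything built on top of them works unchanged whichever one is plugged in. The `Pump` example below is entirely hypothetical and is not Iotic Space code; it only illustrates the design principle.

```python
from typing import Protocol

class Pump(Protocol):
    """Anything that can report a flow rate: real device or digital twin."""
    def flow_rate(self) -> float: ...

class RealPump:
    """Stand-in for a physical device; in production this would read a feed."""
    def flow_rate(self) -> float:
        return 42.0

class PumpTwin:
    """Digital twin with the same interface but simulated behaviour,
    so wear, faults and attacks can be explored safely."""
    def __init__(self, wear: float = 0.0):
        self.wear = wear
    def flow_rate(self) -> float:
        return 42.0 * (1.0 - self.wear)   # model the effect of worn components

def monitor(pump: Pump) -> str:
    """Monitoring logic written once, against the shared interface."""
    return "ALERT" if pump.flow_rate() < 35.0 else "OK"

# The twin substitutes for the real thing with no redesign of the monitor.
real_status = monitor(RealPump())          # "OK"
twin_status = monitor(PumpTwin(wear=0.25)) # "ALERT": simulated degradation
```

Because the monitor depends only on the interface, swapping the twin in for attack-vector analysis, or back out for live operation, requires no change to the surrounding system.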
In Wind River’s much-touted briefing from 2015, “Security in the Internet of Things”, challenges were identified around secure booting, access control, device authentication, firewalling and IPS, and updates and patches, all of which are valid. However, the solution is not, as the briefing suggests, an end-to-end operating system with security built in, but a true Internet of Things without the command-and-control hierarchies that lead to systemic weaknesses.
Decentralising through an abstraction layer, as advocated by Iotic Labs, enables scalable solutions to the constantly changing complexity of the IT environment in the IoT age. As Geoff Webb outlined in his article “Adapting Security to the Internet of Things”, a disaggregated approach allows for the “Las Vegas” model of security, which focuses on tracking specific signals that indicate suspicious behaviour at the level of an individual actor or thing, rather than trying to keep track of an entire ecosystem. Metadata allows us to record, understand and infer information about the identity of a device, thing or feed connected to Iotic Space, and our understanding is based upon ontologies, behaviours, catalogues and the semantic context of how it should behave. We can therefore set simple triggers for warning and response if something unusual or suspicious occurs. This ability to track and identify helps security professionals keep the IoT secure and, in turn, enables secure interaction with products and services.
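As a rough illustration of the “Las Vegas” model, a thing’s declared metadata can act as its own behavioural baseline, with a trigger firing whenever observed behaviour departs from it. The metadata schema, device name and thresholds below are invented for the sketch; they are not Iotic Space’s actual ontology.

```python
# Hypothetical declared metadata for one registered thing: its expected
# unit, value range, and maximum publication rate.
expected = {
    "temp-sensor-07": {"unit": "C", "min": -40.0, "max": 85.0, "max_rate_hz": 1.0},
}

def suspicious(thing_id: str, value: float, interval_s: float) -> bool:
    """Flag behaviour that departs from a thing's own declared metadata,
    rather than policing the ecosystem as a whole."""
    meta = expected.get(thing_id)
    if meta is None:
        return True                             # unknown device: flag it
    if not (meta["min"] <= value <= meta["max"]):
        return True                             # reading outside declared range
    if interval_s < 1.0 / meta["max_rate_hz"]:
        return True                             # publishing faster than declared
    return False

ok      = suspicious("temp-sensor-07", 21.5, 2.0)    # normal behaviour
bad_val = suspicious("temp-sensor-07", 300.0, 2.0)   # physically impossible reading
flood   = suspicious("temp-sensor-07", 21.5, 0.01)   # flooding faster than declared
```

The per-thing check stays cheap and local, which is what lets this approach scale where whole-ecosystem monitoring cannot.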
Use the best
As the IoT landscape evolves and changes, and as business needs, foci and priorities shift, it is vital that any IoT security can adapt to meet the challenges of an as-yet unwritten future. To meet these challenges, organisations will need to be able to select best-in-class products, services, systems, platforms, and analytical and AI tools, combining them to match best-practice principles and push for competitive advantage.
Iotic Space is an environment enabling interoperability between anything and any Thing. The double virtualisation and abstraction that enables any Thing to interact with any other Thing, subject to correct access controls and brokerage, allows IT professionals and system architects to evolve their digital and data landscapes iteratively over time, and without being locked to a single service provider or technology.
Capital expenditure can be mediated through the creation of hybrid systems that are updated only as needed and in line with business objectives. In the emerging world of the IoT, the fractured nature of the landscape is currently inhibiting adoption and, in turn, limiting the ability of security and IT professionals to develop and implement robust defensive positions and capabilities.
Toptal identified five elements that need to be borne in mind when considering IoT security:
The use of controls as well as feeds enables firmware and software patches and updates to be controlled and managed. Meanwhile, flexible interactions enable systems architects to adapt and update systems and processes within the environment with best-in-class analytics, software, hardware and platforms.
It is core to the environment that both the Provider and the Consumer in a data-sharing relationship are known entities within Iotic Space. This enhances trust and security, as only account holders with known credentials can access data.
No data is stored in Iotic Space (the antithesis of a Big Data solution), making it a less attractive proposition for malicious targeting. Metadata and data are separated, which enhances security: the data stream is meaningless without knowledge of the metadata.
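The value of separating metadata from data can be seen in a toy example. The field names, units and values below are invented for illustration: the point is simply that the stream alone is an opaque list of numbers, and only the separately held metadata turns it into meaningful readings.

```python
# The data stream on its own: an anonymous sequence of numbers, useless
# to anyone who intercepts it without the metadata.
stream = [21.5, 1013.2, 48.0]

# The metadata, held and shared separately, gives the stream its meaning.
metadata = {
    "fields": ["temperature", "pressure", "humidity"],
    "units":  ["C", "hPa", "%RH"],
}

def interpret(values: list, meta: dict) -> dict:
    """Combine a raw stream with its metadata to recover labelled readings."""
    return {field: (value, unit)
            for field, value, unit
            in zip(meta["fields"], values, meta["units"])}

reading = interpret(stream, metadata)   # e.g. "pressure" -> (1013.2, "hPa")
```

An attacker who captures only the stream learns three floating-point numbers; without the metadata there is no way to know what they measure, in what units, or from which device.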
8. https://www.theguardian.com/technology/2016/oct/21/ddos-attack-dyn-internet-denial-service / https://www.wired.com/2016/10/internet-outage-ddos-dns-dyn/