When a massive tsunami hit Japan in March 2011, crippling several of the country's nuclear reactors in Fukushima, the most detailed information about the spread of radiation did not come from government sources, but from web-connected Geiger counters in people's homes.
Using a web development platform called Pachube (now known as Cosm), developers were able to create applications that aggregated real-time Geiger counter data, along with location and wind speed and direction, to produce a map of where radiation levels were highest and where the wind might carry them.
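A minimal sketch of the kind of aggregation such a platform enables: bucket crowd-sourced Geiger readings into a coarse geographic grid and rank the hottest cells. The coordinates, dose values and grid size here are all invented for illustration; a real feed would stream from the platform's API.

```python
# Hypothetical aggregation of crowd-sourced Geiger readings into a
# coarse grid, ranking the hottest cells. All values are illustrative.
from collections import defaultdict

readings = [
    # (latitude, longitude, microsieverts per hour) -- invented numbers
    (37.42, 141.03, 2.10),
    (37.45, 141.01, 1.80),
    (37.76, 140.47, 0.65),
    (35.68, 139.69, 0.08),
]

def grid_cell(lat, lon, size=0.5):
    """Snap a coordinate to a size-degree grid cell."""
    return (round(lat / size) * size, round(lon / size) * size)

cells = defaultdict(list)
for lat, lon, dose in readings:
    cells[grid_cell(lat, lon)].append(dose)

# Average the readings in each cell and rank the hottest areas first.
hotspots = sorted(
    ((cell, sum(v) / len(v)) for cell, v in cells.items()),
    key=lambda item: item[1],
    reverse=True,
)
for cell, avg in hotspots:
    print(f"cell {cell}: {avg:.2f} uSv/h")
```

Adding a wind vector per cell, as the Pachube applications did, would then let the map project where the highest readings are likely to drift.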
This is a real-life example of machine-to-machine (M2M) communication, the technology behind the Internet of Things that could enable smart cities of the future. M2M allows electronic devices to communicate with one another, typically via embedded SIM cards that connect wireless sensors to the mobile internet for management, monitoring and service delivery.
Momentum behind the technology has been growing at a rapid pace, due to the falling cost of sensors and processors and the introduction of operational smart meters that allow people to monitor and control their energy use, proving that M2M is more than just a pipe dream.
However, the Internet of Things is about much more than just home energy monitoring – it has a role to play in applications and services as diverse as remote health monitoring and diagnosis, dynamic energy loading, and traffic management and road pricing.
For example, US-based company Vitality offers GlowCaps, a connected pillbox that reminds patients to take their medication by flashing and playing a ring tone, automatically orders refills from the local pharmacy, and keeps the doctor updated on the patient's adherence to prescriptions.
Logistics firms such as UPS use M2M in their vehicle fleets to optimise driving routes and provide live package tracking information for customers, while several insurance companies now offer usage-based insurance packages which set rates based on actual driving habits.
Forecasts by analysts promise anything from 12 billion to 50 billion connected devices worldwide by 2020 – up from just 1 billion in 2010 – and in the UK, the public and private sector alike are investing in the sensors and networking equipment needed to make future connected cities a reality.
The government recently signed a partnership with chip manufacturer Intel to turn London into a testbed for smart technologies ahead of the Olympics, and is also backing a new smart cities consortium led by Living PlanIT to develop applications that will enable communities to live and work in “an intelligent, efficient and sustainable urban environment”.
The Technology Strategy Board (TSB) has launched a competition urging local authorities to turn their cities into “smart cities” by integrating transport, communications and other city infrastructure, for the chance to win £24 million of government investment.
Meanwhile, chip manufacturers such as Intel and ARM have been investing in M2M processors, and telcos have also been getting involved, with Telefonica UK signing a deal with software developer Jasper Wireless to deploy an M2M management platform last month, and Deutsche Telekom launching the first online marketplace for M2M technologies.
Even the academics are playing their part; computer scientists at the University of Glasgow recently announced plans to develop an internet search engine that uses sensors located in the physical world to provide answers to search queries.
While the industry is tackling many of the technological challenges thrown up by the Internet of Things, one of the biggest barriers to M2M growth is the shortage of accessible data. Putting sensors in appliances is all very well, but if that data cannot move outside the building it becomes difficult to exploit.
According to Maurizio Pilu, Lead Technologist (Digital) at the TSB, developers need to provide incentives for organisations to unlock data by demonstrating how it can be used to create new applications and services through APIs and other mechanisms.
“You have to unlock data, you have to break silos, you have to think about service enablement on top of that data, and you have to think about what you can do with that service enablement to think up new applications and services that were not possible before,” he said.
Opening up access to data is not a simple process, especially if that data is personal, as it raises security and liability issues should it fall into the wrong hands. These concerns are far from hypothetical: if the sensors in your house can reveal whether you are at home, that same data could help a burglar time a break-in.
However, the rise of social networks such as Facebook and Twitter has shown that the market is willing to tolerate a certain amount of personal data release in exchange for valuable services, and for the companies that get the balance right, there is real opportunity.
“Opening up data might conjure images of everything being free, but it's really about moving one step further and showing there are benefits,” said Pilu. “We're not trying to boil the ocean. Once you prove the incentives are there, the private sector will buy in because they will see the opportunity.”
One way of unlocking data would be to make M2M systems more open and interoperable. Pilu said that adoption is currently being hindered by companies building proprietary systems that lock in data, creating siloed microcosms of cities, solutions or applications.
“History teaches us that while it's OK up to a point, the market really takes off when those microcosms start talking to each other, start actually having something in common,” he said.
David King, CTO of business and technology service company Logica, said that even if M2M systems are initially created for a specific purpose, they should be based on open standards, so that they can be programmed to interoperate in the future.
“You start out with the things that are immediately achievable, but you want the big picture in mind that you can move from that stage to more adventurous stages as you get more rollout and better understanding,” he said.
Work with what you've got
Despite the need to unlock more data, there is already an enormous amount of available data that is currently not being used. Pilu said that the Internet of Things is about allowing people to do what they already do better, faster, cheaper and more conveniently.
For example, an elderly woman living alone might want a system that alerts doctors if she has an accident. That could mean installing cameras throughout her house or fitting sensors all over her body, but both options are highly invasive. Instead, her existing energy meter could flag irregularities, such as the boiler remaining off on a cold night, or the kettle not being switched on in the morning.
“By checking data that is available, you can achieve quite a lot of silent monitoring,” said Pilu. “It's not about spending money on more sensors, it's about taking whatever is out there, building on top of it and trying to prove that there are plenty of reasons for doing more of that.”
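The kind of rule-based check Pilu describes can be sketched in a few lines. This assumes half-hourly wattage readings from a smart meter; the threshold and all sample values are invented for illustration.

```python
# Hypothetical "silent monitoring" check over smart-meter data: flag a
# morning in which no kettle-sized usage spike ever appears.
# Readings are watts sampled every half hour from 06:00 to 10:00;
# all numbers are invented for illustration.

KETTLE_SPIKE_WATTS = 1500  # assumed threshold for a kettle-sized load

def morning_kettle_seen(readings, threshold=KETTLE_SPIKE_WATTS):
    """Return True if any reading shows a kettle-sized spike."""
    return any(watts >= threshold for watts in readings)

typical_morning = [80, 95, 2100, 300, 120, 110, 90, 85]  # kettle at 07:00
quiet_morning = [80, 85, 90, 88, 92, 87, 85, 86]         # no spike at all

for label, readings in [("typical", typical_morning), ("quiet", quiet_morning)]:
    if morning_kettle_seen(readings):
        print(f"{label}: normal activity")
    else:
        print(f"{label}: no kettle use detected -- alert a carer")
```

The point is that no new sensors are needed: the anomaly signal falls out of data the meter already collects.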
It is not just existing data but also existing infrastructure that can be used to enable M2M. Although many assume that mobile networks need to be upgraded before the Internet of Things can become a reality, King claims that M2M can work very well at respectable scale on existing networks.
However, as the market for M2M grows, and connected devices start to proliferate, mobile networks are going to start looking very different, according to King.
“M2M works fine but it is sitting on telecoms networks that are tuned for people. At the moment it looks like a relatively small number of connections for any mobile phone tower, each of which is carrying a relatively large amount of data,” he said.
“It's a completely different picture for M2M. It's ten, a hundred, maybe a thousand times as many devices as people, each with a tiny amount of data, and that's a very different profile for the radio networks to carry, so the scaling becomes different.”
He added that billing, service assurance and provisioning of new SIM cards are also very different for machines compared to humans, so telcos are going to need a different service management infrastructure, and businesses will need to be prepared to take these services to the end users.
Opportunities and standards
While a great deal can be done with the data and infrastructure that is already available, developments in networking and storage technology are likely to have a big impact on the Internet of Things. In particular, the shift to version six of the Internet Protocol (IPv6) means that there will be some 340 trillion trillion trillion addresses, enough for every electronic appliance to have its own IP address.
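The arithmetic behind that figure is easy to check: IPv6 uses 128-bit addresses, so the address space is 2 to the power of 128.

```python
# The IPv6 address space is 2**128 addresses. "340 trillion trillion
# trillion" is 340 * (10**12)**3 = 3.4e38, which checks out:
addresses = 2 ** 128
print(addresses)           # 340282366920938463463374607431768211456
print(f"{addresses:.2e}")  # roughly 3.40e+38
assert addresses > 340 * 10 ** 36
```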
Moreover, the arrival of 4G mobile technology will provide more network capacity, and will also be able to deal with many more data units in smaller chunks, according to Pilu. Innovations in cloud storage and big data analytics will also help organisations manage and process the vast quantities of data collected via M2M systems, and turn it into meaningful information.
A recent report by the Economist Intelligence Unit, sponsored by SAP, states that early M2M applications sought to connect devices directly to each other, whereas now all devices can report their status to the cloud, allowing customer applications or analytics to be performed at a higher aggregate level.
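A sketch of the pattern the report describes, with each device publishing its state to the cloud for aggregate analysis. The message format, field names and device identifier here are all invented; real platforms define their own schemas.

```python
# Hypothetical device-to-cloud status report: the device packages its
# latest readings as JSON for a cloud service to aggregate and analyse.
# All field names and values are invented for illustration.
import json
from datetime import datetime, timezone

def build_status_report(device_id, readings):
    """Package a device's latest readings as a cloud-bound JSON message."""
    return json.dumps({
        "device_id": device_id,
        "reported_at": datetime.now(timezone.utc).isoformat(),
        "readings": readings,
    })

report = build_status_report("meter-0042", {"energy_kwh": 12.7, "status": "ok"})
parsed = json.loads(report)
print(parsed["device_id"])
```

The analytics then run over thousands of such reports in the cloud, rather than over point-to-point links between individual devices.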
As with other technology developments, however, the maturation of M2M calls for the implementation of standards and protocols to enable greater interoperability. Pilu said that using the internet as the platform for the Internet of Things has allowed it to grow organically.
“That's what a lot of standards are moving towards, they are thinking about IP as the bread and butter for this kind of stuff, rather than proprietary standards and so on, so there is going to be a lot of hybrid solutions for a long time,” he said.
Pilu added that standardisation would allow many more devices to work together and achieve economies of scale, and that component manufacturers would be able to develop for hundreds or thousands of applications without worrying about how each one works.
King praised the European Telecommunications Standards Institute (ETSI) for partnering with several other standards agencies to create a global M2M standard, but said that it needs to go further.
“ETSI has done quite a good job on getting draft standards out for devices and platforms, although not for the applications software yet, so that will help. But nevertheless there are a lot of proprietary platforms out there,” he said. “Having standards makes it possible to create the environment where the market can naturally grow.”
Internet of Things
The market potential for M2M is clearly massive, with technology market research firm ABI Research predicting annual revenue of $35 billion in 2016. Given the diverse applications of M2M, however, some sectors will inevitably grow faster than others.
With the public sector supporting smart city projects, perhaps it will not be long before our street lamps dim automatically, our buildings self-regulate temperature, and our traffic jams magically untangle themselves; meanwhile, private investment in healthcare, automotive and energy M2M could transform the way these sectors operate.
Just as the mobile operators only realised their full potential when they became interoperable, so M2M will arguably only become ubiquitous when the ecosystem becomes more open, and provides a platform for others to build on. Cheap devices, modified networks and systems integrators will all have their role to play in driving the M2M market.
Ultimately, however, the success of M2M will rely on providing useful and compelling applications that ensure data security and preserve people's right to privacy. If companies start getting that right, it will only be a matter of time before the wider world catches on, and makes the global Internet of Things a reality.