Technology behind the Internet of Things

As the Internet developed, more kinds of (increasingly mobile) computing devices became connected, and Web servers delivered ever-richer content with which they could interact. Although this first Internet/Web revolution changed the world profoundly, the next disruptive development, in which the majority of Internet traffic will be generated by “things” rather than by human-operated computers, has the potential to change it even more.

This “Internet of Things” (IoT), or “Machine to Machine” (M2M) communication, is well underway—after all, microprocessors are to be found in all manner of “things”: domestic white goods, cars, credit cards, your passport, your family pet, the CCTV camera in your street, the lift (elevator) in your office, and many more. Add the magic ingredient of Internet connectivity (or the ability to be read by an Internet-connected device), bake with applications and services that make use of the data gathered by this vastly expanded
network, and you’ve cooked up another technology revolution.

Data transfer patterns in the M2M-driven Internet of Things will differ fundamentally from those in the classic “human-to-human” (H2H) Internet. M2M communications will feature orders of magnitude more nodes than H2H, most of which will create low-bandwidth, upload-biased traffic; H2H Internet traffic, by contrast, is download-biased and higher-bandwidth. Many M2M applications will need to deliver and process information in real time, or near-real-time, and many nodes will have to be extremely low-power or self-powered (e.g., solar-powered) devices.

The “things” in the IoT, or the “machines” in M2M, are physical entities whose identity and state (or the state of whose surroundings) are capable of being relayed to an Internet-connected IT infrastructure. Almost anything to which you can attach a sensor—a cow in a field, a container on a cargo vessel, the air-conditioning unit in your office, a lamppost in the street—can become a node in the Internet of Things.

Sensors are the components of “things” that gather and/or disseminate data—be it on location, altitude, velocity, temperature, illumination, motion, power, humidity, blood sugar, air quality, soil moisture… you name it. These devices are rarely computers, as we generally understand them, although they may contain many or all of the same elements (processor, memory, storage, inputs and outputs, OS, software). The key point is
that they are increasingly cheap and plentiful and can communicate, either directly
with the Internet or with Internet-connected devices.

Local communication: All IoT sensors require some means of relaying data to the outside world. There’s a plethora of short-range, or local area, wireless technologies available, including: RFID, NFC, Wi-Fi, Bluetooth (including Bluetooth Low Energy), XBee, Zigbee, Z-Wave, commodity remote control and sensing communication and Wireless M-Bus. There’s no shortage of wired links either, including Ethernet, HomePlug, HomePNA, HomeGrid/G.hn, and LonWorks.

For long-range, or wide-area, links, there are existing mobile networks (using GSM, GPRS, 3G, LTE, or WiMAX for example) and satellite connections. New wireless networks, such as the ultra-narrowband SIGFOX and the TV white-space Zero-G, are also emerging to cater specifically to M2M connectivity. Fixed “things” in convenient locations could use wired Ethernet or phone lines for wide-area connections. Some modular sensor platforms can be configured with multiple local- and wide-area connectivity options (ZigBee, Wi-Fi, Bluetooth, GSM/GPRS, RFID/NFC, GPS, Ethernet). Along with the ability to connect many kinds of sensors, this allows devices to be configured for a range of vertical markets.

Some types of M2M installations, such as a smart home or office, will use a local
server to collect and analyse data—both in real time and episodically—from assets on
the local area network. These on-premise servers or simpler gateways will usually also connect to cloud-based storage and services.

“Things” with short-range sensors will often be located in a restricted area but not permanently connected to a local area network (RFID-tagged livestock on a farm
or credit-card-toting shoppers in a mall, for example). In this case, local scanning
devices will be required to extract data and transmit it onward for processing.

If you think today’s Internet generates a lot of data, the Internet of Things will be another matter entirely. That will require massive, scalable storage and processing capacity, which will almost invariably reside in the cloud—except for specific localised or security-sensitive cases. Service providers will obviously have access here, not only to curate the data and tweak the analytics, but also for line-of-business processes, such as customer relations, billing, technical support, and so on.

Subsets of the data and analyses from the IoT will be available to users or subscribers, presented via easily accessible and navigable interfaces on a full spectrum of secure client devices. M2M and the Internet of Things have huge potential, but currently comprise a heterogeneous collection of established and emerging, often competing, technologies and standards (although standardization moves are afoot). This is because the concept applies to, and has grown from, a wide range of market sectors.

(from “The tech behind M2M and the Internet of Things” by Charles McLellan, in “The Executive’s Guide to the Internet of Things”, TechRepublic/ZDNet, 2013)

Electric Imp

Electric Imp has built, and is still building, a core connectivity platform for the Internet of Things through an innovative solution that delivers a powerful cloud service tied closely to leading-edge hardware, making it simple to connect devices to the Internet. Its unique WiFi-enabled solution dramatically decreases cost and time-to-market and is reliable and secure, empowering manufacturers and developers to manage and quickly scale their connected products and services to millions of users.

www.electricimp.com

The Internet Connection

While local wireless communication inside the house has been a solved problem for many years (see above), the Internet connection still presents challenges, and device manufacturers often lack the expertise to solve the connectivity puzzle.

Core Requirements of a Connected Product

It all starts with choosing the wireless chip, studying the networking protocols, defining a hardware design, programming the processor, and solving security and approval issues. Then servers have to be set up, software has to be released and managed, network firmware defined, and finally everything prepared for the end user. This is an exercise of one to two years or more, and in consumer electronics today a time-to-market of more than one year is suicide.

Hardware manufacturers lack the necessary domain expertise and shy away from working with prototype or experimental solutions. This approach is also costly and resource-intensive: manufacturers have to invest heavily and prepare themselves for ongoing operational support. Long development cycles make it difficult to iterate on solutions, so the best consumer value gets lost. Last but not least, security and privacy requirements are hard for hardware manufacturers to pin down. The cultural and political differences between Europe, where the technology will be used, and Asia, where it is manufactured, are large in this respect. Even within Europe it will be difficult to identify the tipping point between usefulness and invasiveness.

Software Compatibility

Home automation and security devices are not standardized at the software level. Their functions are well understood by users (everybody has a certain expectation of what a door sensor or switchable plug does), but how the signals are processed is proprietary to each manufacturer. If a brand wants to offer a solution that works across different sensors and switches, it has to define its own standards and require its suppliers to comply with them.

The devices (sensors and switches) provide, in principle, very simple data compared to a camera: on/off, open/close, smoke/no smoke, motion or no motion at the lowest level, and a single value for dimming a light or reading a temperature. The data also arrives at a much lower frequency than from a camera: 1 bit per minute (or less) compared with 1 Mbit per second.
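
To put this difference into numbers, here is a minimal back-of-the-envelope calculation, using only the illustrative figures above (1 bit per minute versus 1 Mbit per second):

    # Rough data-rate comparison between a simple home sensor and a camera.
    # The figures are the illustrative values from the text, not measurements.
    sensor_bits_per_minute = 1               # e.g. a door contact reporting once a minute
    camera_bits_per_minute = 1_000_000 * 60  # 1 Mbit/s sustained video stream

    ratio = camera_bits_per_minute / sensor_bits_per_minute
    print(f"A camera produces roughly {ratio:,.0f} times more data than a sensor")
    # -> A camera produces roughly 60,000,000 times more data than a sensor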

The devices deliver or receive this small amount of data through a proprietary wireless channel that is connected to a gateway. For a standardized approach it is not the device that has to be changed; it is sufficient if the gateway communicates with a number of devices, understands their protocols, and translates them into a standard format. Software on the cloud server can then deal with the standardized data format without adapting to each manufacturer’s specialties. All hardware- and manufacturer-dependent information is kept in the firmware of the gateway.
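
As an illustration of this translation step, the following Python sketch shows a gateway decoding a hypothetical proprietary sensor frame and re-emitting it as a standardized JSON message. The frame layout, field names and device types are invented for the example and do not correspond to any particular manufacturer’s protocol.

    import json
    import time

    def decode_proprietary_frame(frame: bytes) -> dict:
        """Decode a hypothetical 4-byte frame: [device id, device type, value, checksum]."""
        device_id, device_type, value, checksum = frame
        if (device_id + device_type + value) & 0xFF != checksum:
            raise ValueError("checksum mismatch")
        return {"device_id": device_id, "device_type": device_type, "value": value}

    def to_standard_message(reading: dict) -> str:
        """Map the vendor-specific reading onto a gateway-neutral JSON format."""
        kind_names = {0x01: "door_contact", 0x02: "power_plug", 0x03: "temperature"}
        return json.dumps({
            "device": f"sensor-{reading['device_id']}",
            "kind": kind_names.get(reading["device_type"], "unknown"),
            "value": reading["value"],
            "timestamp": int(time.time()),
        })

    # Example: door contact number 7 reports "open" (value 1).
    frame = bytes([7, 0x01, 1, (7 + 0x01 + 1) & 0xFF])
    print(to_standard_message(decode_proprietary_frame(frame)))

Because the cloud software only ever sees the neutral format, supporting a new device family then only requires a firmware update on the gateway.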

Cloud Server Operation

Every gateway manufacturer has to think about a cloud server solution. Although it is not too difficult for IT professionals to set up such a server, either in the company’s server room or on a rented server, the total cost of this installation is quite significant. For home automation and security applications no customer will accept downtime, so the server has to be operated 24 hours a day, 7 days a week, which requires 4 shifts of operating personnel. Even in low-salary countries the annual cost for wages and servers will easily reach USD 60,000; in Europe it is more like USD 100,000 per year or higher. This cost is only justified if there are enough users to share it: 10,000 users would each have to pay at least USD 10 per year, which will be difficult to explain to customers today.

That means that server functions need to be outsourced to companies that serve more users and can offer better rates. A target is USD 1 per user per year.
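
The arithmetic behind these figures can be sketched in a few lines; the cost figures are the ones quoted above, while the user counts are illustrative:

    # Annual cost of running a 24/7 home-automation cloud server (figures from the text).
    annual_cost_low_salary_country = 60_000   # USD per year
    annual_cost_europe = 100_000              # USD per year

    # With 10,000 users, each user has to carry at least:
    users = 10_000
    print(annual_cost_low_salary_country / users)  # 6.0 USD per user per year
    print(annual_cost_europe / users)              # 10.0 USD per user per year

    # To reach the USD 1 per user per year target, one European-cost operation
    # would have to be shared by roughly:
    target = 1  # USD per user per year
    print(annual_cost_europe / target)  # 100,000 users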

It also makes clear that server operation should not remain the responsibility of the hardware manufacturers. If a brand does not want to depend on a single hardware manufacturer, each manufacturer would otherwise have to offer its own solution, which means a higher risk of failure and higher cost. Even if a manufacturer does not charge directly for the server, it will either have to recover the development cost through higher product prices or run the risks of operating the servers itself.

Addressing devices

For Internet access, sensors and controls either have to be connected directly to an Internet node (e.g., by Wi-Fi), or a gateway gathers the local data and makes it available to the Internet. The gateway sits inside a local network and cannot easily be reached from outside. As long as IPv4 addresses dominate the market, households have no static IP address but are served through changing IP addresses, which makes it difficult to address hardware inside the network from outside. Dynamic DNS (DDNS) services can associate a name with a server inside the local network, but setting up such services is typically not something consumers will do.

With IPv6 addresses this will change, since every device on the Internet can get its own individual address and can thus be reached from everywhere. Although this is technically feasible today, it will still take 5 to 10 years before IPv6 becomes the standard addressing scheme.

Until that is solved, connections to devices inside a local network have to be made via a cloud server, with which both the device and the human (or machine) interface first register before the server connects them to each other.
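
A minimal, in-memory sketch of this rendezvous pattern follows; the class and method names (RelayServer, report_state, poll_commands) and the device identifier are invented for illustration, and a real deployment would add networking, authentication and encryption. The key point is that both the home gateway and the smartphone client only ever make outbound requests to the cloud server, which stores state and relays commands, so nothing behind the home router has to accept inbound connections.

    # In-memory sketch of a cloud rendezvous/relay server (hypothetical API, no real networking).
    class RelayServer:
        def __init__(self):
            self.devices = {}   # device_id -> list of pending commands
            self.readings = {}  # device_id -> latest reported state

        def register_device(self, device_id: str):
            """Called when the home gateway opens its outbound connection."""
            self.devices.setdefault(device_id, [])

        def report_state(self, device_id: str, state: dict):
            """Gateway pushes sensor state up to the cloud."""
            self.readings[device_id] = state

        def send_command(self, device_id: str, command: dict):
            """Smartphone or web client queues a command for the device."""
            self.devices[device_id].append(command)

        def poll_commands(self, device_id: str) -> list:
            """Gateway periodically polls (or keeps a connection open) to fetch commands."""
            pending, self.devices[device_id] = self.devices[device_id], []
            return pending

    server = RelayServer()
    server.register_device("home-gateway-42")
    server.report_state("home-gateway-42", {"door_contact": "closed"})
    server.send_command("home-gateway-42", {"power_plug": "on"})
    print(server.poll_commands("home-gateway-42"))  # [{'power_plug': 'on'}]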

Wireless technologies and cost aspects

In order to let users install home automation and security devices themselves, it is important that these devices are wireless. At the same time, sensors and controls should be accessible through the Internet, so that smartphones and PCs can connect to them regardless of where they are.

The current technologies for wireless communication include

-       Infrared (as for remote controls)

-       Radio frequency communication in the free ISM bands (in Europe 433 MHz, 868 MHz, 2.4 GHz) with standardized protocols (such as Z-Wave, EnOcean or Zigbee) or vendor-proprietary protocols

-       Bluetooth (especially version 4.0 low energy, BLE)

-       Wifi according to IEEE 802.11b/g/n… (as we use it for our numerous IT applications)

-       Radio frequency communication in TV white space (“Weightless”, 400 to 800 MHz)

-       Cell phone communication according to the GSM or CDMA standards

To connect to the Internet, each of these technologies needs a gateway to the cable-bound copper or fiber-optic Internet grid. Some of these gateways are in the same house as the sensors and controls (ISM, Wifi); some are part of the network infrastructure outside the house (cell phone, Weightless).

The current markets for remote sensors and controls concentrate on local connections (10 m to 300 m range). An alarm system can connect a number of devices to a base station, which provides the user interface for arming/disarming, status display and alarm functions. A remote-controlled power plug can be reached by a handheld, battery-operated remote. This means that a mature supplier market already exists for most remote-controlled functions, which is good news, since time to market is dramatically shortened if you can draw on a whole portfolio of products that can already connect to the real world.

Another important aspect is the cost per intelligent device. Intelligence is typically provided by a microcontroller unit (MCU), a small computer with peripheral functions such as analog or digital inputs and outputs, plus a wireless communication interface. The cost of this functionality depends heavily on the technology. The table below gives an indication of the cost per node in 2013 and the expected cost in 2014:

Technology      Node cost 2013 (USD)    Node cost 2014 (USD)
Infrared        0.80                    0.80
RF 433 MHz      1.50                    1.50
RF 868 MHz      2.50                    2.20
Z-Wave          6.50                    6.00
Zigbee          5.00                    4.00
Bluetooth       3.00                    2.00
Wifi            15.00                   10.00
GSM             12.00                   11.00

Comparison of node cost for different wireless technologies
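
As a rough illustration of how the choice of radio technology drives the bill of materials, the following sketch sums the 2013 node costs from the table above for a hypothetical installation of ten sensors and switches:

    # Radio/MCU node cost in USD (2013 column of the table above).
    node_cost_2013 = {
        "Infrared": 0.80,
        "RF 433 MHz": 1.50,
        "RF 868 MHz": 2.50,
        "Z-Wave": 6.50,
        "Zigbee": 5.00,
        "Bluetooth": 3.00,
        "Wifi": 15.00,
        "GSM": 12.00,
    }

    # Hypothetical installation: 10 sensors/switches in one home.
    nodes = 10
    for technology in ("RF 868 MHz", "Z-Wave", "Wifi"):
        print(f"{technology}: {nodes * node_cost_2013[technology]:.2f} USD")
    # RF 868 MHz: 25.00 USD, Z-Wave: 65.00 USD, Wifi: 150.00 USD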