IoT connectivity solutions: Media access control layer and network topology


Media access control layer and network topology

For IoT applications, the main characteristics of the media access control (MAC) layer that need to be considered are multiple access, synchronization, and network topology.

Multiple Access. Looking back at decades of successful cellular system deployment, one can safely conclude that TDMA is a good fit for the IoT. TDMA is suited for low-power operation with a decent number of devices, as it allows for optimal scheduling of inactive periods. Hence, TDMA is selected for multiple access in the MAC layer.
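
To make the scheduling argument concrete, here is a minimal Python sketch of a TDMA coordinator handing each device one uplink slot per frame; the slot duration, frame length, and device count are illustrative assumptions, not values from any particular standard.

```python
# Minimal TDMA scheduling sketch (all parameters are illustrative assumptions).
SLOT_MS = 10            # assumed slot duration
SLOTS_PER_FRAME = 100   # assumed frame length -> a 1 s frame

def build_schedule(device_ids):
    """Assign each device one uplink slot per frame; it can sleep in all others."""
    if len(device_ids) > SLOTS_PER_FRAME:
        raise ValueError("more devices than slots: lengthen the frame or add channels")
    return {dev: slot for slot, dev in enumerate(device_ids)}

schedule = build_schedule(range(25))
radio_duty_cycle = 1 / SLOTS_PER_FRAME   # the radio is awake only in its own slot
print(f"device 3 transmits in slot {schedule[3]}, "
      f"radio duty cycle ≈ {radio_duty_cycle:.1%}")
```

Because the inactive slots are known in advance, a device can power its radio down deterministically instead of listening for contention.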

Synchronization. In IoT applications, there are potentially a very large number of power-sensitive devices with moderate throughput requirements. In such a configuration, it is essential to maintain a reasonably consistent time base across the entire network and potentially across different networks. Given that throughput is not the most critical requirement, it is suitable to follow a beacon-enabled approach, with a flexible beacon period to accommodate different types of services.
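
As a rough illustration of the beacon-enabled approach, the sketch below shows how a node could time its wake-ups around a periodic beacon, with a guard interval sized to worst-case clock drift; the beacon period and crystal tolerance are assumed values.

```python
# Beacon-tracking sketch: sleep between beacons, wake slightly early to absorb
# clock drift (beacon period and drift figures are assumptions).
BEACON_INTERVAL_S = 1.0   # assumed beacon period
CLOCK_DRIFT_PPM = 40      # assumed crystal tolerance
GUARD_S = BEACON_INTERVAL_S * CLOCK_DRIFT_PPM * 1e-6   # worst-case drift per period

def next_wakeup(last_beacon_ts: float) -> float:
    """Wake one guard interval before the next expected beacon."""
    return last_beacon_ts + BEACON_INTERVAL_S - GUARD_S

last_beacon = 1000.0   # timestamp of the last received beacon, in seconds
print(f"sleep until t = {next_wakeup(last_beacon):.6f} s "
      f"(guard = {GUARD_S * 1e6:.0f} µs)")
```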

Network topology. Mobile networks using a cellular topology have been efficiently servicing a large number of devices with a high level of security and reliability, e.g., 5,000+ devices per base station for LTE in urban areas. This topology is based on a star topology within each cell, while the cells are connected in a hierarchical tree in the network backhaul. This approach is regarded as suitable for the IoT and is therefore selected.

The network layer and interface to applications

The network layer (NWK) and the interface to applications are less fundamental as far as power efficiency and reliability are concerned, and there is more variation across the field of IoT applications. Nevertheless, it is widely acknowledged that IoT applications need to support the Internet Protocol (IP), whether IPv4 or IPv6. In addition, the User Datagram Protocol (UDP) and the Constrained Application Protocol (CoAP) can provide the relevant trade-off between flexibility and implementation complexity on resource-constrained devices.
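
As a rough illustration of how lightweight this stack can be, the sketch below hand-builds a minimal CoAP GET request (header layout per RFC 7252) and sends it over a plain UDP socket; the server address and resource path are made-up examples.

```python
# Minimal CoAP-over-UDP sketch (header layout per RFC 7252; address and path
# below are illustrative assumptions).
import socket
import struct

def coap_get(message_id: int, path: str) -> bytes:
    # Byte 0: version=1, type=0 (confirmable), token length=0 -> 0x40
    # Byte 1: code 0.01 (GET); bytes 2-3: message ID
    header = struct.pack("!BBH", 0x40, 0x01, message_id)
    # One Uri-Path option (option number 11); the short one-byte delta/length
    # form used here requires the path to be shorter than 13 bytes.
    option = bytes([(11 << 4) | len(path)]) + path.encode()
    return header + option

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(2.0)
sock.sendto(coap_get(0x1234, "temperature"), ("192.0.2.10", 5683))  # 5683 = CoAP port
try:
    response, _ = sock.recvfrom(1500)
    print(response.hex())
except socket.timeout:
    print("no response (expected unless a CoAP server is listening)")
```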

Furthermore, the IoT will represent an immense security challenge, and it is likely that state-of-the-art security features will become necessary. As of today, we can assume that 128-bit Advanced Encryption Standard (AES) encryption and Diffie-Hellman (DH) key exchange, or its Elliptic Curve variant (ECDH), will become the baseline for securing communication.
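
A minimal sketch of that baseline, using a recent version of the third-party Python `cryptography` package (an implementation choice for illustration, not something mandated by any IoT standard): an ECDH exchange on the P-256 curve derives a shared secret, which is then turned into a 128-bit AES-GCM key.

```python
# ECDH key agreement + AES-128-GCM sketch (curve, KDF parameters, and payload
# are illustrative assumptions).
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each side generates an ephemeral key pair on the P-256 curve.
alice_priv = ec.generate_private_key(ec.SECP256R1())
bob_priv = ec.generate_private_key(ec.SECP256R1())

# ECDH: both sides compute the same shared secret from their own private key
# and the peer's public key.
alice_secret = alice_priv.exchange(ec.ECDH(), bob_priv.public_key())
bob_secret = bob_priv.exchange(ec.ECDH(), alice_priv.public_key())
assert alice_secret == bob_secret

# Derive a 128-bit AES key from the shared secret.
aes_key = HKDF(algorithm=hashes.SHA256(), length=16, salt=None,
               info=b"iot-session").derive(alice_secret)

# AES-128 in GCM mode for authenticated encryption of a sensor reading.
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, b'{"temp": 21.5}', None)
print(AESGCM(aes_key).decrypt(nonce, ciphertext, None))
```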

Internet of Things Connectivity Option: Cellular Network Technologies

[Figure: frequency reuse in a cellular network]

Review of Existing Cellular Network Technologies: The Pros and Cons

With all the shortcomings of the incumbent technologies discussed above, one might be surprised by the absence of the most widely used and proven communication technology by far: the cellular system. Indeed, current cellular technologies manage to fulfill some of the requirements for ‘good’ IoT networks, most notably the coexistence of many simultaneously connected devices, low interference, high reliability, long range, and the ability to service both low-data-rate, latency-sensitive applications and high-data-rate applications on the same infrastructure. However, current cellular technologies have characteristics that rule them out for most emerging IoT applications. This section reviews the most prominent existing cellular technologies.

  • 2G (GSM / GPRS / EDGE): 2G is power-efficient thanks to its Time Division Multiple Access (TDMA) nature and narrow 200 kHz channel bandwidth, is relatively low-cost, and offers very long range, especially in its 900 MHz band. 2G is no longer actively maintained or developed, which opens the possibility of re-farming or even re-auctioning its frequency bands, potentially for IoT technologies.
  • 3G (UMTS / WCDMA / HSPA): 3G is power-hungry by design: it requires continuous and simultaneous (full-duplex) receive and transmit, uses Code Division Multiple Access (CDMA), which has proven less power-efficient than TDMA, relies on a wide 5 MHz channel bandwidth to achieve high data rates (Wideband CDMA), and carries high complexity, especially for dual-mode 2G/3G devices. WCDMA is not well suited for the IoT. Even for cellular use, WCDMA has evolved back from pure CDMA to time-slotted High Speed Packet Access (HSPA) for higher data rates, and even to pre-allocated timeslots for lower power consumption in HSPA+. In addition, its frequency-division duplexing means dedicated spectrum for uplink and downlink, so it is best suited for symmetric traffic, which is not typical of IoT clients. It is well known that battery life is characteristically shorter in 3G mode than in 2G mode, whether in idle state or during a low-data-rate (around 12 kbps) voice call.
  • 3G (CDMA2000 1xRTT, 1x EV-DO (Evolution-Data Only)): An evolution of IS-95/cdmaOne, the first CDMA technology, developed by Qualcomm, it shares most of its fundamental characteristics with WCDMA, although with a narrower channel bandwidth of 1.25 MHz.
  • Chinese 3G (UMTS-TDD, TD-SCDMA): Time Division Synchronous Code Division Multiple Access (TD-SCDMA) was developed in the People’s Republic of China by the Chinese Academy of Telecommunications Technology, Datang Telecom, and Siemens AG, primarily as a way to avoid the patent and license fees associated with other 3G technologies. As a late-coming 3G technology with a single license granted to China Mobile and deployment only starting in 2009, TD-SCDMA is not widely adopted, and most likely never will be (as it will be deprecated by LTE deployments). TD-SCDMA differs from WCDMA in the following ways. First, it relies on time-division synchronous CDMA in a narrower 1.6 MHz channel bandwidth (1.28 Mcps). Second, it uses time-division duplexing with dedicated uplink and downlink time slots. Third, its network is synchronous, with all base stations sharing a common time base. Fourth, it provides lower data rates than WCDMA, but its time-slotted nature provides better power efficiency along with lower complexity. Fifth, it can outperform GSM battery life in idle state and perform similarly in a voice call, which is significantly better than WCDMA. Finally, as opposed to WCDMA, TD-SCDMA requires neither continuous nor simultaneous transmit and receive, allowing for simpler system design and lower hardware complexity and cost. These differences make TD-SCDMA more suitable than WCDMA for asymmetric traffic and dense urban areas. Although TD-SCDMA is still too power-hungry to cover the most constrained IoT use cases, it could be considered the most suitable existing cellular technology for the IoT.
  • 4G (LTE): 4G is more power-efficient than 3G and has reduced complexity thanks to its data-only architecture (no native voice support) and its limited backward compatibility with 2G/3G. It uses an Orthogonal Frequency Division Multiple Access (OFDMA) physical layer in a wide channel bandwidth, typically 20 MHz, to deliver high data rates, 150 Mbps and more with MIMO. Interestingly, the requirements of the IoT have been acknowledged, and some standardization efforts are aimed at lower-complexity, lower-cost Machine-to-Machine (M2M) communication. Most notably, LTE Release 12 introduces the Cat-0 device category for Machine-Type Communication (MTC), which allows for a narrower 1.4 MHz channel bandwidth and a lower peak data rate of 1 Mbps, with extended sleep modes for lower power. Release 13 is studying the feasibility of reducing the channel bandwidth further, down to 200 kHz, with a peak data rate down to 200 kbps and operation in more sub-GHz frequency bands. Release 12 is foreseen to be commercially available in 2017, and Release 13 in 2018 or later [31].

One of the main drawbacks of cellular is battery consumption, along with the cost of the hardware. The closest cellular solution to the IoT is the Intel XMM 6255 3G modem, the self-proclaimed world’s smallest 3G modem. The Intel XMM 6255 claims an area of 300 mm² in a 40 nm process (high density at the expense of higher cost and higher leakage, i.e., power consumption in sleep). Its power consumption figures are 65 uA when powered off, 900 uA in 2G or 3G idle state (with unspecified sleep cycle duration), and 580 mA in HSDPA transfer state, with a supply voltage of 3.3-4.4 V (nominal 3.8 V). By comparison, a typical IEEE 802.15.4 / ZigBee SoC in a 180 nm process comes in a 7 x 7 mm (49 mm²) QFN40 package with a sleep current below 5 uA and active receive/transmit currents under 30 mA, with a supply voltage between 2 V and 3.3 V. When normalizing to the same process, there is roughly a 100-fold increase in area from ZigBee to cellular, which reflects the complexity of the receiver and protocol and translates into much higher cost and power consumption. This underlines that, although cellular-type protocols could be very suitable for the IoT, existing cellular technologies are far too cumbersome and are overkill.
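
A back-of-the-envelope calculation using the current-draw figures quoted above makes the gap tangible; the duty cycle and battery capacity below are assumptions chosen purely for illustration.

```python
# Battery-life comparison using the current figures quoted above; the duty
# cycle and battery capacity are assumptions.
BATTERY_MAH = 1000.0
ACTIVE_S_PER_HOUR = 2.0          # assumed: two seconds of radio activity per hour

def battery_life_days(idle_ma: float, active_ma: float) -> float:
    duty = ACTIVE_S_PER_HOUR / 3600.0
    avg_ma = active_ma * duty + idle_ma * (1 - duty)
    return BATTERY_MAH / avg_ma / 24.0

print(f"3G modem (0.9 mA idle, 580 mA transfer): "
      f"{battery_life_days(0.9, 580):.0f} days")
print(f"802.15.4 SoC (0.005 mA sleep, 30 mA active): "
      f"{battery_life_days(0.005, 30):.0f} days")
```

Under these assumptions the 3G modem drains the battery in roughly a month, dominated by its idle current, while the 802.15.4 SoC lasts for years.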

Another drawback of existing cellular technologies is that they operate in licensed frequency bands. This means that a license holder needs to manage the radio resource, e.g., a network operator that charges users high rates in order to pay for the expensive spectrum licenses. With the rise of the IoT in the coming years, however, we cannot assume that the network operators will stand still. In addition, the regulatory bodies might re-assess the regulatory framework of frequency allocations.

In short, existing cellular network technologies have many characteristics that make them suitable for IoT applications. However, they suffer from the drawback of putting too much pressure on the power consumption of resource-constrained devices. In addition, they operate on scarce and expensive frequency bands. The next section presents a detailed discussion that leverages the beneficial characteristics and addresses the drawbacks of cellular technologies to define the design requirements that make cellular suitable for IoT applications.

Internet of Things wireless connectivity option analysis: Z-Wave Pros and Cons


As another asynchronous wireless networking protocol, Z-Wave is designed for home automation and remote control applications. Z-Wave originated from the Danish startup Zensys and was acquired by Sigma Designs in 2008; the Z-Wave Alliance was formed in 2005. Unlike most of the competing technologies discussed so far, Z-Wave operates in sub-GHz bands: 868.42 MHz in Europe, 908.42 MHz in the US, 916 MHz in Israel, 919.82 MHz in Hong Kong, and 921.42 MHz in Australia and New Zealand. The use of sub-GHz bands brings improved range and reliability and less interference to the Z-Wave network. Nevertheless, there are a few issues worth mentioning when applying Z-Wave to the IoT.

Z-Wave offers limited data rates and mediocre spectrum efficiency due to its Manchester-coded GFSK signaling (Manchester coding dates from 1948), which doubles the occupied spectrum for limited coding gain. Originally offering a low data rate of 9.6 kbps, Z-Wave has been upgraded to 100 kbps in its latest version. A Z-Wave network is limited to 232 nodes, and manufacturers recommend no more than 30 to 50 nodes in practical deployments. Moreover, Z-Wave makes use of relays, such as wall-mounted light switches, to forward packets when devices are out of range.

Z-Wave uses a Source Routing Algorithm (SRA), meaning that the message initiator has to embed the routing information into the packet. This implies overhead, as the route occupies space meant for the actual data payload (the sketch after the device list below illustrates this). More importantly, it means that the initiator needs to be aware of the network topology. The network topology therefore needs to be maintained and distributed to the nodes that may initiate messages. This is a complex task and is typically not manageable by an end device constrained in computing power, code size, battery capacity, and cost. Z-Wave defines different device types with different capabilities and protocol stack sizes:

  • Controllers: have the full (and largest) protocol stack, as they can initiate messages. The master controller, the Static Update Controller (SUC), maintains the network topology and handles network management.
  • Mobile controllers: can support requests for neighbor rediscovery from moving nodes by implementing the portable controller protocol stack.
  • Routing Slaves: depend on SUCs for network topology and can initiate messages to a restricted set of nodes.
  • Slaves: have the smallest protocol stack, can only reply to requests, and cannot initiate messages.
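
The sketch below, referenced earlier, illustrates the source-routing overhead with a hypothetical frame layout (the sizes are assumptions for illustration, not the actual Z-Wave frame format): every relay hop listed in the embedded route reduces the space left for the payload.

```python
# Illustrative source-routing overhead (hypothetical frame layout, not the
# actual Z-Wave frame format).
MAX_FRAME_BYTES = 64   # assumed link-layer frame size
FIXED_HEADER = 10      # assumed header/checksum overhead
BYTES_PER_HOP = 1      # one node ID per hop in the embedded route

def payload_budget(hops: int) -> int:
    """Bytes left for application data once the route is embedded in the frame."""
    return MAX_FRAME_BYTES - FIXED_HEADER - hops * BYTES_PER_HOP

for hops in (0, 2, 4):
    print(f"{hops} relay hops -> {payload_budget(hops)} bytes left for payload")
```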

When using multiple controllers in the same network, only the master (SUC) can be used for network maintenance. Whenever a Z-Wave device is added to or removed from the network, the network topology of the master controller has to be replicated manually to the secondary controllers. This makes network maintenance cumbersome.

The Source Routing Algorithm, along with the network topology management, also makes it very difficult to handle mobility. There is some support for nodes to request neighbor rediscovery; however, this is a complicated and power-consuming process. Taken together, this does not provide anything near seamless support for mobility. In addition, Z-Wave has security flaws, as can be seen from reports of successful attacks on Z-Wave devices.

Overall, Z-Wave has been quite successful thanks to the trade-offs it provides. Z-Wave is a lot simpler than ZigBee, yet it provides a sufficient set of basic functions for simple deployments in homes or small commercial spaces. Z-Wave has earned a good market share in the smart home and smart building segments, proving the benefits of sub-GHz communication. Nevertheless, the limitations outlined above prevent it from becoming a future-proof technology for upcoming IoT applications.

Internet of Things Connectivity Option Analysis: IEEE 802.15.4 technologies


Originally released in 2003, IEEE 802.15.4 defines a physical layer (PHY) and a media access control (MAC) layer on top of which others can build different network and application layers; the most well-known are ZigBee and 6LoWPAN. IEEE 802.15.4 defines operation in the 2.4 GHz band using Direct Sequence Spread Spectrum (DSSS) to alleviate narrowband interference, realizing a data rate of 250 kbps; due to the spreading, the chip rate is 2 Mchip/s. IEEE 802.15.4 also defines operation in sub-GHz bands, but has failed to take full advantage of these frequency bands: the specification only defines very low GFSK data rates, 20 kbps and 40 kbps, in the sub-GHz bands, and only allows a single channel in the European 868 MHz band (868.0-868.6 MHz). These restrictions make the 2.4 GHz variants of IEEE 802.15.4 more attractive, accounting for their wider adoption to date.
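
The relationship between the 250 kbps data rate and the 2 Mchip/s chip rate of the 2.4 GHz O-QPSK PHY follows directly from its spreading parameters, as the short calculation below shows.

```python
# 2.4 GHz IEEE 802.15.4 (O-QPSK PHY): from bit rate to chip rate.
import math

bit_rate = 250_000        # b/s
bits_per_symbol = 4       # each symbol carries 4 data bits
chips_per_symbol = 32     # each symbol is spread to a 32-chip sequence

symbol_rate = bit_rate / bits_per_symbol     # 62.5 ksymbol/s
chip_rate = symbol_rate * chips_per_symbol   # 2.0 Mchip/s
processing_gain_db = 10 * math.log10(chip_rate / bit_rate)  # ~9 dB

print(f"{symbol_rate / 1e3:.1f} ksymbol/s, {chip_rate / 1e6:.1f} Mchip/s, "
      f"processing gain ≈ {processing_gain_db:.1f} dB")
```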

The IEEE 802.15.4g amendment, entitled “Amendment 3: Physical Layer (PHY) Specifications for Low-Data-Rate, Wireless, Smart Metering Utility Networks”, was approved in March 2012. IEEE 802.15.4g improves on the low data rates by enabling the use of more sub-GHz frequency bands, e.g., 169.4-169.475 MHz and 863-870 MHz in Europe, 450-470 MHz in the US, 470-510 MHz and 779-787 MHz in China, and 917-923.5 MHz in Korea. In addition, IEEE 802.15.4g introduces Multi-Rate FSK (MR-FSK), OQPSK, and OFDM physical layers, all applicable to these sub-GHz bands. Nevertheless, MR-FSK can still only achieve 200 kbps in the European 863-870 MHz band by using filtered 4FSK modulation. Higher data rates require MR-OFDM, which may prove inappropriately complex for low-cost and low-power devices. These new physical layers also bring additional complexity from the support of more advanced Forward Error Correction (FEC) schemes, plus backward compatibility hassle, as support for the previous FSK and OQPSK physical layers is mandated. Despite sensible technical choices that are generally well suited to powered devices such as smart grid and utility equipment, there is limited availability of 802.15.4g-enabled chipsets. Consequently, it will take some time for IEEE 802.15.4g to evolve and grow before it can be proven a viable option for the IoT.

The most common flavor of IEEE 802.15.4, operating at 2.4 GHz, provides limited range due to fundamental radio theory, as mentioned earlier, and range is further degraded by the environment. Moisture affects 2.4 GHz propagation significantly (microwave ovens operate around 2.4 GHz in part because water absorbs energy well at these frequencies), and any obstruction, such as a wall, door, or window, attenuates 2.4 GHz signals more than sub-GHz signals.
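
The frequency-dependent part of the range difference can be quantified with the free-space path loss formula; the 100 m distance below is just an example, and real indoor losses from walls and moisture come on top of this.

```python
# Free-space path loss: the frequency term alone costs 2.4 GHz roughly 9 dB
# relative to 868 MHz at the same distance.
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in km, frequency in MHz)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

d = 0.1  # 100 m
print(f"868 MHz @ {d * 1000:.0f} m: {fspl_db(d, 868):.1f} dB")
print(f"2.4 GHz @ {d * 1000:.0f} m: {fspl_db(d, 2400):.1f} dB")
print(f"delta: {fspl_db(d, 2400) - fspl_db(d, 868):.1f} dB")
```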

This may be worked around by using multi-hop communication via special relay devices. These relays cannot be regular battery-powered devices, since relaying implies continuous receiving, and the literature indicates that such a multi-hop approach increases overall power consumption. IEEE 802.15.4 is often claimed to form a mesh topology to compensate for its limited radio coverage and reliability. Yet in practice this is a hybrid topology, because only particular AC-powered relays can forward packets; resource-constrained end devices still see the network as a star topology.

As can be seen from some studies, a multi-hop / mesh topology could be considered a future trend. However, current single-radio approaches are not suitable for multi-hop and mesh operation: if relays and devices share the same medium for communication, a mesh topology is not an efficient solution, as multiple devices cannot communicate simultaneously.

Moreover, it has to be acknowledged that efficiently managing a large number of clients, ensuring their connectivity, and balancing the data flow in a star or tree topology network are already challenging enough without adding the unnecessary overhead of a multi-hop mesh solution.

Finally, IEEE 802.15.4 has not been designed to handle coexistence with other collocated IEEE 802.15.4 networks, or device mobility. These limitations will prove to be a real problem when the number of connected devices grows dramatically in future IoT applications. Simply imagine nearby apartments within the same building each installing a compliant IEEE 802.15.4 IoT network and connected objects: IEEE 802.15.4 is not able to handle this situation. Until a solution is found to coordinate with nearby IEEE 802.15.4 networks, IEEE 802.15.4 is not a viable option for the IoT. This holds true for the IEEE 802.15.4-based technologies ZigBee and 6LoWPAN, as well as for BLE and Z-Wave, which have no provision for this kind of scenario either.

Connectivity Options for the Internet of Things (IoT)

First, there was M2M (Machine-to-Machine) communication, a technology that was supposed to revolutionize our world but never really took off in a big way due to the cost of embedding cellular connectivity in end devices. Now M2M has been replaced by IoT (Internet of Things), a better-sounding term and, hopefully, one with smarter and cheaper options to connect random things to the Internet. Everyone has their own idea of what the Internet of Things (IoT) is, but one thing is certain: it will become increasingly important in our lives given the ever-decreasing cost of wearable devices, sensors, and other monitoring equipment.

It is important to separate over-optimistic hype from the actual reality of the technologies. Any software or hardware engineer who has ever developed anything useful will tell you that it is easy to define a use case but a lot harder to actually build a system.

Our aim is to describe the technology enablers for IoT, especially the communication technologies and protocols that will be used to build IoT applications.

Applications for IoT Abound

Despite the many vague use cases cited in the popular press (many of which seem drawn from science fiction rather than based on an understanding of the technology), IoT can be applied to three broad areas:

– Consumer Homes and Personal Networks

– Consumers in their Automotive Vehicles

– Industrial and Office Applications

Homes are an easy frontier for IoT. A typical American home contains home appliances, entertainment systems, and temperature and humidity control units. It is easy to see how a user would like to manage some or all of these devices from a wearable or handheld device. Most use cases are easy to enumerate using the general paradigm of controlling X using unit Y.

The automotive sector is already integrating all kinds of sensors into cars and creating enablers like Advanced Driver Assistance Systems (ADAS). ADAS can warn drivers of everything from low tire pressure to dangers ahead using a combination of communication and data-processing technologies.

Industrial applications for IoT are still in their infancy. Most existing M2M applications will move into the IoT category, whether it is data collection in the supply truck or on the manufacturing floor. The number and type of such applications are limited only by human imagination and the ability of engineers to create them.

Today, these segments already use wireless technologies and Internet interaction, but each typically focuses on what is common within its own industry. The chosen wireless solution needs to adequately address each industry's concerns regarding connectivity options, robust operation, and security features.

Communication Options

The Internet of Things (IoT) is built on an underlying multi-protocol communications framework that can easily move data between embedded “things” and systems located at higher levels of the IoT hierarchy. For designers and application developers, a diverse set of wireless and wired connectivity options provides the glue that holds IoT together.

All IoT sensors require some means of relaying data to the outside world. There’s a plethora of short-range, or local area, wireless technologies available, including: RFID, NFC, Wi-Fi, Bluetooth (including Bluetooth Low Energy), XBee, ZigBee, Z-Wave and Wireless M-Bus. There’s no shortage of wired links either, including Ethernet, HomePlug, HomePNA, HomeGrid/G.hn and LonWorks.

Selecting the best network option for an IoT device, however, requires a careful look at various factors for each situation.

  • The scale and size of the IoT network
  • Data throughput or transfer requirements
  • The eventual physical location of the embedded devices, battery size, physical size, etc.

Microcontrollers, which form the heart of most embedded or wearable devices, already have certain input/output peripherals integrated. Today, there is a wide choice of good, inexpensive, programmer-friendly devices with useful peripherals, low power consumption, and good cross-platform support. You can get cheap Arduino or Raspberry Pi boards for just under 10 dollars.

Just as with microcontrollers, designers do not lack options for wireless connectivity and the ICs able to support it. While ANT, Bluetooth®, WiFi, and ZigBee may number among the more familiar alternatives, viable wireless connectivity solutions have also coalesced around standards including 6LoWPAN, DASH7, EnOcean, Insteon, and Z-Wave, among many others. At the same time, smart designers can use proprietary RF approaches. However, for remote and highly mobile applications, cellular broadband with LTE or other wide-area wireless connectivity is the only option.

For Wired Devices, Ethernet Rules Supreme

The Internet of Things (IoT) implies connectivity, and developers have lots of wired and wireless options at their disposal to make it happen. Ethernet tends to dominate the wired realm. IoT frameworks map higher-level protocols onto this type of connectivity, but the devices don't work until they have a method of communicating with the network.

At this point, Ethernet implementations range from 10 Mb/s up to 100 Gb/s. Of course, the high end generally targets the backbone of the Internet to link server farms in the cloud, while the low to mid-range runs on the rest of the devices. The median implementation these days is 1-Gb/s Ethernet. A new class of Ethernet speeds looms on the horizon. Essentially, 1-Gb/s Ethernet is bumping up to 2.5 Gb/s with a corresponding hop up for higher-speed Ethernet like 10 Gb/s moving to 25 Gb/s. This change essentially provides faster throughput using the same cabling.

Other less common networking possibilities exist on both the wired and wireless sides and are worth mentioning. For example, the HomePlug Alliance's powerline networking uses power connections both to power the interface and as a transmission medium. A host of interoperable products includes devices such as wireless access points and bridges to Ethernet.

IoT Wireless Technology Selection

Here is where it really gets interesting. There are several proprietary wireless solutions used in every segment, as well as standards including 6LoWPAN, ANT+, Bluetooth, Bluetooth Low Energy, DECT, EDGE, GPRS, IrDA, LTE, NFC, RFID, Weightless, WLAN (also commonly referred to as Wi-Fi), ZigBee, Z-Wave, and others. We can briefly examine the merits of each.

Wi-Fi When You Need Big Bandwidth

Wi-Fi, with its array of 802.11 variants, provides the highest throughput of the wireless technologies at this point. The emerging 802.11ac standard operates in the 5-GHz band (typically alongside 2.4-GHz radios), with combined throughput figures of up to 5.3 Gb/s quoted for multi-band access points. Indoor range is on the order of 100 to 200 feet. The next evolution, 802.11ax, is poised to succeed 802.11ac.

A key challenge for IoT developers is power requirements. Wi-Fi requires far more power than some other technologies. Hence, the Wi-Fi option may have to be limited to devices such as mobile phones and tablets, or to devices where wired power can be delivered, like wall-mounted temperature-control sensors and security-system components.

Wi-Fi on tighter power budgets is possible, but requires additional techniques to preserve battery life. For example, a device can send a burst of data at predetermined intervals and then go into sleep mode, as sketched below.
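
A minimal sketch of that burst-then-sleep pattern, in generic Python for illustration; on a real Wi-Fi module you would also power the radio down between bursts using that module's own API, and the report interval and payload here are placeholders.

```python
# Duty-cycled reporting sketch: short bursts of traffic, long sleeps in between.
import time

REPORT_INTERVAL_S = 300            # assumed: one report every 5 minutes

def read_sensor() -> bytes:
    return b'{"temp": 21.5}'       # placeholder measurement

def send_burst(payload: bytes) -> None:
    # Stand-in for: connect to the access point, push the data, disconnect.
    print(f"connect -> send {len(payload)} bytes -> disconnect")

for _ in range(3):                 # a few cycles for demonstration
    send_burst(read_sensor())      # brief, infrequent radio activity
    time.sleep(REPORT_INTERVAL_S)  # radio (and ideally the MCU) sleeps here
```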

Bluetooth Classic and Bluetooth Low Energy (LE)

Bluetooth is a short-range technology utilizing the 2.4- to 2.485-GHz ISM (industrial, scientific, and medical) band. The Bluetooth Special Interest Group manages the technology, with the latest standard being Bluetooth 4.2.

Until smartphones came with media players, Bluetooth was on the verge of dying out, but since then it has come to be embedded in numerous devices. Bluetooth has “classic” and Low Energy (LE) versions; the 4.x standard allows either or both to be implemented. BT-classic and BT-Smart/LE are not backward-compatible and are very different technologies that share little beyond the name.

BT-LE is designed to allow devices to run and communicate for months or years using low-power sources like button-cell batteries or energy-harvesting devices. Classic and Smart Bluetooth maximum range is about 100 m (330 ft), with data rates up to 3 Mb/s and 1 Mb/s, respectively. However, as with most wireless technologies, actual application throughput is lower: about 2.1 Mb/s for Classic and 0.27 Mb/s for Smart.

A newer feature in BT-LE is Bluetooth beacons, which broadcast information such as device availability or coupons at set intervals. This can be very useful for IoT apps.

ZigBee – Sensor Networking with Scalable Mesh Routing

This is my favorite technology. You can get ZigBee modules cheaply and integrate them into almost any device. A module barely uses any battery, can run for a year on a single battery, and is good for sending periodic sensor data. It can be used for everything from embedded sensors and medical profiling to, naturally, home automation.

ZigBee is a wireless technology developed as an open global standard to address the unique needs of low-cost, low-power wireless M2M networks. The ZigBee standard operates on the IEEE 802.15.4 physical radio specification and operates in unlicensed bands including 2.4 GHz, 900 MHz and 868 MHz.

A key component of the ZigBee protocol is its ability to support mesh networking of up to roughly 65,000 nodes. In a mesh network, nodes are interconnected with other nodes so that multiple pathways connect each node. Connections between nodes are dynamically updated and optimized through a sophisticated, built-in mesh routing table.
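
To illustrate what such a routing table holds, the sketch below precomputes a next-hop entry for each destination over an assumed five-node topology; ZigBee itself builds these tables on demand with an AODV-derived route discovery, so this is only a conceptual illustration.

```python
# Next-hop routing table for a small mesh (topology is an assumed example).
from collections import deque

links = {                       # node -> neighbours
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C", "E"],
    "E": ["D"],
}

def routing_table(source):
    """For each reachable destination, record the first hop on a shortest path."""
    table, visited = {}, {source}
    queue = deque([(source, None)])
    while queue:
        node, first_hop = queue.popleft()
        for nbr in links[node]:
            if nbr not in visited:
                visited.add(nbr)
                hop = nbr if first_hop is None else first_hop
                table[nbr] = hop
                queue.append((nbr, hop))
    return table

print(routing_table("A"))   # e.g. {'B': 'B', 'C': 'C', 'D': 'B', 'E': 'B'}
```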

Other Low Energy Wireless Options – Z-Wave, 6LoWPAN, MiWi, ANT, etc.

Beyond ZigBee, there are other options: some proprietary, some developed by groups of vendors, and some coming through other standardization bodies. They either sit on top of the IEEE 802.15.4 physical radio specification or have their own proprietary radio layers.

Z-Wave, supported by the Z-Wave Alliance, is another technology competing with ZigBee for home-automation projects. Like ZigBee, it supports mesh networking, but its protocol is proprietary. ZigBee chipsets are produced by several silicon vendors, while Z-Wave chips come from only one manufacturer, Sigma Designs.

Z-Wave uses the Part 15 unlicensed ISM band. It operates at 908.42 MHz in the U.S. and Canada but uses other frequencies in other countries, depending on their regulations. Performance characteristics are similar to 802.15.4, including roughly 100 kb/s throughput and a 100 ft (30.5 m) range.

In addition, a number of vendor-specific protocols are built on 802.15.4, such as Microchip’s MiWi, which are often lighter weight and have fewer licensing restrictions.

6LoWPAN is a low-power wireless mesh networking standard in which every node has its own IPv6 address, allowing it to connect directly to the Internet using open standards. Since each node has its own IP address, standard IP routing protocols can be used.
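
The sketch below shows the standard way a 6LoWPAN node can derive its IPv6 interface identifier from its 64-bit IEEE 802.15.4 extended (EUI-64) address, per RFC 4944 / RFC 4291: flip the universal/local bit and append the result to the network prefix. The prefix and EUI-64 values are examples.

```python
# IPv6 address formation from an EUI-64 link-layer address (example values).
import ipaddress

def ipv6_from_eui64(prefix: str, eui64: bytes) -> ipaddress.IPv6Address:
    iid = bytearray(eui64)
    iid[0] ^= 0x02                       # invert the universal/local bit
    prefix_bytes = ipaddress.IPv6Network(prefix).network_address.packed[:8]
    return ipaddress.IPv6Address(prefix_bytes + bytes(iid))

eui64 = bytes.fromhex("00124b001234abcd")          # example extended address
print(ipv6_from_eui64("2001:db8::/64", eui64))     # 2001:db8::212:4b00:1234:abcd
```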

ANT is an open-access multicast wireless sensor network technology designed and marketed by ANT Wireless, now part of Garmin. It features a wireless communication protocol stack that enables semiconductor radios operating in the 2.4 GHz ISM band to communicate. ANT is characterized by low computational overhead, resulting in low power consumption by the radios supporting the protocol and enabling low-power wireless embedded devices that can operate on a single coin-cell battery for months to years.

In short, 6LoWPAN, ZigBee, Z-Wave, MiWi, and ANT are all competing for the same space.

Cellular Network Options Are Still Available

Most cellular IoT devices aim to use Long Term Evolution (LTE) 4G and 5G standards. Cellular technology has the advantage of coverage and availability over large areas. For devices mounted in moving trains, trucks, cars, or roadside emergency equipment, this may be the only viable option.

LTE and LTE-Advanced both provide excellent throughput. LTE delivers up to about 300 Mb/s, LTE-Advanced targets 1 Gb/s, and 5G promises 10 Gb/s.

The major problem is the recurring cost of cellular connectivity since cellular operation requires plans from service providers.

Device Selection Criteria for IoT Designers

IoT is about creating the most efficient, application-specific network of connected devices. Connected devices all share a few key requirements:

  • The need for smarter power consumption, data storage, and network management;
  • The need for stronger safeguards for privacy and security;
  • The need for high-performance microcontrollers (MCUs), sensors, and actuators; and
  • The ability to communicate without losing information.

To narrow down the list of options, compare the technologies against the following key IoT needs:

  • Cost efficiency. Most IoT devices are low-cost and need affordable radio solutions, so the balance between performance and cost is very important.
  • Small size. IoT devices are typically small; the radio technology, with its antenna, battery, etc., needs to fit physically in the housing of the device.
  • Secure communication. Security of communication is needed. Authentication and data encryption must be supported by the chosen wireless technology, and it should be possible to build end-to-end secure applications.
  • Low power consumption. Since most IoT devices operate on batteries or energy-harvesting technologies, the radio technology must have ultra-low power consumption.
  • Strong available ecosystem. For any device selection you will need to examine its ecosystem, since interoperability with other devices will be important.
  • High reliability under noisy conditions. IoT devices will operate in less-than-perfect conditions, so the wireless technology must be able to deal with signal noise, interference, and other environmental conditions.
  • Ease of use. It is possible to leave configuration to experts in industrial settings, but for consumers, plug-and-play simplicity is needed.
  • Radio range extension capability. Though IoT typically operates over short distances, it is important that the chosen technology offers enough range or has some range-extension capability.

Matching the Design to the Target Market

Despite the bewildering list of connectivity options, system designers can usually find the best option for a particular IoT device. A design is often constrained by application needs, performance requirements, and environmental limitations. The need for compatibility in established markets may also affect the best connectivity choice.

The good part is that if you are a hardware or embedded system designer, the choice of components is plentiful.

You can find a diverse set of related hardware solutions, including modules and ICs for ANT connectivity from vendors such as Nordic Semiconductor, Panasonic, and Texas Instruments; ZigBee solutions from Atmel, Freescale, and Microchip; Bluetooth/BLE solutions from CSR, RFM, and STMicroelectronics; and 6LoWPAN devices from TI, STMicroelectronics, Sensinode, Atmel, and others.

If you are designing IoT devices or want to create IoT software and need individual consulting, feel free to connect with me.