LoRaWAN vs. Sigfox vs. Weightless-P: Simulation Results in the “Real World”

In wireless communication, the Hata model for urban areas, also known as the Okumura–Hata model because it is a developed version of the Okumura model, is the most widely used radio-frequency propagation model for predicting the behaviour of cellular transmissions in built-up areas. The model incorporates the graphical information from the Okumura model and develops it further to account for the effects of diffraction, reflection, and scattering caused by city structures. It also has two more variants for transmission in suburban and open areas. (source: Wikipedia)

The Hata model simulation was conducted for Sigfox, LoRa, and Weightless-P with the base-station height set at 30 m and the end-device height set at 0.5 m. The simulation was run by Ubiik (the hardware developer behind Weightless-P), but we have checked their math and our team has confirmed the numbers are accurate and unbiased.
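
If you want to sanity-check the setup yourself, here is a minimal sketch of the Okumura–Hata urban formula with the article's antenna heights. Note that the 0.5 m end-device height is below the model's nominal 1–10 m mobile-height range, and the example frequencies and distance are our own illustrative picks:

```python
import math

def hata_urban_path_loss(f_mhz, h_base_m, h_mobile_m, d_km):
    """Okumura-Hata median path loss (dB) for urban areas.

    Nominal validity: f = 150-1500 MHz, base height 30-200 m,
    mobile height 1-10 m, distance 1-10 km; values outside these
    ranges (like the 0.5 m device height here) are extrapolations.
    """
    # Mobile-antenna correction factor for a small/medium city.
    a_hm = ((1.1 * math.log10(f_mhz) - 0.7) * h_mobile_m
            - (1.56 * math.log10(f_mhz) - 0.8))
    return (69.55 + 26.16 * math.log10(f_mhz)
            - 13.82 * math.log10(h_base_m) - a_hm
            + (44.9 - 6.55 * math.log10(h_base_m)) * math.log10(d_km))

# Article's setup: 30 m base station, 0.5 m end device, 5 km cell edge.
for f_mhz in (915, 868):  # mid-band US (902-928 MHz) and EU (863-870 MHz)
    loss = hata_urban_path_loss(f_mhz, 30.0, 0.5, 5.0)
    print(f"{f_mhz} MHz at 5 km: {loss:.1f} dB median path loss")
```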

Let’s first take a look at the U.S. results (902–928 MHz).

[Figure: US comparison]

 

[Figures: US simulation results]

Now let’s take a look at the results in Europe (863–870 MHz). The only difference is that LoRa is restricted to a smaller bandwidth.

[Figures: European comparison and simulation results]

 

Let’s see what these numbers mean for an actual smart metering deployment (click here).

(If you would like to contribute/make edits/suggestions please contact us at techgu.rooh@gmail.com)

Source: http://www.ubiik.com/lpwan-comparisons

IoT connectivity solutions: Media access control layer and network topology


Media access control layer and network topology

For IoT applications, the main characteristics of the media access control (MAC) layer that need to be considered are multiple access, synchronization, and network topology.

Multiple access. Looking back at decades of successful cellular system deployments, one can safely conclude that TDMA is a good fit for the IoT. TDMA is suited for low-power operation with a decent number of devices, as it allows for optimal scheduling of inactive periods. Hence, TDMA is selected for multiple access in the MAC layer.

Synchronization. In IoT applications, there are potentially a very large number of power-sensitive devices with moderate throughput requirements. In such a configuration, it is essential to maintain a reasonably consistent time base across the entire network and potentially across different networks. Given that throughput is not the most critical requirement, it is suitable to follow a beacon-enabled approach, with a flexible beacon period to accommodate different types of services.
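
As a toy illustration of how a beacon-enabled TDMA MAC ties these two choices together (the beacon period and slot duration below are invented for the example, not taken from any standard):

```python
from dataclasses import dataclass

# All numbers are invented for the example; real systems derive them
# from the standard's superframe parameters.
BEACON_PERIOD_MS = 1000   # one beacon per second
SLOT_MS = 10              # duration of one uplink slot

@dataclass
class Slot:
    device_id: int
    start_ms: int          # offset from the beacon

def build_schedule(device_ids):
    """Give each device one dedicated uplink slot per beacon period.

    Devices sleep the rest of the time and wake only for the beacon
    (to stay synchronized) and for their own slot (to transmit).
    """
    usable_slots = BEACON_PERIOD_MS // SLOT_MS - 1   # slot 0 carries the beacon
    if len(device_ids) > usable_slots:
        raise ValueError("beacon period too short for this many devices")
    return [Slot(dev, (i + 1) * SLOT_MS) for i, dev in enumerate(device_ids)]

for slot in build_schedule([101, 102, 103]):
    print(f"device {slot.device_id} transmits at beacon + {slot.start_ms} ms")
```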

Network topology. Mobile networks using a cellular topology have been efficiently serving a large number of devices with a high level of security and reliability, e.g., 5,000+ devices per base station for LTE in urban areas. This topology is based on a star in each cell, while the cells are connected in a hierarchical tree in the network backhaul. This approach is regarded as suitable for the IoT and is therefore selected.

The network layer and interface to applications

The network layer (NWK) and the interface to applications are less fundamental as far as power-efficiency and reliability are concerned, and there is more variation across IoT applications. Nevertheless, it is widely acknowledged that IoT applications need to support the Internet Protocol (IP), whether IPv4 or IPv6. In addition, the User Datagram Protocol (UDP) and the Constrained Application Protocol (CoAP) can provide the relevant trade-off between flexibility and implementation complexity on resource-constrained devices.
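
To give a feel for how light this stack can be on the device side, here is a minimal sketch that ships a sensor reading as a single UDP datagram; CoAP would add a thin request/response layer on top of exactly this kind of socket. The gateway address, the device name, and the use of JSON are placeholders for the example:

```python
import json
import socket

# Placeholder endpoint for the example; UDP port 5683 is the one CoAP
# conventionally uses.
GATEWAY = ("192.0.2.1", 5683)

reading = {"device": "sensor-42", "temp_c": 21.5}
payload = json.dumps(reading).encode("utf-8")

# One connectionless datagram: no handshake, no per-device session state.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(payload, GATEWAY)
sock.close()
```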

Furthermore, the IoT will represent an immense security challenge, and it is likely that state-of-the-art security features will become necessary. As of today, we can assume that 128-bit Advanced Encryption Standard (AES) encryption, with Diffie-Hellman (DH) or its Elliptic Curve (ECDH) variant for key agreement, will become the baseline for securing communication.
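
As a sketch of what that baseline looks like in practice, using the Python cryptography package (the curve choice, key-derivation context, and payload are illustrative; real protocols pin these down precisely):

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each side generates an ephemeral key pair; the curve is an illustrative choice.
device_key = ec.generate_private_key(ec.SECP256R1())
gateway_key = ec.generate_private_key(ec.SECP256R1())

# ECDH: both sides compute the same shared secret from the peer's public key.
shared_secret = device_key.exchange(ec.ECDH(), gateway_key.public_key())

# Derive a 128-bit AES key from the shared secret (context string is made up).
aes_key = HKDF(algorithm=hashes.SHA256(), length=16, salt=None,
               info=b"iot-session-demo").derive(shared_secret)

# AES-128 in GCM mode encrypts and authenticates the payload.
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, b"meter reading: 42 kWh", None)
print(f"{len(ciphertext)} bytes on air (payload + 16-byte auth tag)")
```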

Internet of Things wireless connectivity option analysis: Z-Wave Pros and Cons


As another asynchronous wireless networking protocol, Z-Wave is designed for home automation and remote control applications. Z-Wave originated from the Danish startup Zensys and was acquired by Sigma Designs in 2008; the Z-Wave Alliance was formed in 2005. Unlike most of the competing technologies discussed so far, Z-Wave operates in the sub-GHz bands: 868.42 MHz in Europe, 908.42 MHz in the US, 916 MHz in Israel, 919.82 MHz in Hong Kong, and 921.42 MHz in Australia and New Zealand. The use of sub-GHz bands brings improved range and reliability and less interference to the Z-Wave network. Nevertheless, there are a few issues worth mentioning when applying Z-Wave to the IoT.

Z-Wave offers limited data rates and mediocre spectrum efficiency because of its Manchester-coded GFSK (Manchester coding dates from 1948), which doubles the occupied spectrum for limited coding gain. Originally offering a low data rate of 9.6 kbps, Z-Wave has been upgraded to 100 kbps in the latest version. A Z-Wave network is limited to 232 nodes, yet manufacturers recommend no more than 30 to 50 nodes in practical deployments. Moreover, Z-Wave relies on relays, such as wall-mounted light switches, to forward packets when devices are out of range.

Z-Wave uses a Source Routing Algorithm (SRA), meaning that the message initiator has to embed the routing information into the packet. This implies overhead, as the route occupies space meant for the actual data payload (see the sketch after the list below). More importantly, it means that the initiator needs to be aware of the network topology, which therefore has to be maintained and distributed to the nodes that may initiate messages. This is a complex task, typically not manageable by an end device constrained in computing power, code size, battery capacity, and cost. Z-Wave defines different device types with different capabilities and protocol stack sizes:

  • Controllers: have the full (and largest) protocol stack, as they can initiate messages. The master controller, the Static Update Controller (SUC), maintains the network topology and handles network management.
  • Mobile controllers: can support neighbor-rediscovery requests from moving nodes by implementing the portable controller protocol stack.
  • Routing slaves: depend on the SUC for network topology and can initiate messages to a restricted set of nodes.
  • Slaves: have the smallest protocol stack, can only reply to requests, and cannot initiate messages.
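
To make the routing overhead concrete, here is a toy frame-budget calculation; the frame size and field widths are invented for illustration and are not Z-Wave's actual frame format:

```python
# All sizes are invented for illustration; this is not Z-Wave's actual
# frame format.
MAX_FRAME_BYTES = 64      # total radio frame budget
HEADER_BYTES = 10         # fixed header (addresses, control, checksum)
HOP_ADDR_BYTES = 1        # one byte per hop in the embedded route

def payload_budget(route):
    """Bytes left for application data once the route is embedded."""
    return MAX_FRAME_BYTES - HEADER_BYTES - HOP_ADDR_BYTES * len(route)

print("direct:", payload_budget([]), "bytes of payload")
print("via 3 relays:", payload_budget([12, 7, 30]), "bytes of payload")
# Every extra hop shrinks the payload, and the initiator must already
# know the topology in order to build the route at all.
```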

When using multiple controllers in the same network, only the master (SUC) can be used for network maintenance. Whenever a Z-Wave device is added to or removed from the network, the network topology of the master controller has to be replicated manually to the secondary controllers. This process makes network maintenance cumbersome.

The Source Routing Algorithm, along with the network topology management, also makes it very difficult to handle mobility. There is some support for nodes to request neighbor rediscovery; however, this is a complicated and power-consuming process. Taken together, this falls well short of seamless support for mobility. In addition, Z-Wave has security flaws, as can be seen from reports of successful attacks on Z-Wave devices.

Overall, Z-Wave has been quite successful thanks to the trade-offs it provides. Z-Wave is a lot simpler than ZigBee, yet it provides a sufficient set of basic functions for simple deployments in homes or small commercial spaces. Z-Wave has won a good market share in smart homes and smart buildings, proving the benefits of sub-GHz communication. Nevertheless, the limitations outlined above prevent it from becoming a future-proof technology for upcoming IoT applications.

Internet of Things Connectivity Option Analysis: IEEE 802.15.4 technologies


Originally released in 2003, IEEE 802.15.4 defines a physical layer (PHY) and a media access control layer (MAC) on top of which others can build different network and application layers; the most well-known are ZigBee and 6LoWPAN. IEEE 802.15.4 defines operation in the 2.4 GHz band using DSSS to alleviate narrowband interference, realizing a data rate of 250 kbps at a chip rate of 2 Mchip/s after spreading. IEEE 802.15.4 also defines operation in sub-GHz bands, but has failed to take full advantage of these frequency bands: the specification only defines very low GFSK data rates, 20 kbps and 40 kbps, and allows only a single channel in the European 868 MHz band (868.0–868.6 MHz). These restrictions make the 2.4 GHz variants of IEEE 802.15.4 more attractive, accounting for their wider adoption to date.
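
The relationship between the 250 kbps data rate and the 2 Mchip/s chip rate falls straight out of the spreading scheme: at 2.4 GHz, each group of 4 data bits selects one of sixteen 32-chip pseudo-noise sequences:

```python
# IEEE 802.15.4 O-QPSK PHY at 2.4 GHz: each 4-bit symbol is spread
# over a 32-chip pseudo-noise sequence.
DATA_RATE_BPS = 250_000
BITS_PER_SYMBOL = 4
CHIPS_PER_SYMBOL = 32

symbol_rate = DATA_RATE_BPS / BITS_PER_SYMBOL      # 62,500 symbols/s
chip_rate = symbol_rate * CHIPS_PER_SYMBOL         # 2,000,000 chips/s
print(f"{symbol_rate:,.0f} symbols/s -> {chip_rate / 1e6:.0f} Mchip/s on air")
```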

The IEEE 802.15.4g amendment, entitled “Amendment 3: Physical Layer (PHY) Specifications for Low-Data-Rate, Wireless, Smart Metering Utility Networks”, was approved in March 2012. IEEE 802.15.4g improves on the low data rates by enabling the use of more sub-GHz frequency bands, e.g., 169.4–169.475 MHz and 863–870 MHz in Europe, 450–470 MHz in the US, 470–510 MHz and 779–787 MHz in China, and 917–923.5 MHz in Korea. In addition, IEEE 802.15.4g introduces Multi-Rate FSK (MR-FSK), OQPSK, and OFDM physical layers, all applicable to these sub-GHz bands. Nevertheless, MR-FSK can still only achieve 200 kbps in the European 863–870 MHz band, using filtered 4FSK modulation; higher data rates require MR-OFDM, which may prove inappropriately complex for low-cost and low-power devices. These new physical layers also bring additional complexity, from the support of more advanced Forward Error Correction (FEC) schemes and from backward-compatibility hassle, since support for the previous FSK and OQPSK physical layers is mandated. Despite sensible technical choices that are generally well suited to powered devices, such as smart-grid and utility equipment, there is limited availability of 802.15.4g-enabled chipsets. Consequently, it will take some time for IEEE 802.15.4g to evolve and grow before it can prove itself a viable option for the IoT.

The most common flavor of IEEE 802.15.4, operating at 2.4 GHz, provides limited range due to fundamental radio theory, as mentioned earlier, and is further degraded by the environment. Moisture affects 2.4 GHz propagation significantly (microwave ovens operate at 2.4 GHz precisely because water absorbs that frequency well), and any obstruction, such as a wall, door, or window, attenuates 2.4 GHz signals more than sub-GHz signals.
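
The “fundamental radio theory” here is free-space path loss, which grows with frequency. A quick comparison of 2.4 GHz against the 868 MHz sub-GHz band:

```python
import math

def fspl_db(d_km, f_mhz):
    """Free-space path loss in dB for distance d_km and frequency f_mhz."""
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

# Same distance, two bands: the delta is distance-independent.
delta = fspl_db(1.0, 2440) - fspl_db(1.0, 868)
print(f"2.4 GHz suffers ~{delta:.1f} dB more free-space loss than 868 MHz")
# ~9 dB, i.e. roughly 8x less received power, before walls and moisture
# make things even worse for 2.4 GHz.
```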

This may be worked around by using multi-hop communication via special relay devices. These relays cannot be regular battery-powered devices, since relaying implies continuous receiving, and the literature indicates that such a multi-hop approach increases overall power consumption. IEEE 802.15.4 is often claimed to offer a mesh topology to compensate for its limited radio coverage and reliability. Yet in practice this is a hybrid topology, because only certain AC-powered relays can forward packets; resource-constrained end devices still see the network as a star topology.

As some studies suggest, multi-hop / mesh topologies may be a future trend. However, the current single-radio approaches are not suitable for multi-hop and mesh: if relays and end devices share the same medium, a mesh topology is not an efficient solution, as multiple devices cannot communicate simultaneously.

Moreover, it has to be acknowledged that efficiently managing a large number of clients, ensuring their connectivity, and balancing the data flow in a star or tree topology are already challenging enough without adding the unnecessary overhead of a multi-hop mesh solution.

Finally, IEEE 802.15.4 has not been designed to handle coexistence with other collocated IEEE 802.15.4 networks, or device mobility. These limitations will prove to be a real problem as the number of connected devices grows dramatically in future IoT applications. Simply imagine neighboring apartments in the same building each installing a compliant IEEE 802.15.4 network and connected objects: IEEE 802.15.4 cannot handle this situation. Until a solution is found to coordinate with nearby IEEE 802.15.4 networks, IEEE 802.15.4 is not a viable option for the IoT. The same holds for the IEEE 802.15.4-based technologies ZigBee and 6LoWPAN, as well as for BLE and Z-Wave, which have no provision for this kind of scenario either.

Internet of Things Wireless Connectivity Option Analysis: Pros and Cons of Bluetooth Classic, Bluetooth Low Energy, and CSRmesh


An analysis of the major Bluetooth technologies, including Bluetooth Classic, Bluetooth Low Energy, and CSRmesh, as solutions for the last 100 m of IoT connectivity.

Bluetooth Classic

Bluetooth Classic, also standardized as IEEE 802.15.1 in 2002 and revised in 2005 (although this standard is no longer maintained), was invented in 1994 as a replacement for RS-232. Bluetooth Classic operates in the 2.4 GHz band and is limited to eight devices per piconet. For the following reasons, Bluetooth Classic is not a suitable protocol for IoT applications:

  • Bluetooth Classic was designed to provide low-latency wireless peripherals and has evolved to provide high data rates. This is achieved at the expense of power consumption.
  • The physical layer (PHY) of Bluetooth Classic only supports long packets (up to 2,745 bits of payload) with mandatory channel encoding. This enables higher throughput, but is not suitable for resource-constrained devices.
  • The protocol stack of Bluetooth Classic has grown in complexity and typically occupies around 128 kB of code, which is not satisfactory for IoT embedded devices.
  • Bluetooth Classic’s loose specification of the modulation index range makes it hard to improve receiver performance in the future. Consequently, Bluetooth Classic has poor coverage, typically less than 10 m.
  • With a 3-bit address space per piconet, Bluetooth Classic is limited to a maximum of 8 connected devices (one master and seven active slaves), which is obviously insufficient for IoT applications.

Bluetooth Low Energy (BLE)

BLE, also known as Bluetooth 4.0 or Bluetooth Smart, originated from Nokia’s Wibree. Contrary to common belief, BLE is not compatible with Bluetooth Classic, since the physical layer (PHY) has been redesigned. BLE uses a fixed data rate of 1 Mbps with GFSK modulation, uses short packets, and is suitable for low-latency proximity communication. Unfortunately, BLE has the following issues that make it less suitable for IoT applications:

  • BLE operates in the crowded 2.4 GHz frequency band, along with Bluetooth Classic, Wi-Fi, ZigBee, and IEEE 802.15.4. This spectrum crowding will pose a severe reliability challenge to all 2.4 GHz devices, and the problem will only get worse as the number of connected objects increases.
  • BLE is optimized for low-latency sporadic transmissions, and its efficiency therefore degrades dramatically for larger data transfers. With its maximum of 20 bytes of application payload per packet, the gross 1 Mbps data rate of BLE translates into a theoretical maximum transfer rate of about 250 kbps, and in practice the actual transfer rate drops below 100 kbps (a back-of-the-envelope derivation follows this list). Compare this with Bluetooth Classic: v1.2 achieves 700 kbps and v2.1 + EDR reaches 2 Mbps of actual transfer rate. An actual transfer rate of only a tenth of the gross data rate is rather lackluster and translates into poor power-efficiency for this type of traffic. Although many IoT applications only have small amounts of data to transfer, e.g., switching off or changing the color of a light bulb, others require somewhat larger transfers. As a result, BLE is not suitable for IoT applications that require higher data transfers.
  • BLE has limited range, so extending the network requires a hybrid topology in which some client nodes act as server nodes for other star networks. In Bluetooth-specific terminology, this is called a scatternet, which yields high network complexity in real deployments. Moreover, BLE is essentially asynchronous, so this hybrid topology (a mix of star and mesh) causes increased interference and increased power consumption, even inside a single network.
  • Finally, BLE suffers from interference from USB 3.0 and poses a challenge when operating alongside collocated LTE or WiMAX networks. This is reflected in the Bluetooth SIG’s filtering recommendations, although workarounds have been developed as well.
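
For the curious, here is a back-of-the-envelope reconstruction of why 1 Mbps on air shrinks to roughly 250 kbps of application data at best; the framing-byte counts are approximations and exact stack behaviour varies:

```python
# Approximate BLE 4.0 link-layer timing at 1 Mbps (1 bit = 1 us).
# Byte counts are approximations; exact stack behaviour varies.
DATA_PKT_BYTES = 1 + 4 + 2 + 4 + 3 + 20 + 3  # preamble, access address,
                                             # LL header, L2CAP, ATT,
                                             # 20-byte payload, CRC
EMPTY_ACK_BYTES = 1 + 4 + 2 + 3              # empty link-layer packet
T_IFS_US = 150                               # inter-frame space, per spec

data_us = DATA_PKT_BYTES * 8                 # airtime of the data packet
ack_us = EMPTY_ACK_BYTES * 8                 # airtime of the acknowledgement
round_trip_us = data_us + T_IFS_US + ack_us + T_IFS_US

app_bits = 20 * 8
throughput_kbps = app_bits / round_trip_us * 1e3
print(f"~{throughput_kbps:.0f} kbps of application data, of 1000 kbps on air")
```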

CSRmesh

In February 2014, CSR plc, formerly Cambridge Silicon Radio, announced the availability of its proprietary CSRmesh software. CSRmesh operates over Bluetooth Low Energy (BLE) with the aim of enabling a mesh topology on top of the restrictive BLE scatternet topology and providing direct communication between BLE devices. However, we want to note the following:

  • The main advantage of CSRmesh is to allow smartphone connectivity. It is still questionable whether this connectivity should be achieved via a direct connection to any device, or more simply via a gateway or routers, e.g., Wi-Fi or BLE-enabled routers, or even through cellular if a device is out of range.
  • Turning BLE into a mesh-capable protocol is not straightforward. Even if BLE in itself is power-efficient for low-duty-cycle traffic and small data packets, enabling the mesh functionality requires each device to simultaneously be an observer and a broadcaster. This implies that each device continuously listens for advertising packets and then switches to advertising the received data for some period.
  • The inefficient use of radio resources inherent to continuous receive makes it difficult to achieve ultra-low power consumption in resource-constrained devices. As reported on the CSR forums, idle-state current consumption has been around 3 mA, which is 100x more than one would expect for a battery-powered IoT device (see the battery-life arithmetic after this list). In short, the asynchronous nature of BLE, optimized for low-duty-cycle / sporadic transmission, poses a challenge for implementing a power-efficient mesh topology on top of the existing BLE protocol stack.
  • Allowing direct smartphone connection to every device may not provide additional functionality. On the contrary, as discussed above, it will drain the device’s battery. It is also a potential security threat, because there is no gateway with sufficient computing power to filter access and enforce strong authentication.
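
To put that 3 mA figure in perspective, a quick battery-life estimate; the cell capacity and the 30 µA duty-cycled target are illustrative assumptions, not measured values:

```python
# Cell capacity and the duty-cycled target are illustrative assumptions.
BATTERY_MAH = 225        # typical CR2032 coin-cell rating
IDLE_MA = 3.0            # idle draw reported on the CSR forums
DUTY_CYCLED_MA = 0.03    # 30 uA, a ballpark for a well-designed sleepy node

hours_at_idle = BATTERY_MAH / IDLE_MA
hours_duty_cycled = BATTERY_MAH / DUTY_CYCLED_MA
print(f"at 3 mA: {hours_at_idle:.0f} h ({hours_at_idle / 24:.0f} days)")
print(f"at 30 uA: {hours_duty_cycled / 24:.0f} days "
      f"({hours_duty_cycled / 24 / 365:.1f} years)")
```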

questions / comments? fire away!