How Low Power is NB-IoT?


NB-IoT is a licensed low power wide area network (LPWAN) technology supported by your local telecom operator. That means each device requires a SIM card and a monthly or annual payment to your operator, just like your cell phone. The benefit is that you don’t need to manage the infrastructure: you do not need to install your own base stations.

The key advantage of NB-IoT is that the protocol is synchronous and designed to optimize the spectral usage and throughput of the network. This optimization for spectral utilization comes at the cost of reduced battery life and recurring (monthly or yearly) fees.

Unlicensed LPWANs, such as LoRaWAN® and Weightless™, are optimal for longer battery lifetime. If sensor data is small and infrequent (once or twice a day), LoRaWAN, being an asynchronous protocol, could be the optimal choice. If data is larger and transmissions must be acknowledged, then Weightless stands alone as the synchronous-protocol option for private networks.

The choice of an asynchronous versus a synchronous protocol has a significant impact on the battery lifetime of sensors.

Semtech conducted a comparison using the T-Mobile NB-IoT network available in the U.S. The NB-IoT sensor consistently took more than 20 seconds of active time to negotiate a slot to communicate an 11-byte packet, with an average current consumption of 40 mA over that period. In comparison, sending the same 11-byte packet over LoRaWAN required an active time of only 1.6 seconds, with an average active current of 6.4 mA. This translates into a greater than 50-fold advantage in battery lifetime for LoRaWAN.

Take an example application: a wireless, battery-powered pushbutton. A LoRaWAN-enabled pushbutton and an NB-IoT pushbutton were each equipped with a 600 mAh battery. The LoRaWAN device could support roughly 70,000 button presses on a single battery, while the NB-IoT button could handle only about 2,000. The difference is quite drastic.
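The active-time figures above can be turned into a back-of-envelope charge budget. This is only a sketch using the measured numbers from the Semtech comparison; it ignores sleep current and regulator losses, which is why real-world press counts come out lower than the raw division would suggest.

```python
# Charge per 11-byte message, from the measured figures above.
nbiot_charge_mas = 20 * 40.0   # 20 s active at 40 mA   -> 800 mA*s per message
lora_charge_mas = 1.6 * 6.4    # 1.6 s active at 6.4 mA -> 10.24 mA*s per message

ratio = nbiot_charge_mas / lora_charge_mas   # ~78x, consistent with ">50x"

battery_mas = 600 * 3600.0     # 600 mAh battery expressed in mA*s
# Upper bound on NB-IoT button presses, ignoring sleep current:
nbiot_presses_upper = battery_mas / nbiot_charge_mas   # 2,700
```

The ideal-case bound of ~2,700 NB-IoT presses is in the same ballpark as the article's observed ~2,000, with the gap attributable to sleep-mode and overhead losses not captured here.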

When choosing a LPWAN technology, be sure to thoroughly review the application requirements. One size does not fit all.



IoT connectivity solutions: Media access control layer and network topology



For IoT applications, the main characteristics of the media access control (MAC) layer that need to be considered are multiple access, synchronization, and network topology.

Multiple Access. Looking back at decades of successful cellular system deployment, one can safely conclude that TDMA is a good fit for the IoT. TDMA is suited for low-power operation with a decent number of devices, as it allows for optimal scheduling of inactive periods. Hence, TDMA is selected for multiple access in the MAC layer.

Synchronization. In IoT applications, there are potentially a very large number of power-sensitive devices with moderate throughput requirements. In such a configuration, it is essential to maintain a reasonably consistent time base across the entire network and potentially across different networks. Given that throughput is not the most critical requirement, it is suitable to follow a beacon-enabled approach, with a flexible beacon period to accommodate different types of services.

Network topology. Mobile networks using a cellular topology have efficiently serviced a large number of devices with a high level of security and reliability, e.g., 5,000+ devices per base station for LTE in urban areas. This topology is based on a star topology in each cell, while the cells are connected in a hierarchical tree in the network backhaul. This approach is regarded as suitable for the IoT and is therefore selected.
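To see why a beacon-enabled TDMA star network favors low power, consider the radio duty cycle of an end device: it only needs to wake for the beacon and for its own slot. The beacon period, slot length, and receive window below are illustrative assumptions, not figures from any specific standard.

```python
# Duty cycle of an end device in a beacon-enabled TDMA star network.
beacon_period_ms = 1000.0  # flexible beacon period (assumed)
slot_ms = 10.0             # device's assigned TDMA slot (assumed)
beacon_rx_ms = 2.0         # time awake to receive the beacon for sync (assumed)

duty_cycle = (slot_ms + beacon_rx_ms) / beacon_period_ms
print(f"radio duty cycle: {duty_cycle:.1%}")   # 1.2%
```

With the radio off 98.8% of the time, average current is dominated by sleep current rather than active receive/transmit, which is exactly the property TDMA's scheduled inactive periods provide.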

The network layer and interface to applications

The network layer (NWK) and the interface to applications are less fundamental as far as power-efficiency and reliability are concerned. In addition, there is more variation across IoT applications. Nevertheless, it is widely acknowledged that IoT applications need to support the Internet Protocol (IP), whether IPv4 or IPv6. In addition, the User Datagram Protocol (UDP) and the Constrained Application Protocol (CoAP) could provide the relevant trade-off between flexibility and implementation complexity on resource-constrained devices.

Furthermore, the IoT will represent an immense security challenge, and it is likely that state-of-the-art security features will become necessary. As of today, we can assume 128-bit Advanced Encryption Standard (AES) for encryption, with Diffie-Hellman (DH), or its Elliptic Curve (ECDH) variant, as the baseline for securing communication.
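The principle behind DH key agreement can be shown with a toy example over a small prime. This is illustration only: real deployments use ECDH over standardized curves with keys hundreds of bits long, and the derived secret then keys a cipher such as AES-128.

```python
# Toy Diffie-Hellman exchange -- never use small primes in practice.
p, g = 23, 5                 # public prime modulus and generator
a, b = 6, 15                 # private keys of the two devices

A = pow(g, a, p)             # first device transmits A over the air
B = pow(g, b, p)             # second device transmits B over the air

shared_1 = pow(B, a, p)      # each side combines its own private key
shared_2 = pow(A, b, p)      # with the other's public value
assert shared_1 == shared_2  # both derive the same secret without ever sending it
```

An eavesdropper sees only p, g, A, and B; recovering the shared secret requires solving the discrete logarithm problem, which is infeasible at real-world key sizes.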

Best Electronic Shelf Label Companies

Ranking The Best Electronic Shelf Label Solution Providers.

One IoT case that fascinates me is the smart retail sector, specifically Electronic Shelf Labels (ESL). The solution replaces traditional paper price tags with connected digital price tags. Store owners can change prices instantaneously, opening up a myriad of opportunities: increasing store efficiency, enhancing the customer experience, optimizing inventory, and boosting revenue. Thousands of connected nodes, bi-directional communication, extremely low battery consumption, speedy transmission to the cloud – this case oozes with great IoT flavors, and it is not just a concept; it is live NOW in stores worldwide.

I will rank the current ESL vendors based on their overall hardware solution, wireless connectivity solution, and the demonstrations provided at the NRF Big Show and EuroCIS, two of the biggest retail shows. Every main ESL player had a booth, and as a die-hard geek, I took the time to do an in-depth evaluation of each.

Worth noting: I learned the E-paper displays were all identical, as there is only one worldwide vendor of the technology – E Ink. What actually makes an ESL solution work is the wireless connectivity and hardware simplicity.

Here is my ranking of the best Electronic Shelf Label Companies focusing on retail.

#1 M²Communication

Headquarters: France

M²Communication is, as they said, “the new kid on the block,” but there is a reason this new player has emerged with significant traction.

This company is composed of radio-frequency chipset makers. As mentioned, the wireless communication aspect of ESL is what makes the whole solution “work”. They developed their own sub-GHz wireless communication protocol from scratch, and it can do a lot more than just ESL (I took a look at their whitepaper). The salesmen at their booth are clearly engineers wearing suits and ties, which was quite refreshing compared to the car salesmen at all the other booths. They were very honest and transparent about their business status. The big selling point – their demo. All the other booths had some pretty awful demos. M²Communication‘s actually worked. They had about 100 price tags on display on a wall. They let me use their web-based interface to change all the prices to my satisfaction. They probably regretted letting me take the reins, because I spent about 30 minutes on their laptop not only changing prices but also changing images and small product details. To my delight, a couple of seconds after I pressed the “update” button, all the tags began flashing one by one, showing the new price and content. True two-way communication, as each of the tags relayed its battery life and signal strength back to the computer.

HUGE differentiation – hardware simplicity. Their solution is plug and play. Their access point, responsible for communication from the store’s system to the tags, is the size of a computer mouse. No professional installation required.

Definitely the most technically sound solution on the market right now. Let’s just see how strong their sales and marketing team is as they try to push this past the giants.

#2 Displaydata

Headquarters: United Kingdom

Displaydata had a bunch of car salesmen at their booth who I felt were reading from a slide deck when I asked them technical questions. One guy went as far as telling me their display resolution was the best in the industry. I had to break the news to him that there is only one worldwide e-paper display vendor, so everyone achieves identical DPI (dots per inch). (He still insisted their displays are superior.)

It took me a few tries to get to the booth’s “technical guy”. Their communication is similar to M²Communication‘s: a proprietary sub-GHz protocol. They did not design it themselves; they outsourced that work to another company, which they declined to name.

I think connectivity in the sub-GHz range is the way to go. It avoids crowded frequencies such as 2.4 GHz, which is congested by Wi-Fi, Bluetooth, etc. Anyway, the reason I have them ranked #2 is their bulky, expensive hardware and their demo. Their “dynamic communicator”, responsible for transmitting and receiving data from the tags, was fairly large and needs professional installation. I was verbally quoted $650-750 USD per “dynamic communicator”, and larger supermarkets would need up to 10 of these giants per installation. As for their demo, it actually failed the first time. And with me, you only get one first impression. It did eventually start working, achieving roughly the same update speed as M²Communication, but they only used 2 tags for their demo 😦

This company seems to have a lot of manpower and is touting some impressive deployments in the supermarket industry. Good things are coming for this company.

#3 SES / Imagotag

Headquarters: Austria

SES is the oldest and largest ESL vendor. Their original wireless communication solution uses a SUPER DUPER low RF frequency: 36 kHz! The transmit speed is correspondingly SUPER DUPER slow. This is the same class of technology used by submarines to communicate in the depths of our oceans.
To support this frequency you need a long antenna. By long I mean 1 km long. Some SES installations wrap a 1 km antenna around and around in the customer’s ceiling. Their communication is only one-way. And the crazier thing is… they are currently the market leader. This is only because they got a head start in this market: they started in 1992. They recently acquired Imagotag, which is another way of saying “our solution is completely outdated”. Imagotag instantly gets bumped down for using the 2.4 GHz frequency as a solution. They say they use channels unoccupied by Wi-Fi and Bluetooth; I believe they said they are using channels 2, 3, 4, and 6 in the 2.4 GHz band. But we all know that Wi-Fi is not strictly bound to particular channels. There is going to be significant interference in my opinion, and from a physics point of view the range is not going to be as good as a sub-GHz solution.


#4 Pricer

Headquarters: Sweden

Pricer uses infrared technology to communicate with their tags. They have a tricky installation in the ceiling of their deployments. The hardware looks hideous and is quite distracting if the retailer’s ceiling is low. The infrared communication is not reliable: if a customer happens to be standing in front of the tag during the update, it will not be successful. The good thing about their solution is that the update speed should be quite fast. Range in a setting that is completely unoccupied, with all the lights off, should be pretty good.

A huge problem is the security of infrared. It can be easily hacked, as demonstrated by a viral video on YouTube which shows how you can use a Game Boy to change the prices on an infrared ESL. Yikes.

ESL for Industrial Sector

ESL is being rapidly adopted in the industrial sector, replacing the 40-year-old process of manually placing paper labels on the literally millions of containers, carts, and sub-assemblies flowing through factories every day with simple, cost-effective wireless displays.

Industrial ESLs provide the reliability and visual instruction inherent to paper labels, along with automated tracking.


1. Ubiik

Headquarters: Japan

The key to adoption in the Industrial space is working with existing wireless infrastructure. Ubiik has managed to make ESL compatible with all off-the-shelf UHF RFID readers. The high adoption rate of this product in factories all over Asia places Ubiik at the forefront of ESL for the industrial sector.

Ubiik also has E-Paper that can be updated via NFC (Android smartphones or any off-the-shelf NFC reader), as well as SUPER long range ePaper with an over 1 km update range.


2. Omni ID

Headquarters: Rochester, NY

In 2012, Omni-ID launched ProVIEW, the world’s first visual tagging system, to replace paper-driven processes in manufacturing, providing not only the ability to track assets but also dynamic, readable instructions right on the tag, completely changing the auto-identification industry landscape. ProVIEW markets itself as RFID-compatible E-Paper, but after taking a deep dive, we realised that Omni-ID actually uses a proprietary protocol to transmit to the ProVIEW tag. Therefore, factories will need to install Omni-ID’s proprietary hardware/base station to update the displays, much like the ESLs in the retail space.

3. Mpicosys

Headquarters: New York, NY

MpicoSys offers a variety of customised E-Paper signage. MpicoSys has developed the PicoSign displays and builds special-purpose devices, addressing virtually any requirement one can have for the use of ePaper displays. One of the best examples is the PicoSign Wall at the United Nations headquarters in New York.


4 Main ‘Must Haves’ for the Physical Layer of Internet of Things Wireless Connectivity


Analysis of the physical layer of wireless communication solutions for IoT application.

For IoT applications, the main characteristics of the physical layer that need to be considered are modulation, data rate, transmission mode, and channel encoding.

Modulation. The nature of IoT applications, many of which involve infrequent data transmission and require low-cost, low-complexity devices, precludes the use of high-order modulation or advanced channel coding like trellis-coded modulation. Unless mandatory, due to a harsh radio environment with narrowband interferers or regulatory constraints, spread spectrum, e.g., Direct Sequence Spread Spectrum (DSSS), is to be avoided, as it increases the channel bandwidth, requiring a more costly and power-consuming RF front-end, with no data rate improvement. Allowing non-coherent demodulation relaxes the constraint on device complexity, so (Gaussian) Frequency Shift Keying ((G)FSK) is a proven and suitable choice, as in Bluetooth radio. The most sensible choice, when available, would be Gaussian Minimum Shift Keying (GMSK), as its modulation index of ½ allows for lower complexity, or better sensitivity at a given complexity. When the available bandwidth is restricted, GFSK with a lower modulation index is still appropriate, with the next best being 1/3, as it still allows for near-optimal demodulation at reasonable complexity.
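The link between modulation index and occupied bandwidth is straightforward: for binary FSK, h = 2·f_dev / R_s, so the peak frequency deviation follows directly from the symbol rate. A small sketch, with an assumed 500 kbps symbol rate for illustration:

```python
# Binary FSK: modulation index h = 2 * f_dev / symbol_rate
def fsk_deviation_hz(mod_index: float, symbol_rate_hz: float) -> float:
    """Peak frequency deviation implied by a given modulation index."""
    return mod_index * symbol_rate_hz / 2

dev_gmsk = fsk_deviation_hz(0.5, 500e3)      # h = 1/2 (GMSK): 125 kHz
dev_narrow = fsk_deviation_hz(1 / 3, 500e3)  # h = 1/3: ~83.3 kHz
```

The lower index buys a narrower occupied channel at the same symbol rate, which is exactly the trade-off described above for bandwidth-restricted operation.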

Data rate. IoT applications mix very low data rate requirements, e.g., a sensor or an actuator with limited data size in either uplink or downlink, with more demanding requirements, e.g., a 6-inch 3-color ePaper display in a home that updates the daily weather forecast or the shopping list, easily amounting to more than 196 kB of data. Yet even for small data amounts, a carefully chosen higher data rate actually improves power consumption, thanks to shorter transmission time and a reduced probability of collision. Similar reasoning applies to Bluetooth Low Energy, a.k.a. BLE or Bluetooth Smart, formerly Nokia’s WiBree, which uses a 1 Mbps gross data rate despite much lower actual throughput. BLE is aimed at proximity communication, and its high gross data rate of 1 Mbps sacrifices range considerably. Even when operating at sub-GHz frequencies, which offer better range than 2.4 GHz for a given transmit power, 1 Mbps is considered the absolute upper limit. Beyond it, the added transceiver complexity and power do not improve the actual usable throughput, as the overhead of packet acknowledgement and packet processing time become the bottleneck.
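The airtime argument is easy to quantify: for a fixed packet, on-air time is inversely proportional to the gross data rate. The 16-byte overhead below is an assumed preamble-plus-header figure, not taken from any specific standard.

```python
def airtime_ms(payload_bytes: int, rate_bps: float, overhead_bytes: int = 16) -> float:
    """On-air time of one packet; overhead_bytes is an assumed preamble+header size."""
    return (payload_bytes + overhead_bytes) * 8 / rate_bps * 1e3

t_slow = airtime_ms(11, 40e3)    # 11-byte payload at 40 kbps:  5.4 ms on air
t_fast = airtime_ms(11, 500e3)   # same payload at 500 kbps: 0.432 ms on air
```

A 12.5x higher rate shortens the transmit window by the same factor, cutting both active energy and the window during which a collision can occur.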

On the lower end, data rates below 40 kbps are impractical, as they would rule out using standard off-the-shelf 20 parts per million (ppm) crystals. Indeed, the frequency accuracy of these crystals is not sufficient: 20 ppm translates into an 18 kHz frequency error when operating in sub-GHz bands, and a 48 kHz error when operating at 2.4 GHz. A narrow channel requires an accurate crystal, such as a temperature-compensated crystal oscillator (TCXO), on both ends, including the client, which is more costly, power-consuming, and bulky [36]. The optimal baseline gross data rate is considered to be 500 kbps. Depending on the scale of the network, e.g., home, building, district, or city, the applications, and the number of devices, we expect different trade-offs, with actual deployments ranging from 100 kbps to 500 kbps.
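The crystal-accuracy figures follow directly from the ppm definition, taking a 900 MHz carrier as the sub-GHz example:

```python
def freq_error_hz(carrier_hz: float, ppm: float) -> float:
    """Worst-case frequency error of a crystal with the given ppm tolerance."""
    return carrier_hz * ppm * 1e-6

err_subghz = freq_error_hz(900e6, 20)  # 18 kHz at 900 MHz
err_24ghz = freq_error_hz(2.4e9, 20)   # 48 kHz at 2.4 GHz
```

A channel much narrower than roughly twice these errors cannot tolerate cheap crystals on both ends, which is why very low data rates force the use of TCXOs.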

Transmission mode. Full duplex communication is challenging, as it requires good isolation and does not allow for resource sharing between transmit and receive. Full duplex also typically involves different frequencies for downlink and uplink. Since the radio resource is a scarce resource, half-duplex is therefore selected, preferably on the same radio channel.

Channel coding. There is potential for improving link quality and performance with a limited complexity increase by using (adaptive) channel coding together with an Automatic Repeat-Request (ARQ) retry mechanism. As of today, this is considered optional due to the complexity-cost-performance trade-offs achieved with current technologies; however, provisions have to be made for future implementation. For now, flexible packet length is considered a sufficient means of adapting to link quality variations.
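A minimal stop-and-wait ARQ loop shows how little machinery the retry mechanism demands of a constrained device. The loss model and retry limit here are assumptions for illustration, not part of any particular protocol.

```python
import random

def send_with_arq(loss_prob, max_retries=5, rng=random.Random(0)):
    """Stop-and-wait ARQ sketch: retransmit until ACKed or retries run out.

    Returns the number of transmissions used, or None if the link was too lossy.
    """
    for attempt in range(1, max_retries + 1):
        delivered = rng.random() > loss_prob  # packet and its ACK got through
        if delivered:
            return attempt
    return None

send_with_arq(0.0)  # perfect link: succeeds on the first attempt
send_with_arq(1.0)  # fully lossy link: gives up after max_retries
```

The device only needs a packet buffer and a retry counter; the adaptive part, shrinking the packet when the link degrades, ties back to the flexible packet length mentioned above.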


Internet of Things Connectivity Option: Cellular Network Technologies


Review of Existing Cellular Network Technologies: The Pros and Cons

With all the shortcomings of the incumbent technologies discussed above, one might be surprised by the absence of the most widely used and proven communication technology by far: the cellular system. Indeed, current cellular technologies manage to fulfill some of the requirements for ‘good’ IoT networks, most notably the coexistence of many simultaneously connected devices, absence of interference, high reliability, long range, and the capability to service both low-data-rate latency-sensitive and high-data-rate applications on the same infrastructure. However, current cellular technologies have characteristics that rule them out for most emerging IoT applications. This section presents a review of the most prominent existing cellular technologies.

  • 2G (GSM / GPRS / EDGE): 2G is power efficient thanks to its Time Division Multiple Access (TDMA) nature and narrowband 200 kHz channel bandwidth, relatively low-cost, and very long range especially in its 900 MHz band. 2G is not actively maintained and developed anymore, and there should be the possibility of re-farming or even re-auctioning the frequency bands, potentially for IoT technologies.
  • 3G (UMTS / WCDMA / HSPA): 3G is power hungry by design due to continuous and simultaneous (full duplex) receive and transmit using Code Division Multiple Access (CDMA), which has proven to be less power-efficient than TDMA; a wide 5 MHz channel bandwidth to achieve high data rates (Wideband CDMA); and high complexity, especially for dual-mode 2G/3G. WCDMA is not quite suitable for IoT. Even for cellular, WCDMA has evolved back from CDMA to time-slotted High Speed Packet Access (HSPA) for higher data rates, and even pre-allocated timeslots for lower power consumption in HSPA+. In addition, its Frequency Duplex means it has dedicated spectrum for uplink and downlink, such that it is best suited for symmetric traffic, which is not typical for IoT clients. It is well known that battery life is characteristically shorter when operating in 3G mode compared to 2G mode, whether in idle state or during a low data rate (around 12 kbps) voice call.
  • 3G (CDMA2000 1xRTT, 1x EV-DO (Evolution-Data Only)): An evolution of IS-95/cdmaOne, the first CDMA technology, developed by Qualcomm. It shares most of its fundamental characteristics with WCDMA, although with a narrower channel bandwidth of 1.25 MHz.
  • Chinese 3G (UMTS-TDD, TD-SCDMA): Time Division Synchronous Code Division Multiple Access (TD-SCDMA) was developed in the People’s Republic of China by the Chinese Academy of Telecommunications Technology, Datang Telecom, and Siemens AG, primarily as a way to avoid patent and license fees associated with other 3G technologies. As a late-coming 3G technology with a single license granted to China Mobile and deployment only starting in 2009, TD-SCDMA is not widely adopted, and will most likely never be (as it will be superseded by LTE deployments). TD-SCDMA differs from WCDMA in the following ways. First, TD-SCDMA relies on Time Division Synchronous CDMA with a 1.6 MHz channel bandwidth (1.28 Mcps). Second, TD-SCDMA uses Time Duplex with dedicated uplink and downlink time-slots. Third, TD-SCDMA uses a narrower channel bandwidth. Fourth, TD-SCDMA has a synchronous network, with all base stations sharing a time base. Fifth, TD-SCDMA provides lower data rates than WCDMA, but its time-slotted nature provides better power-efficiency, along with lower complexity. Sixth, TD-SCDMA can outperform GSM battery life in idle state, and can perform similarly in voice calls, which is significantly better than WCDMA. Finally, as opposed to WCDMA, TD-SCDMA requires neither continuous nor simultaneous transmit and receive, allowing for simpler system design and lower hardware complexity/cost. These differences actually make TD-SCDMA more suitable than WCDMA for asymmetric traffic and dense/urban areas. Although TD-SCDMA is still too power-hungry to cover the most constrained IoT use cases, it could be considered the most suitable existing cellular technology for IoT.
  • 4G (LTE): 4G is more power-efficient than 3G, with reduced complexity thanks to its data-only architecture (no voice support) and its limited backward compatibility with 2G/3G. It uses an Orthogonal Frequency Division Multiple Access (OFDMA) physical layer in a wide channel bandwidth, typically 20 MHz, for delivering high data rates, 150 Mbps and more with MIMO. Interestingly, the requirements of the IoT have been acknowledged, and some standardization efforts are aimed at lower-complexity, lower-cost Machine-to-Machine (M2M) communication. Most notably, LTE Release 12 Cat-0 introduces Machine-Type Communication (MTC), which allows for a narrower 1.4 MHz channel bandwidth and a lower peak data rate of 1 Mbps, with extended sleep modes for lower power. Release 13 is studying the feasibility of reducing the channel bandwidth further, down to 200 kHz, with peak data rate down to 200 kbps and operation in more sub-GHz frequency bands. Release 12 is foreseen to be commercially available in 2017, and Release 13 in 2018 or later [31].

One of the main drawbacks of cellular is battery consumption and hardware cost. The closest cellular solution to IoT is the Intel XMM 6255 3G modem, the self-proclaimed world’s smallest 3G modem. It claims an area of 300 mm² in a 40 nm process (high density at the expense of higher cost and higher leakage, i.e., power consumption in sleep). Power consumption figures are 65 uA when powered off, 900 uA in 2G / 3G idle state (with unspecified sleep cycle duration), and 580 mA in HSDPA transfer state, with a supply voltage of 3.3-4.4 V (nominal 3.8 V). By comparison, a typical IEEE 802.15.4 / ZigBee SoC using a 180 nm process comes in a 7 x 7 mm (49 mm²) QFN40 package with a sleep current below 5 uA and active receive / transmit under 30 mA, with a supply voltage between 2 V and 3.3 V. When normalizing to the same process, there is a roughly 100-fold increase in area from ZigBee to cellular, which reflects the complexity of the receiver and protocol, and translates into much higher cost and power consumption. This underlines that, although cellular-type protocols could be very suitable for IoT, existing cellular technologies are far too cumbersome and are overkill.
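The "100-fold" normalization can be reproduced using the usual first-order assumption that digital die area scales roughly with the square of the process feature size; this is only an approximation, and the 49 mm² figure is the ZigBee package footprint used as a proxy for area.

```python
# Scale the 40 nm modem area up to the ZigBee SoC's 180 nm process node.
modem_area_mm2 = 300.0    # Intel XMM 6255 at 40 nm
zigbee_area_mm2 = 49.0    # 7 x 7 mm QFN40 package at 180 nm

scaled_modem = modem_area_mm2 * (180 / 40) ** 2   # ~6075 mm^2 at 180 nm
area_ratio = scaled_modem / zigbee_area_mm2       # ~124x, on the order of 100x
```

Even with the rough scaling assumption, the two-orders-of-magnitude gap in silicon area makes the cost and power conclusion robust.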

Another drawback of existing cellular technologies is that they operate in licensed frequency bands. This means that a license holder needs to manage the radio resource, e.g., a network operator that charges users high rates in order to pay for the expensive spectrum licenses. With the rise of the IoT in the coming years, however, we cannot assume that the network operators will stand still. In addition, the regulatory bodies might re-assess the regulatory framework of frequency allocations.

In short, existing cellular network technologies have many characteristics that make them suitable for IoT applications. However, they suffer from the drawback of putting too much pressure on the power consumption of resource-constrained devices. In addition, they operate on scarce and expensive frequency bands. The next section presents a detailed discussion that leverages the beneficial characteristics and addresses the drawbacks of cellular technologies to define the design requirements that make cellular suitable for IoT applications.

Internet of Things Connectivity Option Analysis: IEEE 802.15.4 technologies


Originally released in 2003, IEEE 802.15.4 defines a physical layer (PHY) and media access control layer (MAC) on top of which others can build different network and application layers. The most well-known are ZigBee and 6LoWPAN. IEEE 802.15.4 defines operation in the 2.4 GHz band using DSSS to alleviate narrowband interference, realizing a data rate of 250 kbps; due to spreading, however, the chip rate is 2 Mchips/s. IEEE 802.15.4 also defines operation in sub-GHz bands, but has failed to take full advantage of these frequency bands: the specification only defines very low GFSK data rates, 20 kbps and 40 kbps, in the sub-GHz bands, and only allows a single channel in the European 868 MHz band (868.0-868.6 MHz). These restrictions make the 2.4 GHz variants of IEEE 802.15.4 more attractive, accounting for their wider adoption to date.

The IEEE 802.15.4g amendment, entitled “Amendment 3: Physical Layer (PHY) Specifications for Low-Data-Rate, Wireless, Smart Metering Utility Networks”, was approved in March 2012. IEEE 802.15.4g improves on the low data rates by enabling the usage of more sub-GHz frequency bands, e.g., 169.4-169.475 MHz and 863-870 MHz in Europe, 450-470 MHz in the US, 470-510 MHz and 779-787 MHz in China, and 917-923.5 MHz in Korea. In addition, IEEE 802.15.4g introduces Multi-Rate FSK (MR-FSK), OQPSK, and OFDM physical layers, all applicable to these sub-GHz bands. Nevertheless, MR-FSK can still only achieve 200 kbps in the 863-870 MHz band in Europe, by using Filtered 4FSK modulation. Higher data rates require MR-OFDM, which may prove inappropriately complex for low-cost and low-power devices. With these new physical layers also comes additional complexity, from the support of more advanced Forward Error Correction (FEC) schemes and from backward compatibility, as support for the previous FSK and OQPSK physical layers is mandated. Despite sensible technical considerations that are generally well-suited for powered devices such as smart grid and utility equipment, there is limited availability of 802.15.4g-enabled chipsets. Consequently, it will take some time for IEEE 802.15.4g to evolve and grow before it can be proven a viable option for the IoT.

The most common flavor of IEEE 802.15.4, operating at 2.4 GHz, provides limited range due to fundamental radio theory as mentioned earlier, and is further degraded by the environment. Moisture affects 2.4 GHz propagation significantly (microwave ovens operate around 2.4 GHz precisely because that frequency is well absorbed by water), and any obstruction, such as a wall, door, or window, attenuates 2.4 GHz signals more than sub-GHz signals.

This may be worked around by using multi-hop communication via special relay devices. These relays cannot be regular battery-powered devices, since relaying implies continuous receiving, and the literature indicates that such a multi-hop approach increases overall power consumption. IEEE 802.15.4 is often claimed to offer a mesh topology to compensate for its limited radio coverage and reliability. Yet in practice this is a hybrid topology, because only particular AC-powered relays can provide relaying; resource-constrained end devices still see the network as a star topology.
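A quick calculation shows why a continuously-listening relay cannot run on a battery. The receive current and cell capacity below are assumed, typical-order figures, not measurements of any specific device.

```python
# Continuous receive drains a small cell in days, not years.
rx_current_ma = 10.0   # assumed continuous receive current of the relay radio
battery_mah = 600.0    # assumed coin/AA-class cell capacity

lifetime_h = battery_mah / rx_current_ma   # 60 hours
lifetime_days = lifetime_h / 24            # 2.5 days
```

A device that must last years therefore needs a duty-cycled radio, which is exactly what a relay cannot have, so relays end up AC-powered.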

As can be seen from some studies, multi-hop / mesh topology could be considered a future trend. However, the current single-radio approaches are not suitable for multi-hop and mesh. If relays and devices share the same medium for communication, then a mesh topology is not an efficient solution, as there cannot be multiple devices communicating simultaneously.

Moreover, it has to be acknowledged that efficiently managing a large number of clients, ensuring their connectivity, and balancing the data flow in a star or tree topology network are already challenging enough without adding the unnecessary overhead of a multi-hop mesh solution.

Finally, IEEE 802.15.4 has not been designed to handle coexistence with other collocated IEEE 802.15.4 networks, or to support device mobility. These limitations will prove to be a real problem as the number of connected devices grows dramatically in future IoT applications. Simply imagine the scenario where nearby apartments within the same building each install a compliant IEEE 802.15.4 IoT network and connected objects: IEEE 802.15.4 is not able to handle this situation. Until a solution is found to coordinate with nearby IEEE 802.15.4 networks, IEEE 802.15.4 is not a viable option for the IoT. This holds true for the IEEE 802.15.4-based technologies ZigBee and 6LoWPAN, as well as for BLE and Z-Wave, which likewise have no provision for this kind of scenario.

Internet of Things Wireless Connectivity Option Analysis: Pros and Cons of Bluetooth Classic, Bluetooth Low Energy, and CSRmesh


Analysis of the major Bluetooth technologies, including Bluetooth Classic, Bluetooth Low Energy, and CSRmesh, as solutions for the last 100 m of IoT connectivity.

Bluetooth Classic

Bluetooth Classic, also standardized as IEEE 802.15.1 in 2002 and revised in 2005 (although that standard is no longer maintained), was invented in 1994 as a replacement for RS-232. Bluetooth Classic operates in the 2.4 GHz band and is limited to eight devices per piconet. For the following reasons, Bluetooth Classic is not a suitable protocol for IoT applications:

  • Bluetooth Classic was designed to provide low-latency wireless peripherals and has evolved to provide high data rates. This is achieved at the expense of power consumption.
  • The physical layer (PHY) of Bluetooth Classic only supports long packets (up to 2745 bits of payload) with mandatory channel encoding. This enables higher throughput, but is not suitable for resource-constrained devices.
  • The protocol stack of Bluetooth Classic has grown in complexity and can typically be 128 kB of code size, which is not satisfactory for IoT embedded devices.
  • Bluetooth Classic’s loose specification of the modulation index range makes it hard to improve receiver performance in the future. Consequently, Bluetooth Classic has poor coverage, typically less than 10 m.
  • With a 3-bit device address within a piconet, Bluetooth Classic is limited to a maximum of 8 connected devices, which is obviously insufficient for IoT applications.

Bluetooth Low Energy (BLE)

BLE, also known as Bluetooth v4.0 or Bluetooth Smart, originated from Nokia’s WiBree. Contrary to popular belief, BLE is not compatible with Bluetooth Classic, since its physical layer (PHY) has been redesigned. BLE uses a fixed data rate of 1 Mbps and GFSK modulation. BLE uses short packets and is suitable for low-latency proximity communication. Unfortunately, BLE has the following issues that make it less suitable for IoT applications:

  • BLE operates in the crowded 2.4 GHz frequency band, along with Bluetooth Classic, Wi-Fi, ZigBee, and IEEE 802.15.4. This spectrum crowding will pose a severe reliability challenge to all 2.4 GHz devices, and the problem will only get worse as the number of connected objects increases.
  • BLE is optimized for low-latency sporadic transmissions, and its efficiency therefore degrades dramatically for larger data transfers. With its maximum application payload of 20 bytes per packet, the gross 1 Mbps data rate of BLE translates into a theoretical maximum transfer rate of about 250 kbps, and in practice the actual transfer rate drops below 100 kbps. Compare this with Bluetooth Classic: v1.2 achieves 700 kbps, and v2.1 + EDR reaches a 2 Mbps actual transfer rate. An actual transfer rate of only a tenth of the gross data rate is rather lackluster and translates into poor power efficiency for this type of traffic. Although many IoT applications have only a small amount of data to transfer, e.g., switching off or changing the color of a light bulb, others require somewhat larger transfers. As a result, BLE is not suitable for IoT applications that require higher data transfers.
  • BLE has limited range, so extending the network requires a hybrid topology where some client nodes act as server nodes for other star networks. In Bluetooth-specific terminology this is called a scatternet, which yields high network complexity in real deployments. Moreover, BLE is essentially asynchronous, so this hybrid topology (a mix of star and mesh) causes increased interference and increased power consumption, even inside a single network.
  • Finally, BLE suffers from interference from USB 3.0 and faces challenges when operating near collocated LTE or WiMAX networks. This is reflected in the Bluetooth SIG's filtering recommendations, although workarounds are being developed as well.
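The arithmetic behind the ~250 kbps figure above can be sketched as follows. The packet overhead and inter-frame spacing are taken from the Bluetooth 4.x air interface, but treat this as a back-of-envelope approximation rather than a definitive model:

```python
# Back-of-envelope estimate of BLE 4.x application throughput,
# assuming one 20-byte application payload per data packet and an
# empty acknowledgement packet from the peer.

BIT_US = 1.0          # 1 Mbps PHY -> 1 microsecond per bit
IFS_US = 150          # inter-frame spacing between packets

def packet_air_time_us(payload_bytes):
    # preamble(1) + access address(4) + PDU header(2)
    # + L2CAP header(4) + ATT header(3) + payload + CRC(3)
    overhead = 1 + 4 + 2 + 4 + 3 + 3
    return (overhead + payload_bytes) * 8 * BIT_US

def empty_ack_air_time_us():
    # preamble(1) + access address(4) + PDU header(2) + CRC(3), no payload
    return (1 + 4 + 2 + 3) * 8 * BIT_US

def effective_throughput_kbps(payload_bytes=20):
    cycle_us = (packet_air_time_us(payload_bytes) + IFS_US
                + empty_ack_air_time_us() + IFS_US)
    return payload_bytes * 8 / cycle_us * 1000  # kbit/s

print(round(effective_throughput_kbps()))  # -> 237, near the ~250 kbps figure
```

Real links do worse still, because connection intervals, retransmissions, and scheduling gaps add dead air on top of this idealized packet exchange.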


CSRmesh

In February 2014, CSR plc (formerly Cambridge Silicon Radio) announced the availability of its proprietary CSRmesh software. CSRmesh operates over Bluetooth Low Energy (BLE) with the aim of enabling a mesh topology in place of the restrictive BLE scatternet topology and of providing direct communication between BLE devices. However, we want to note the following:

  • The main advantage of CSRmesh is that it allows smartphone connectivity. It is still questionable whether this connectivity should be achieved via a direct connection to every device or more simply via a gateway or routers, e.g., Wi-Fi or BLE-enabled routers, or even through cellular when a device is out of range.
  • Turning BLE into a mesh-capable protocol is not straightforward. Even if BLE itself is power-efficient for low-duty-cycle, small-packet traffic, enabling mesh functionality requires each device to be simultaneously an observer and a broadcaster. This implies that each device must continuously listen for advertising packets and then switch to advertising the received data for some period.
  • The inefficient use of radio resources inherent in continuous receive makes it difficult to achieve ultra-low power consumption in resource-constrained devices. As reported on the CSR forums, idle-state current consumption of around 3 mA has been observed, which is 100x more than one would expect for a battery-powered IoT device. In short, the asynchronous nature of BLE, optimized for low-duty-cycle / sporadic transmission, poses a challenge for implementing a power-efficient mesh topology on top of the existing BLE protocol stack.
  • Allowing direct smartphone connection to every device may not add functionality. On the contrary, as discussed above, it will drain the device's battery. It is also a potential security threat, because there is no gateway with sufficient computing power to filter access and enforce strong authentication.

questions / comments? fire away!

Internet of Things Wireless Connectivity Option: Wi-Fi Pros and Cons

Today one of the most common connectivity technologies for consumer products is Wi-Fi, whose 802.11b/g flavors use the license-free 2.4 GHz frequency band. A Wi-Fi access point (or hotspot) has a range of about 20 meters (66 feet) indoors and a slightly greater range outdoors. Wi-Fi has the benefit of a large spectrum allowing high data rates, 54 Mbps and still increasing with 802.11n and 802.11ac, in a license-free frequency band that is almost harmonized worldwide [17]. Wi-Fi has been widely adopted and is a great way to provide wireless broadband Internet access. However, Wi-Fi is designed for the high data rates needed for multimedia content, as opposed to many IoT applications. There has been some effort to promote low-power Wi-Fi; however, it remains an order of magnitude hungrier than what battery-powered IoT devices, such as battery-powered sensors, can afford. In short, Wi-Fi is not a suitable candidate for many IoT applications: it is overkill on data rate for most of them and an absolute power guzzler. Wi-Fi is likely to remain the major smartphone and Internet connectivity medium, and one can envision the IoT network gateway being embedded in the Wi-Fi hub already present in most homes, commercial spaces, factories, and offices.

Connectivity Options for the Internet of Things (IoT)

First, it was M2M (machine-to-machine) communication, a technology that was supposed to revolutionize our world but never really took off in a big way due to the cost of embedding cellular connectivity in end devices. Now M2M has been replaced with IoT, the Internet of Things, a better-sounding term and hopefully one with smarter and cheaper options for connecting random things to the Internet. Everyone has their own idea of what the Internet of Things (IoT) is, but one thing is certain: it will become increasingly important in our lives given the ever-decreasing cost of wearable devices, sensors, and other monitoring equipment.

It is important to separate overoptimistic hype from the actual reality of the technologies. Any software or hardware engineer who has ever developed anything useful will tell you that it is easy to define a use case, but a lot harder to actually build a system.

Our aim is to describe technology enablers for IoT, especially the communication technologies and protocols that will be used to build IoT applications.

Applications for IoT Abound

Despite a lot of vague use cases cited in the popular press (many of which seem drawn from science fiction rather than from an understanding of the technology), IoT can be applied in three broad areas:

– Consumer Homes and Personal Networks

– Consumers in their Automotive Vehicles

– Industrial and Office Applications

Homes are an easy frontier for IoT. A typical American home contains home appliances, entertainment systems, and temperature and humidity control units. It is easy to see how a user would like to manage some or all of these devices from a wearable or handheld device. Most use cases are easy to enumerate using the general paradigm of "control X using unit Y."

The automotive sector is already integrating all kinds of sensors into cars and creating enablers such as Advanced Driver Assistance Systems (ADAS). ADAS can warn drivers of everything from low tire pressure to dangers ahead using a combination of communication and data-processing technologies.

Industrial applications for IoT are still in their infancy. Most existing M2M applications will move into the IoT category, be it data collection in the supply truck or on the manufacturing floor. The number and type of such applications is limited only by human imagination and the ability of engineers to create them.

Today, these segments use wireless technologies and Internet interaction, but each typically focuses on what is common within its own industry. The chosen wireless solution needs to adequately address each industry's concerns regarding connectivity options, robust operation, and security features.

Communication Options

The Internet of Things (IoT) is built on an underlying multi-protocol communications framework that can easily move data between embedded “things” and systems located at higher levels of the IoT hierarchy. For designers and application developers, a diverse set of wireless and wired connectivity options provides the glue that holds IoT together.

All IoT sensors require some means of relaying data to the outside world. There’s a plethora of short-range, or local area, wireless technologies available, including: RFID, NFC, Wi-Fi, Bluetooth (including Bluetooth Low Energy), XBee, ZigBee, Z-Wave and Wireless M-Bus. There’s no shortage of wired links either, including Ethernet, HomePlug, HomePNA, HomeGrid, and LonWorks.

Selecting the best network option for an IoT device, however, requires a careful look at various factors for each situation.

  • The scale and size of the IoT network;
  • Data throughput or transfer requirements; and
  • The eventual physical location of the embedded devices, battery size, physical size, etc.

Microcontrollers, which provide the heart of most embedded or wearable devices, already have certain input/output peripherals integrated. Today there is a big choice of good, inexpensive, programmer-friendly devices with nice peripherals, low power consumption, and good cross-platform support. You can get cheap Arduino or Raspberry Pi boards for under 10 dollars.

Just like microcontrollers, designers do not lack options for wireless connectivity and the ICs able to support them. While ANT, Bluetooth®, Wi-Fi, and ZigBee may number among the more familiar alternatives, viable wireless connectivity solutions have also coalesced around standards including 6LoWPAN, DASH7, EnOcean, Insteon, and Z-Wave, among many others. At the same time, smart designers can use proprietary RF approaches. However, for remote and highly mobile applications, cellular broadband with LTE or other wide-area wireless connectivity is the only option.

For Wired Devices, Ethernet Rules Supreme

The Internet of Things (IoT) implies connectivity, and developers have lots of wired and wireless options at their disposal to make it happen. Ethernet tends to dominate the wired realm. IoT frameworks map higher-level protocols onto this type of connectivity, but the devices don’t work until they have a method of communicating with the network.

At this point, Ethernet implementations range from 10 Mb/s up to 100 Gb/s. The high end generally targets the backbone of the Internet, linking server farms in the cloud, while the low to mid range runs on the rest of the devices; the median implementation these days is 1-Gb/s Ethernet. A new class of Ethernet speeds looms on the horizon: 1-Gb/s Ethernet is bumping up to 2.5 Gb/s, with a corresponding hop for higher-speed Ethernet, such as 10 Gb/s moving to 25 Gb/s. This change provides faster throughput using the same cabling.

Other, less common networking possibilities exist on both the wired and wireless sides and are worth mentioning. For example, the HomePlug Alliance's powerline networking uses power connections both to power the interface and as the transmission medium. A host of interoperable products includes devices such as wireless access points and bridges to Ethernet.

IoT Wireless Technology Selection

Here it really gets interesting. There are several proprietary wireless solutions used in every segment as well as standards including 6LoWPAN, ANT+, Bluetooth, Bluetooth low energy, DECT, EDGE, GPRS, IrDA, LTE, NFC, RFID, Weightless, WLAN (also commonly referred to as Wi-Fi), ZigBee, Z-Wave, and others. We can briefly examine the merits of each.

Wi-Fi When You Need Big Bandwidth

Wi-Fi, with its array of 802.11 variants, currently provides the highest throughput of the wireless technologies discussed here. The emerging 802.11ac standard operates in the 5-GHz band (2.4 GHz is left to 802.11n), and routers combining the two bands advertise combined throughput of up to about 5.3 Gb/s. Indoor range is on the order of 100 to 200 feet. The next evolution, 802.11ax, is poised to succeed 802.11ac.

A key challenge for IoT developers is power requirements. Wi-Fi communication requires far more power than some other technologies. Hence, the Wi-Fi option may have to be limited to devices such as mobile phones and tablets, or to places where wired power can be delivered, like home-mounted temperature-control sensors and security-system components.

Wi-Fi on more power-limited budgets is possible, but requires techniques to preserve battery life. For example, a device can send a burst of data at predetermined intervals and then go into sleep mode.
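A rough sketch shows how much such duty cycling buys. The current figures below are illustrative assumptions, not measurements from any particular Wi-Fi module:

```python
# Rough battery-life estimate for a duty-cycled Wi-Fi sensor.
# Assumed (illustrative) figures: 200 mA while actively transmitting,
# 20 uA in deep sleep.

def battery_life_days(battery_mah, active_ma, sleep_ma,
                      burst_s, interval_s):
    """Average out one transmit burst per interval, then divide capacity."""
    duty = burst_s / interval_s
    avg_ma = active_ma * duty + sleep_ma * (1 - duty)
    return battery_mah / avg_ma / 24

# 2000 mAh battery, a 2-second burst every 15 minutes
print(round(battery_life_days(2000, 200.0, 0.02, 2, 900), 1))  # -> 179.4 days
```

Under these assumptions the sensor lasts about half a year; the same device transmitting continuously would be dead in well under a day, which is why sleep scheduling dominates Wi-Fi sensor design.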

Bluetooth Classic and Bluetooth Low Energy (LE)

Bluetooth is a short-range technology utilizing the 2.4- to 2.485-GHz ISM (industrial, scientific, and medical) band. The Bluetooth Special Interest Group manages the technology, with the latest standard being Bluetooth 4.2.

Until smartphones came with media players, Bluetooth was on the verge of dying out, but since then it has come to be embedded in numerous devices. Bluetooth has "classic" and Low Energy (LE) versions; the 4.x standard allows either or both to be implemented. BT Classic and BT Smart/LE are not backward-compatible and are very different technologies that share little more than the name.

BT-LE is designed to allow devices to run and communicate for months or years using low-power sources like button-cell batteries or energy-harvesting devices. Classic and Smart Bluetooth maximum range is about 100 m (330 feet), with data rates up to 3 Mb/s and 1 Mb/s, respectively. However, as with most wireless technologies, actual application throughput is less: 2.1 Mb/s for Classic and 0.27 Mb/s for Smart.

A new feature in BT-LE is Bluetooth beacons, which broadcast information such as device availability, coupons, etc. at certain intervals. This can be very useful for IoT apps.

ZigBee – Sensor Networking with Scalable Mesh Routing

This is my favorite technology. You can get ZigBee modules very cheaply and integrate them into any device. They barely use any battery, can run for a year on a single battery, and are good for sending periodic sensor data. ZigBee can be used for everything from embedded sensors to medical profiling and, naturally, home-automation processes.

ZigBee is a wireless technology developed as an open global standard to address the unique needs of low-cost, low-power wireless M2M networks. The ZigBee standard operates on the IEEE 802.15.4 physical radio specification and operates in unlicensed bands including 2.4 GHz, 900 MHz and 868 MHz.

A key component of the ZigBee protocol is its ability to support mesh networks of up to 65,000 nodes. In a mesh network, nodes are interconnected so that multiple pathways connect each node. Connections between nodes are dynamically updated and optimized through sophisticated, built-in mesh routing tables.
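The self-healing property of a mesh can be illustrated with a toy route search. This is only a sketch using breadth-first search over a four-node graph; production ZigBee stacks use AODV-style route discovery and routing tables rather than this simplistic approach:

```python
# Toy illustration of why a mesh helps: if a link fails, traffic can
# be re-routed through another neighbor. BFS over an adjacency dict
# stands in for real route discovery.
from collections import deque

def shortest_route(links, src, dst):
    """Return the hop-minimal path from src to dst as a node list."""
    frontier, seen = deque([[src]]), {src}
    while frontier:
        path = frontier.popleft()
        if path[-1] == dst:
            return path
        for nxt in links.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None  # no route at all

mesh = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
print(shortest_route(mesh, "A", "D"))   # ['A', 'B', 'D']
mesh["B"].remove("D")                   # link B -> D fails
print(shortest_route(mesh, "A", "D"))   # ['A', 'C', 'D'] - rerouted via C
```

In a star topology the failure of the one link to the coordinator would simply disconnect the node; here the redundant path through C keeps it reachable.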

Other Low-Energy Wireless Options – Z-Wave, 6LoWPAN, MiWi, ANT, etc.

Just like ZigBee, there are other options: some proprietary, some developed by groups of vendors, and some coming through other standardization bodies. They either sit on top of the IEEE 802.15.4 physical radio specification or have their own proprietary radio layers.

Z-Wave, supported by the Z-Wave Alliance, is a competing technology to ZigBee for home-automation projects. Like ZigBee, it supports mesh networking, but its protocol is proprietary. ZigBee chipsets are produced by several silicon vendors, while Z-Wave chips come from only one manufacturer, Sigma Designs.

Z-Wave uses the Part 15 unlicensed ISM band. It operates at 908.42 MHz in the U.S. and Canada but uses other frequencies in other countries, depending on their regulations. Performance characteristics are similar to 802.15.4, including 100-kb/s throughput and a 100-ft. (30.5 m) range.

In addition, a number of vendor-specific protocols are built on 802.15.4, such as Microchip’s MiWi, which are often lighter weight and have fewer licensing restrictions.

6LoWPAN is a low-power wireless mesh network in which every node has its own IPv6 address, allowing it to connect directly to the Internet using open standards. Since each node has its own IP address, standard IP routing protocols can be used.
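As a small concrete example of "every node has its own IPv6 address": a 6LoWPAN node can derive a link-local address directly from its radio's IEEE EUI-64 identifier by flipping the universal/local bit and prepending the fe80::/64 prefix (the modified EUI-64 scheme from RFC 4291; the EUI-64 value below is made up for illustration):

```python
# Deriving a link-local IPv6 address from an 8-byte EUI-64, as a
# 6LoWPAN node can do (RFC 4291 modified EUI-64: flip the
# universal/local bit, prepend fe80::/64).
import ipaddress

def eui64_to_link_local(eui64_hex):
    """Map an 8-byte EUI-64 (hex string) to a link-local IPv6 address."""
    iid = bytearray(bytes.fromhex(eui64_hex))
    iid[0] ^= 0x02                        # flip the universal/local bit
    prefix = b"\xfe\x80" + bytes(6)       # fe80:0000:0000:0000
    return ipaddress.IPv6Address(bytes(prefix + iid))

# hypothetical radio EUI-64: 00-12-4b-00-01-02-03-04
print(eui64_to_link_local("00124b0001020304"))  # fe80::212:4b00:102:304
```

Because the address is computable from the hardware identifier, no DHCP-style address server is needed on the constrained network, and header compression can elide most of the address on the air.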

ANT is an open-access multicast wireless sensor network technology designed and marketed by ANT Wireless (now part of Garmin). Its wireless protocol stack enables semiconductor radios operating in the 2.4 GHz ISM band to communicate. ANT is characterized by low computational overhead, resulting in low power consumption by the radios supporting the protocol, and it enables low-power wireless embedded devices that can operate on a single coin-cell battery for months to years.

In short, 6LoWPAN, ZigBee, Z-Wave, MiWi, and ANT are all competing for the same space.

Cellular Network Options Are Still Available

Most cellular IoT devices aim to use Long Term Evolution (LTE) 4G and, eventually, 5G standards. Cellular technology has the advantage of coverage and availability over large areas. For devices mounted in moving trains, trucks, roadside emergency units, or cars, it may be the only viable option.

LTE and LTE-Advanced both provide excellent throughput. LTE provides close to 300 Mb/s, 4G LTE-Advanced will provide 1 Gb/s, and 5G promises 10 Gb/s.

The major problem is the recurring cost of cellular connectivity since cellular operation requires plans from service providers.

Device Selection Criteria for IoT Designers

IoT is about creating the most efficient, application-specific network of connected devices. Connected devices all share a few key needs:

  • The need for smarter power consumption, data storage, and network management;
  • The need for stronger safeguards for privacy and security;
  • The need for high-performance microcontrollers (MCUs), sensors, and actuators; and
  • The ability to communicate without losing information.

To narrow down the list of options, compare the technologies against the following key IoT needs:

  • Cost efficiency. Most IoT devices are low cost and need affordable radio solutions, so the balance of performance and cost is very important.
  • Small size. IoT devices are typically small; the radio technology, with its antenna, battery, etc., must physically fit into the sensor device's housing.
  • Secure communication. Authentication and data encryption must be supported by the chosen wireless technology, and it should be possible to build end-to-end secure applications.
  • Low power consumption. Since most IoT devices run on batteries or energy-harvesting technologies, the radio technology must have ultra-low power consumption.
  • Strong available ecosystem. For any device selection you will need to examine its ecosystem, since interoperability with other devices will be important.
  • High reliability under noisy conditions. IoT devices will operate in less-than-perfect conditions, so the wireless technology must cope with signal noise, interference, and other environmental factors.
  • Ease of use. Configuration can be left to experts in industrial settings, but consumers need plug-and-play simplicity.
  • Range-extension capability. Though IoT often operates over short distances, the chosen technology should offer enough range coverage, or some range-extension capability.
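One simple way to apply these criteria is a weighted scoring matrix. The weights and 1-5 scores below are purely illustrative assumptions for a hypothetical battery-powered home sensor, not benchmark results; the point is the method, not the numbers:

```python
# Weighted decision matrix over the selection criteria above.
# All weights and 1-5 scores are illustrative assumptions for a
# hypothetical battery-powered home sensor.

weights = {"cost": 3, "size": 2, "security": 2, "power": 5,
           "ecosystem": 3, "reliability": 4, "ease": 2, "range": 3}

scores = {  # criterion -> 1 (poor) .. 5 (excellent), assumed values
    "Wi-Fi":  {"cost": 4, "size": 3, "security": 4, "power": 1,
               "ecosystem": 5, "reliability": 3, "ease": 4, "range": 4},
    "BLE":    {"cost": 4, "size": 5, "security": 3, "power": 4,
               "ecosystem": 5, "reliability": 3, "ease": 4, "range": 2},
    "ZigBee": {"cost": 4, "size": 4, "security": 3, "power": 5,
               "ecosystem": 4, "reliability": 4, "ease": 3, "range": 4},
}

def rank(scores, weights):
    """Weighted sum per technology, best first."""
    totals = {tech: sum(weights[c] * s for c, s in crit.items())
              for tech, crit in scores.items()}
    return sorted(totals.items(), key=lambda kv: -kv[1])

for tech, total in rank(scores, weights):
    print(tech, total)   # ZigBee 97 / BLE 89 / Wi-Fi 78
```

With power weighted heavily, ZigBee comes out on top for this hypothetical sensor; shift the weights toward throughput and ecosystem and Wi-Fi wins instead, which is exactly the "match the design to the target market" point.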

Matching the Design to the Target Market

Despite the bewildering list of connectivity options, system designers can usually find the best option for a particular IoT device. A design is often constrained by application needs, performance requirements, and environmental limitations. The need for compatibility in established markets may also affect the best connectivity choice.

The good news is that if you are a hardware or embedded-system designer, the choice of components is plentiful.

You can find a diverse set of related hardware solutions, including modules and ICs: ANT connectivity from vendors including Nordic Semiconductor, Panasonic, and Texas Instruments; ZigBee solutions from Atmel, Freescale, and Microchip; Bluetooth/BLE solutions from CSR, RFM, and STMicroelectronics; and 6LoWPAN devices from TI, STMicroelectronics, Sensinode, Atmel, and others.

If you are designing IoT devices or want to create IoT software and need individual consulting, feel free to connect with me.

Great Write-Up about Pebble and the Apple Watch

Pebble vs. Apple: David and Goliath This Ain’t


By this time next week Apple will have, once again, sucked all the oxygen out of the room. Next Monday, at one of the company’s time-tested high-profile events, we’ll all be attending the coming out party for Apple Watch.

But this week, the smart watch news is all about Pebble, which can reasonably claim to have energized the space three years ago in a very Apple way: Exploding onto the scene with a breakthrough device someone else thought of first.

Pebble returned to Kickstarter last week in a bald attempt to capitalize on the smart watch buzz created by Apple’s imminent entry into the space with Pebble Time, a sportier model with a new approach to notifications it calls Timeline. They’ve promised a month of news, timed to the 30-day campaign, which includes today’s reveal of — surprise! — an upgrade option to Pebble Time Steel, a steal at only $80 more than the (long since taken) $170 batch (Yes, I’m in. Again).

Pebble and Apple isn’t David and Goliath, at least not as far as Pebble CEO Eric Migicovsky is concerned. “Whether delusional, manically focused or simply well-rehearsed, Migicovsky chose to view the Apple announcement as a plus for Pebble,” Steven Levy writes in Backchannel. “‘It’s pretty incredible to see the world’s largest company come into the watch space,’ he said. ‘It’s validating something I’ve known for the last six and a half years — that the next generation of computing will be on your body.’”

What is undeniably true is that Pebble has sold more than one million watches in three years, and six days into a 30-day Kickstarter campaign, has sold another $14 million worth. With that, the company has re-claimed the title (which it first took with the original Pebble) as the most-funded Kickstarter project ever.

So, there is that.

I first took notice of Pebble in my Reuters column when they broke all records on their first Kickstarter campaign, in April 2012:

A Kickstarter project for a device you wear on your wrist, but that needs a smartphone to do anything really interesting, has raised more than $5.3 million in eight days. This is far and away the most anyone has ever raised on Kickstarter, and it’s happening – with a gadget in a category that has a pretty dismal track record – at a sales pace that would make even Apple sit up and take notice.

As much as I like to dine out on those last words, I’m not really sure Apple did “sit up and take notice” as much as it might have already been working on the idea for quite some time.

The smart watch has all the earmarks of the sort of device-that-time-forgot Apple often manages to turn into something relevant. Microsoft had tried and failed with it a decade before the first Pebble (note the similarities to the tablet, which Apple reinvented a decade after the Redmond giant tried to market its own). Various kinds of smart watch have been around ever since, getting little love. Even Pebble was going nowhere fast as a developer of a device tethered to Blackberry phones, which were about to fall off a cliff.

What changed? Two very important, intertwined things.

Smart watches were originally conceived of as stand-alone devices. The limitations are now pretty obvious, chiefly the tiny screen. Remember, though, that at the time of Microsoft’s SPOT, screens on mobile phones were also pretty tiny.

But they didn’t do all that much. Unlike the Dick Tracy device that people of a certain age remember fondly, you couldn’t even talk to anyone with it. I mean, we KNEW that watches were communications devices in the early 1960s. So why weren’t they in the year 2002?!

Apple went a long way toward setting the stage for the emergence of the smartphone as the must-have mobile device in 2007, with the first iPhone. Among its new features was a ginormous screen, which made activities like web surfing credible on a mobile device. So successful was the smartphone that it created a new version of a problem futurist Alvin Toffler had identified in 1970: information overload. Hard-core techies, like Gigaom’s Mathew Ingram, would soon argue that you should choose a smartphone based on how well it wrangles notifications, above all other features.

And that was the new opening for the resurgence of the smart watch. The trick, from my perspective, is to avoid mission creep. It is to remember that the opportunity lies in extending the utility of the smart phone, not replacing it.

But the existential question about whether smart watches are a mainstream consumer item is valid. Notification management is pretty hard core. One new use case: There are unique health monitoring opportunities for something strapped to your wrist. Pebble steals a little of that thunder today — surprise! — with a reveal of the smartstrap, which can “contain electronics and sensors to interface directly with apps running on Pebble Time.” That is another open invitation to developers, who have already flocked to the Pebble platform in very respectable numbers — 26,000 have written 6,000 apps.

Apple may bury Pebble, or its entry into the smart watch space might lift all boats — even Android, whose fans will tell you it already boasts a range of excellent choices with features Apple will reinvent, or steal, depending on your point of view.

So, for a smart watch aficionado these are exciting times. If Apple is wildly successful, look for it to extend coverage even to Android devices, the way iTunes spread to Windows. Apple’s entry is a make-or-break event that will answer whether there is a massive, pent-up hunger for this kind of device, or whether it’s only a plaything for people like me.

Either way, it’s about time.