Internet of Things Connectivity Option: Cellular Network Technologies

[Figure: frequency reuse in a cellular network]

Review of Existing Cellular Network Technologies: The Pros and Cons

Given all the shortcomings of the incumbent technologies discussed above, one might be surprised by the absence of the most widely used and proven communication technology by far: the cellular system. Indeed, current cellular technologies fulfill some of the requirements of a ‘good’ IoT network, most notably the coexistence of many simultaneously connected devices, the absence of interference, high reliability, long range, and the ability to service both low-data-rate latency-sensitive and high-data-rate applications on the same infrastructure. However, current cellular technologies also have characteristics that rule them out for most emerging IoT applications. This section reviews the most prominent existing cellular technologies.

  • 2G (GSM / GPRS / EDGE): 2G is power-efficient thanks to its Time Division Multiple Access (TDMA) operation and narrowband 200 kHz channel bandwidth, relatively low-cost, and very long range, especially in its 900 MHz band. However, 2G is no longer actively maintained and developed, and its frequency bands may be re-farmed or even re-auctioned, potentially for IoT technologies.
  • 3G (UMTS / WCDMA / HSPA): 3G is power-hungry by design: it relies on continuous and simultaneous (full-duplex) receive and transmit using Code Division Multiple Access (CDMA), which has proven less power-efficient than TDMA; it uses a wide 5 MHz channel bandwidth to achieve high data rates (Wideband CDMA); and it entails high complexity, especially for dual-mode 2G/3G devices. WCDMA is therefore not well suited for IoT. Even for cellular use, WCDMA has evolved back from pure CDMA to time-slotted High Speed Packet Access (HSPA) for higher data rates, and to pre-allocated timeslots for lower power consumption in HSPA+. In addition, its Frequency Division Duplex scheme dedicates separate spectrum to uplink and downlink, making it best suited for symmetric traffic, which is not typical of IoT clients. It is well known that battery life is characteristically shorter in 3G mode than in 2G mode, both in idle state and during a low-data-rate (around 12 kbps) voice call.
  • 3G (CDMA2000 1xRTT, 1x EV-DO (Evolution-Data Only)): As an evolution of IS-95/cdmaOne, the first CDMA technology, developed by Qualcomm, CDMA2000 shares most of its fundamental characteristics with WCDMA, although with a narrower channel bandwidth of 1.25 MHz.
  • Chinese 3G (UMTS-TDD, TD-SCDMA): Time Division Synchronous Code Division Multiple Access (TD-SCDMA) was developed in the People’s Republic of China by the Chinese Academy of Telecommunications Technology, Datang Telecom, and Siemens AG, primarily as a way to avoid the patent and license fees associated with other 3G technologies. As a late-coming 3G technology, with a single license granted to China Mobile and deployment only starting in 2009, TD-SCDMA is not widely adopted, and most likely never will be (as it will be deprecated by LTE deployments). TD-SCDMA differs from WCDMA in the following ways. First, it relies on Time Division Synchronous CDMA with a narrower 1.6 MHz channel bandwidth (1.28 Mcps). Second, it uses Time Division Duplex with dedicated uplink and downlink time-slots. Third, its network is synchronous, with all base stations sharing a time base. Fourth, it provides lower data rates than WCDMA, but its time-slotted nature provides better power-efficiency along with lower complexity. Fifth, it can outperform GSM battery life in idle state and perform similarly in a voice call, which is significantly better than WCDMA. Finally, as opposed to WCDMA, TD-SCDMA requires neither continuous nor simultaneous transmit and receive, allowing for a simpler system design and lower hardware complexity and cost. These differences make TD-SCDMA more suitable than WCDMA for asymmetric traffic and dense/urban areas. Although TD-SCDMA is still too power-hungry to cover the most constrained IoT use cases, it could be considered the most suitable existing cellular technology for IoT.
  • 4G (LTE): 4G is more power-efficient than 3G and has reduced complexity thanks to its data-only architecture (no native voice support) and its limited backward compatibility with 2G/3G. It uses an Orthogonal Frequency Division Multiple Access (OFDMA) physical layer in a wide channel bandwidth, typically 20 MHz, to deliver high data rates of 150 Mbps and more with MIMO. Interestingly, the requirements of the IoT have been acknowledged, and some standardization efforts target lower-complexity, lower-cost Machine-to-Machine (M2M) communication. Most notably, LTE Release 12 Cat-0 introduces Machine-Type Communication (MTC), which allows for a narrower 1.4 MHz channel bandwidth and a lower peak data rate of 1 Mbps, with extended sleep modes for lower power. Release 13 is studying the feasibility of reducing the channel bandwidth further, down to 200 kHz, with a peak data rate down to 200 kbps and operation in more sub-GHz frequency bands. Release 12 is foreseen to be commercially available in 2017, and Release 13 in 2018 or later [31].
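As a rough illustration of the bandwidth/data-rate trade-off in the LTE variants above, the peak spectral efficiency can be computed directly from the figures quoted in the text. This is a back-of-the-envelope sketch only; actual efficiency depends on the release, device category, modulation, and radio conditions:

```python
# Channel bandwidth and peak data rate figures as quoted in the text above.
# Illustrative only: real-world throughput varies with conditions and category.
lte_variants = {
    # name: (channel_bandwidth_hz, peak_rate_bps)
    "LTE (20 MHz, MIMO)":      (20e6, 150e6),
    "LTE Rel-12 Cat-0 (MTC)":  (1.4e6, 1e6),
    "LTE Rel-13 (study item)": (200e3, 200e3),
}

def spectral_efficiency(bw_hz, rate_bps):
    """Peak spectral efficiency in bit/s/Hz: peak rate divided by bandwidth."""
    return rate_bps / bw_hz

for name, (bw, rate) in lte_variants.items():
    print(f"{name}: {spectral_efficiency(bw, rate):.2f} bit/s/Hz")
```

The drop from 7.5 bit/s/Hz to 1 bit/s/Hz shows that the IoT-oriented variants deliberately trade spectral efficiency for simpler, cheaper, lower-power modems.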

One of the main drawbacks of cellular is battery consumption and hardware cost. The closest cellular solution to IoT is the Intel XMM 6255 3G Modem, the self-proclaimed world’s smallest 3G modem. The Intel XMM 6255 claims an area of 300 mm² in a 40 nm process (high density at the expense of higher cost and higher leakage, i.e., power consumption in sleep). Its current consumption figures are 65 µA when powered off, 900 µA in 2G/3G idle state (with unspecified sleep-cycle duration), and 580 mA in HSDPA transfer state, with a supply voltage of 3.3-4.4 V (nominal 3.8 V). As a point of comparison, a typical IEEE 802.15.4 / ZigBee SoC in a 180 nm process comes in a 7 x 7 mm (49 mm²) QFN40 package with a sleep current below 5 µA and an active receive/transmit current under 30 mA, with a supply voltage between 2 V and 3.3 V. When normalizing to the same process, there is a roughly 100-fold increase in silicon area from ZigBee to cellular, which reflects the complexity of the receiver and protocol, and translates into much higher cost and power consumption. This underlines that, although cellular-type protocols could be very suitable for IoT, existing cellular technologies are far too cumbersome and are overkill.
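The current-draw figures above can be turned into a back-of-the-envelope battery-life comparison. The battery capacity and duty cycle below are assumptions chosen purely for illustration; real sleep-cycle behaviour is device- and network-specific:

```python
# Back-of-the-envelope battery-life estimate from a simple duty-cycle model,
# using the current-draw figures quoted in the text above.

def battery_life_hours(capacity_mah, idle_ma, active_ma, duty):
    """Hours of operation if the radio is active a fraction `duty` of the time
    and idle/sleeping the rest. Average current is a weighted mean."""
    avg_ma = duty * active_ma + (1.0 - duty) * idle_ma
    return capacity_mah / avg_ma

CAPACITY_MAH = 1000.0  # assumed: a small 1000 mAh battery
DUTY = 0.001           # assumed: radio active 0.1% of the time

# Intel XMM 6255 (from the text): 0.9 mA in 2G/3G idle, 580 mA in HSDPA transfer
cellular_h = battery_life_hours(CAPACITY_MAH, 0.9, 580.0, DUTY)

# Typical ZigBee SoC (from the text): <5 uA sleep, <30 mA active
zigbee_h = battery_life_hours(CAPACITY_MAH, 0.005, 30.0, DUTY)

print(f"cellular: ~{cellular_h / 24:.0f} days, ZigBee: ~{zigbee_h / 24:.0f} days")
```

Even at a 0.1% duty cycle, the cellular modem's idle current alone dominates its budget, yielding roughly a month of operation where the ZigBee SoC lasts for years; this is why multi-year battery life targets rule out existing cellular modems.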

Another drawback of existing cellular technologies is that they operate in licensed frequency bands. This means that a license holder needs to manage the radio resource, e.g., a network operator that charges users high rates in order to recoup the cost of expensive spectrum licenses. With the rise of IoT in the coming years, however, we cannot assume that network operators will stand still. In addition, regulatory bodies might re-assess the regulatory framework of frequency allocations.

In short, existing cellular network technologies have many characteristics that make them suitable for IoT applications. However, they put too much pressure on the power consumption of resource-constrained devices, and they operate in scarce and expensive frequency bands. The next section presents a detailed discussion that leverages the beneficial characteristics of cellular technologies and addresses their drawbacks, in order to define the design requirements that make cellular suitable for IoT applications.
