From the early 2000s onwards, mobile carriers began using the label ‘4G’ to describe a range of technologies and services, and the industry needed clarity on what the term actually meant.
In 2008, the International Telecommunication Union (ITU) set out its requirements for 4G connectivity, which any service had to meet in order to describe itself as 4G. For mobile use, connection speeds needed a peak of at least 100 megabits per second; for stationary use, at least 1 gigabit per second. Back in 2008, these speeds were not yet practically achievable; they were intended more as a target for developers to aim for.
At that time, a technology called WiMax was a real contender to become the dominant 4G connectivity solution. Sometimes dubbed “Wi-Fi on steroids”, WiMax operated on the same principles as very long-range Wi-Fi, with the potential to replace both traditional landlines and mobile internet.
For various practical and cost reasons, however, most carriers decided not to invest in entirely new WiMax networks. Instead, they opted for LTE, which essentially involved upgrading existing network technology rather than building new infrastructure from scratch.
The first iteration of LTE was not created with IoT usage in mind. Power consumption was around 50% greater than that of 3G technologies. LTE also uses many more frequency bands than 3G and 2G, which can mean deploying multiple modems to cover a global rollout.
However, over the last few years, LTE has given rise to evolved technologies designed for industrial, commercial and consumer-focused IoT projects:
- LTE Cat M or LTE-M (Long-Term Evolution for Machines)
- LTE Cat NB or NB-IoT (Narrowband IoT)
- LTE Cat 1
Both LTE-M and NB-IoT require upgraded network infrastructure, whereas LTE Cat 1 can operate on original 4G LTE networks without any upgrades.