Understanding today's IoT technology becomes clearer when you learn how it developed over the decades. Since the early eighties, the IoT industry has passed major milestones that define the wide range of ways it can empower a business. Here are some of the most important milestones in IoT history.
It Started with ARPANET
Electronic networking began with the Advanced Research Projects Agency Network (ARPANET), a computer networking project of the United States Department of Defense's ARPA (now DARPA). The idea of a network of interconnected nodes grew out of J.C.R. Licklider's early vision, and in 1966 Bob Taylor launched the project. Taylor's team combined these ideas with Donald Davies' work on packet switching to formulate a plan for computers to exchange data.
The first two computers on the network were connected in 1969 at UCLA and the Stanford Research Institute. Soon other nodes were added at the University of California, Santa Barbara and the University of Utah. ARPANET was declared operational in 1975 and expanded in the early eighties. It was decommissioned in 1990, after partnerships among telecom and tech companies led to the commercial development of the internet.
The First Connected Soda Machine
In 1982, students at Carnegie Mellon University made the first soda machine to connect to the ARPANET. This experiment inspired a wave of inventors to embed network connectivity into their products.
The Coke machine held six columns of bottles, each with an indicator light that turned red when the column was empty. The students installed a board that tracked the status of each light and connected it to the ARPANET, which served fewer than 300 computers at the time. The system also tracked how many minutes each column's bottles had been in the machine since restocking. Anyone on the ARPANET could query the machine's status, saving students a long walk when the machine was empty.
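That status board amounts to a handful of lights and timestamps, which can be sketched in a few lines of code. The following is a hypothetical modern reconstruction in Python, not the original 1982 implementation; the class and field names are illustrative.

```python
from dataclasses import dataclass
import time

@dataclass
class Column:
    empty: bool = True          # mirrors the machine's red indicator light
    restocked_at: float = 0.0   # when bottles were last loaded

class CokeMachineBoard:
    """Illustrative stand-in for the board the CMU students wired up."""

    def __init__(self, columns: int = 6):
        self.columns = [Column() for _ in range(columns)]

    def restock(self, index: int) -> None:
        col = self.columns[index]
        col.empty = False
        col.restocked_at = time.time()

    def dispense(self, index: int, last_bottle: bool = False) -> None:
        # The real machine only exposed the light; we flip it when
        # the last bottle in a column is taken.
        if last_bottle:
            self.columns[index].empty = True

    def status(self) -> str:
        # What a remote user would check before walking over.
        lines = []
        for i, col in enumerate(self.columns):
            if col.empty:
                lines.append(f"column {i}: EMPTY")
            else:
                minutes = (time.time() - col.restocked_at) / 60
                lines.append(f"column {i}: stocked {minutes:.0f} min ago")
        return "\n".join(lines)

board = CokeMachineBoard()
board.restock(0)
print(board.status())
```

The original exposed this data over the ARPANET; here a remote query simply corresponds to calling `status()`.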
Launch of the TCP/IP Protocol
The adoption of TCP/IP in 1983 marked the beginning of an ARPANET standard that the internet still runs on today. The previous year, the Department of Defense had declared it a military standard. This set of protocols defines how data is packetized, transmitted and received, and the standards are maintained by the Internet Engineering Task Force (IETF).
The TCP/IP protocol suite was considered far more powerful and flexible than the earlier Network Control Protocol (NCP) it replaced on the ARPANET. This transition was an important shift toward what became known as the internet. The two main designers of the new system were Robert E. Kahn and Vinton Cerf, both veterans of the NCP-era ARPANET. The new system broke data down into packets that were sent through routers and then reassembled at the recipient's location. The TCP layer handles segmenting data and tracking port numbers, then hands packets to the IP layer, which adds the source and destination IP addresses used for routing.
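The packetize-route-reassemble idea can be illustrated with a short sketch. This is a toy Python example, not real TCP (which uses byte-offset sequence numbers, checksums and acknowledgments); it shows why sequence numbers let the receiver rebuild data even when routers deliver packets out of order.

```python
import random

def packetize(data: bytes, size: int) -> list[tuple[int, bytes]]:
    """Split data into (sequence_number, chunk) packets."""
    count = -(-len(data) // size)  # ceiling division
    return [(i, data[i * size:(i + 1) * size]) for i in range(count)]

def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
    """Packets may arrive in any order; sort by sequence number."""
    return b"".join(chunk for _, chunk in sorted(packets))

message = b"Hello, ARPANET!"
packets = packetize(message, 4)
random.shuffle(packets)          # simulate out-of-order delivery
assert reassemble(packets) == message
```

Real TCP adds retransmission of lost segments and flow control on top of this basic mechanism, while IP below it handles addressing and routing.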
CMU's NavLab Project
Autonomous navigation research took an early step in 1984 at Carnegie Mellon University (CMU) with a project called NavLab. The goal was to create a machine that could navigate autonomously using computer vision. It was funded by DARPA, which aimed to build an Autonomous Land Vehicle (ALV).
At the time, the state of the art was indoor robots tethered by cables. The first self-contained vehicle was NavLab 1, built in 1986 from a Chevy van at a cost of $1 million. The follow-up model was built on an army ambulance chassis and could reach speeds of 70 mph. These early models lacked the sensor sensitivity and other features of modern systems. Several engineers from these early efforts later moved to Google and other companies working on self-driving cars. The NavLab project continues at CMU to this day; the latest vehicle is NavLab 11.
Rise of the World Wide Web
Tim Berners-Lee at CERN began to envision the World Wide Web in 1989, building it on top of the internet's packet-switched infrastructure. Berners-Lee opened the system to the public in 1991. At MIT in 1994, he went on to found the World Wide Web Consortium (W3C), which is now the main international standards organization for the web.
The W3C was supported by DARPA and the European Commission. Within a few years of inception, the W3C expanded to offices around the world. Today, the W3C partners with hundreds of organizations to develop protocols and guidelines for the web, including HTML updates.
The popularity of the WWW exploded in the mid-nineties, fueling the dotcom boom. Soon every business needed a website and needed to communicate electronically through email. These changes alone were significant, even if businesses overpaid web designers for simple web pages. It marked the beginning of online storefronts such as Amazon.
Early IoT Devices
The first IoT device is often considered to be a toaster that John Romkey connected to the internet in 1990. Using TCP/IP networking, the toaster could be turned on and off remotely. Within a few years, webcams began to appear, such as the coffee-monitoring Trojan Room coffee pot camera at the University of Cambridge, which went online in 1993.
The concept of smart home devices began to surface in 2005 with the Nabaztag, an early virtual assistant with an electronic voice that reported information such as stock prices and the weather to its owners. Many tech historians consider 2008 to be the beginning of the modern IoT era. That's when the number of connected devices exceeded the number of people on the planet, according to Cisco IBSG.
The term "The Internet of Things" began to appear on Gartner's emerging technology lists in 2011. Starting in 2013, thermostats and home lighting began to incorporate IoT sensors, allowing homeowners to control temperature and lights from their smartphones.
Introduction of Wearables
Some people might credit the first wearable as a 16th-century abacus ring. But as far as electronic devices go, Steve Mann built a backpack-mounted computer in 1981. In 1994, Mann demonstrated a wearable wireless camera that transmitted images to the web, ushering in the wearable webcam era.
As the internet grew in popularity in the nineties, several other internet-connected devices began to appear on the market. In 1996, DARPA hosted a "Wearables in 2005" workshop to give consumers a peek into the future. Then in 1997, Carnegie Mellon University, MIT and Georgia Tech collaborated on hosting the IEEE International Symposium on Wearable Computers (ISWC), unveiling papers on new sensors and hardware capable of connecting online. In 1998, Mann built what is often cited as the first Linux-based smartwatch.
Wearables have become widespread in the healthcare industry and have helped save many lives. The market grew so large for IoT designers that by 2018 the term "Internet of Medical Things" (IoMT) had emerged. This class includes devices that monitor heart rate and blood glucose levels. It's a fast-growing market, projected to reach $176 billion by 2026.
Coining of the Term "IoT"
The term "Internet of Things" was coined in 1999 by Kevin Ashton, who used it in a presentation at Procter & Gamble combining the ideas of RFID and the internet. Ashton went on to lead the Auto-ID Center at MIT, a network of academic research labs whose work produced the RFID-based identification system called the Electronic Product Code (EPC).
The term "IoT" started gaining media attention by 2004, but its popularity took about a decade to blossom. Usage of the term began to pick up around 2010, yet it wasn't until 2014 that mass awareness arrived. That's when Google announced it was acquiring Nest for $3.2 billion.
Another sign IoT was going mainstream was its inclusion at the annual Las Vegas-based Consumer Electronics Show (CES) in 2014. As knowledge of the term grew, similar terms such as "Industrial Internet of Things" (IIoT) gained widespread use in tech publications. People now commonly use "IoT" when referring to smart energy meters, home automation and wearable devices. Any physical object embedded with sensors and linked through a wireless network can be considered an IoT device.
The First Internet-Enabled Appliances
After the development of Bluetooth 1.0 in 1999, the new century saw an explosion of internet-enabled appliances. In 2000, LG announced the world's first internet-enabled refrigerator, the Internet Digital DIOS. It wasn't a big hit, partly because of its high price, but it's remembered as the beginning of internet-enabled appliances.
Soon many other internet-connected devices appeared on the market. The Ericsson T36, announced in 2000, was the first mobile phone to integrate Bluetooth, though the T39 was the first to actually ship, in 2001. As the press began to use the term "IoT" more in 2004, depicting warehouses full of hundreds of wireless sensors, interest and demand for IoT devices grew. During this time, awareness of RFID also escalated.
The IoT boom began in 2008. By 2009, Bluetooth 3.0 allowed high-speed transfers over an 802.11 link, and Google was testing self-driving cars. In 2010, Nest, a company founded by former Apple engineers, began making smart home devices. Its first product was a thermostat that learned user habits to build a heating schedule. Today Google Nest makes smart speakers, smoke detectors, security systems and various other smart devices.
Launch of NB-IoT
In 2016, the Third Generation Partnership Project (3GPP) standardized NB-IoT (Narrowband IoT) for low-power wireless communication. The standard, developed by telecom companies over the preceding year, uses licensed frequency bands to avoid radio interference. Not only can NB-IoT operate over long distances, it also performs especially well in dense urban areas. Common applications now include smart wearables, street lighting and road traffic monitoring.
NB-IoT is designed to provide deep indoor coverage at low cost with high connection density. It's particularly suited to devices that send small amounts of data infrequently, and it keeps the power consumption of user devices low. Today, the standard is classified as a 5G technology, is supported by all major mobile equipment manufacturers, and can coexist with 2G, 3G and 4G mobile networks. Large campuses can benefit from the combination of NB-IoT and 5G.
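That emphasis on small, infrequent messages shapes how NB-IoT device payloads tend to be encoded. The field layout below is purely hypothetical, a sketch of how a sensor reading might be packed into a few bytes; it is not part of the 3GPP specification, which covers the radio transport rather than application payloads.

```python
import struct

# Hypothetical 5-byte uplink payload for a battery-powered sensor node:
# 2-byte device id, temperature in tenths of a degree, 1-byte battery level.
PAYLOAD_FORMAT = ">HhB"  # big-endian: unsigned short, signed short, byte

def encode_reading(device_id: int, temp_c: float, battery_pct: int) -> bytes:
    return struct.pack(PAYLOAD_FORMAT, device_id, int(temp_c * 10), battery_pct)

def decode_reading(payload: bytes) -> tuple[int, float, int]:
    device_id, temp_tenths, battery = struct.unpack(PAYLOAD_FORMAT, payload)
    return device_id, temp_tenths / 10, battery

payload = encode_reading(42, 21.5, 87)
print(len(payload))  # 5 bytes per reading
```

Keeping each transmission to a few bytes, sent perhaps a handful of times per day, is what lets NB-IoT devices run for years on a small battery.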
These key milestones in IoT technology helped chart the course of business networking technology through today. The more businesses want to learn how to make their operations more efficient, the more demand for IoT will increase.