January 11

Major Milestones in IoT Technology History

Understanding today's IoT technology becomes clearer when you learn how it developed over the decades. From Tesla's wireless experiments to modern satellite networks, the industry has passed major milestones that define the wide range of ways it can empower a business. Here are some of the most important milestones in IoT history.


Nikola Tesla’s Inventions

Nikola Tesla was a Serbian American inventor, electrical engineer, and physicist who is best known for his contributions to the design of the modern alternating current (AC) electricity supply system. But he also made several major contributions to the field of radio communication.

He patented the Tesla coil, a high-voltage, high-frequency transformer used in radio technology and other applications, in 1891. That same year, he demonstrated wireless transmission using the coil. Tesla soon discovered that he could transmit and receive radio signals between coils tuned to resonate at the same frequency.

Tesla designed an early radio transmitter and receiver, which he demonstrated in 1893. This was one of the first demonstrations of wireless communication. He also presented the fundamentals of radio in 1893 during his public presentation, "On Light and Other High-Frequency Phenomena."

In 1897, Tesla conducted a series of experiments in which he transmitted radio signals over a distance of several miles. This was one of the first demonstrations of long-range radio communication.

An article from pbs.org gives a timeline of Tesla's revolutionary demonstration of his radio patent. It mentions that at the Electrical Exhibition of 1898, in Madison Square Garden, Tesla staged a scientific tour de force, a demonstration completely beyond the generally accepted limits of technology. His invention, covered in patent No. 613,809 (1898), took the form of a radio-controlled boat, a heavy, low-lying, steel craft about four feet long.

Inasmuch as radio hadn't been officially patented yet (Tesla's basic radio patent was filed in September 1897, but granted in March 1900), examiners from the US Patent Office were reluctant to recognize improbable claims made in the application "Method of and Apparatus for Controlling Mechanism of Moving Vessels or Vehicles." Confronted with a working model, however, examiners quickly issued the approval. Tesla's radio-controlled boat was an early precursor of modern drones and other remotely controlled vehicles.

Finally, in his 1926 interview with John B. Kennedy, Tesla essentially predicted the internet and the smartphone.

"When wireless is perfectly applied the whole earth will be converted into a huge brain, which in fact it is, all things being particles of a real and rhythmic whole. We shall be able to communicate with one another instantly, irrespective of distance. Not only this, but through television and telephony we shall see and hear one another as perfectly as though we were face to face, despite intervening distances of thousands of miles; and the instruments through which we shall be able to do this will be amazingly simple compared with our present telephone. A man will be able to carry one in his vest pocket."


The Development of RFID

Radio-frequency identification (RFID) uses radio waves to identify and track objects. Connecting an RFID reader to the internet lets us identify, track, and monitor tagged objects globally, automatically, and in real time. The foundation of modern RFID was developed during and after World War II.

In the decades before World War II, radar took a huge step forward as Dr. A. Hoyt Taylor of the U.S. Naval Research Laboratory experimented with high-frequency radio waves at the confluence of the Anacostia and Potomac Rivers. Soon after, the convergence of radar and broadcast technology formed the precursor to RFID. In 1948, Harry Stockman published "Communication by Means of Reflected Power," which many consider the first paper on RFID; it highlighted the possibility of point-to-point communication using reflected radio waves.

During the Second World War, a military application of this technology was developed to identify enemy aircraft: the long-range transponder system known as identification, friend or foe (IFF). The first IFF transponders were introduced after the Battle of Britain, and some experts regard this as the original RFID application.

The 1960s saw the emergence of several companies devoted to RFID technology. Sensormatic and Checkpoint, for example, were founded in the late 60s. In an effort to limit theft, these companies developed a tracking solution we take for granted today: electronic article surveillance (EAS). With tags attached to merchandise, the system sounds an alarm if unpaid items are carried out of the store.

The 1970s delivered an explosion of academic progress on RFID. In 1973, Mario W. Cardullo filed a patent for a rewritable active RFID tag, and in the same year Charles Walton filed a patent for a passive RFID tag. The term itself came later: Walton filed the first patent to use the word "RFID" in 1983.

It wasn't until the 1980s, however, that RFID began to be used more widely. During this time, RFID was used in transportation and logistics to track packages and containers, and it was also used in manufacturing to track parts and components. Since then, RFID has become increasingly prevalent, and it is now used in a wide range of applications, including supply chain management, inventory control, asset tracking, and even in consumer products like credit cards and passports.

The Arpanet

Electronic networking began with the Advanced Research Projects Agency Network (ARPANET), a computer networking project of the United States Department of Defense's ARPA (aka DARPA). The ideas for the first network of interconnected nodes were developed in 1966 by J.C.R. Licklider and Bob Taylor. They integrated their ideas with those of Donald Davies on packet switching to formulate a plan for computers to exchange data.

The first computers of this network were connected in 1969 at UCLA and the Stanford Research Institute. Soon other nodes were added at the University of California, Santa Barbara, and the University of Utah. ARPANET was declared operational in 1975 and expanded in the early eighties. It was decommissioned in 1990, after partnerships among telecom and tech companies led to the commercial development of the internet.

The First Connected Soda Machine

In 1982, students at Carnegie Mellon University connected the first soda machine to the ARPANET. This experiment inspired a wave of inventors to embed internet connectivity into their own products.

The Coke machine held six columns of bottles, each with an indicator light that turned red when the column was empty. The students installed a board that tracked the status of each light and connected it to the ARPANET, which then linked fewer than 300 computers. The system also tracked how many minutes each column's bottles had been in the machine since restocking. Anyone on the ARPANET could access the machine's data, saving students a long walk when the machine was empty.
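The machine's reporting logic was simple enough to sketch in a few lines. Below is a minimal, hypothetical model (the names `CokeMachine`, `restock`, and `status` are illustrative, not the original code) of what a remote user could query: each column's empty flag and the minutes since its last restock.

```python
import time
from dataclasses import dataclass, field


@dataclass
class CokeMachine:
    """Toy model of the CMU machine: six columns, each with an
    empty-light flag and a timestamp of its last restock."""
    restocked_at: list = field(default_factory=lambda: [time.time()] * 6)
    empty: list = field(default_factory=lambda: [False] * 6)

    def restock(self, column: int) -> None:
        # Refill a column: clear its empty light and reset its clock.
        self.restocked_at[column] = time.time()
        self.empty[column] = False

    def status(self) -> list:
        # Per-column view a remote ARPANET user would see: emptiness,
        # plus minutes since restock (a rough proxy for coldness).
        now = time.time()
        return [
            {"column": i,
             "empty": self.empty[i],
             "minutes_stocked": int((now - self.restocked_at[i]) // 60)}
            for i in range(6)
        ]
```

A caller would simply poll `status()` over the network before deciding whether the walk was worth it.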

Launch of the TCP/IP Protocol

The adoption of the TCP/IP protocol suite in 1983 established an ARPANET standard that still underpins the internet today. The previous year, the Department of Defense had declared it a military standard. This set of standards defines how data is packetized, transmitted, and received, and it is maintained by the Internet Engineering Task Force (IETF).

TCP/IP was considered far more powerful and flexible than the earlier system used to route data through the ARPANET, the Network Control Program (NCP). The transition marked an important shift toward what became known as the internet. The two main developers of the new system were Robert E. Kahn and Vinton Cerf, who had also been involved with NCP.

The new system breaks data down into packets that are sent through routers and reassembled at the recipient's end. TCP segments the data and tags each segment with source and destination port numbers; the segments are then handed to the IP layer, which adds source and destination IP addresses and routes each packet independently.
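The segment-and-reassemble idea can be illustrated with a greatly simplified sketch (real TCP uses byte-offset sequence numbers, checksums, acknowledgments, and retransmission; the functions below are illustrative only):

```python
import random


def packetize(payload: bytes, size: int = 8) -> list:
    """Split a payload into sequence-numbered chunks,
    mimicking TCP-style segmentation."""
    return [(seq, payload[i:i + size])
            for seq, i in enumerate(range(0, len(payload), size))]


def reassemble(packets: list) -> bytes:
    """Reorder chunks by sequence number and rebuild the payload,
    as the receiving stack does."""
    return b"".join(chunk for _, chunk in sorted(packets))


message = b"Packets may arrive out of order."
packets = packetize(message)
random.shuffle(packets)          # routers may deliver packets in any order
assert reassemble(packets) == message
```

The sequence numbers are what let the receiver restore order no matter which paths the individual packets took.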

Autonomous Navigation

An early autonomous navigation project, NavLab, began at Carnegie Mellon University (CMU) in 1984. The goal was to create a machine that navigated autonomously using computer vision. It was funded by DARPA, which aimed to build an Autonomous Land Vehicle (ALV).

At the time, the state of the art was indoor robots tethered to cables. The first self-contained vehicle was NavLab 1, a Chevy van converted in 1986 at a cost of about $1 million. The follow-up model, designed as an army ambulance, reached a speed of 70 mph on rough terrain. These early models lacked the sensor sensitivity and other features of modern vehicles. Several engineers from these early projects later moved to Google and other companies working on self-driving cars. The NavLab project continues at CMU to this day; the latest vehicle is NavLab 11.

Rise of the World Wide Web

Tim Berners-Lee at CERN began to envision the World Wide Web in 1989, building it on top of the internet's existing packet-switched infrastructure. He opened the system to the public in 1991. At MIT in 1994, he went on to found the World Wide Web Consortium (W3C), now the main international standards organization for the web.

The W3C was supported by DARPA and the European Commission. Within a few years of inception, the W3C expanded to offices around the world. Today, the W3C partners with hundreds of organizations to develop protocols and guidelines for the web, including HTML updates.

The popularity of the web exploded in the mid-nineties, fueling the dotcom boom. Soon every business needed a website and needed to communicate electronically through email. These changes alone were significant, even if businesses overpaid web designers for simple pages, and they marked the beginning of online storefronts such as Amazon.

Early IoT Devices

The first IoT device is often considered to be a remote-controlled toaster built by John Romkey in 1990; connected via TCP/IP, it could be turned on and off over the internet. Within a few years, webcams began to appear, such as the coffee-monitoring Trojan Room coffee pot camera at the University of Cambridge in 1993.

The concept of smart home devices began to surface in 2005 with the Nabaztag, an early virtual assistant with an electronic voice that told homeowners about stocks and the weather. Many tech historians consider 2008 the beginning of the modern IoT era: according to Cisco IBSG, that is when the number of connected devices exceeded the number of people on the planet.

The term "The Internet of Things" began to appear on Gartner's emerging technology lists in 2011. Starting in 2013, thermostats and home lighting began to incorporate IoT sensors, allowing homeowners to control temperature and lights from their smartphones.

Introduction of Wearables

Some people might credit an abacus ring worn by Queen Elizabeth I of England in the 16th century as the first wearable. But as far as electronic devices go, Steve Mann designed a backpack-mounted computer in 1981, and in 1994 he demonstrated a wearable camera that uploaded images to the internet, ushering in the wearable webcam era.

As the internet grew in popularity in the nineties, other connected devices began to appear on the market. In 1996, DARPA hosted a "Wearables in 2005" workshop to give a peek into the future. Then in 1997, Carnegie Mellon University, MIT, and Georgia Tech jointly hosted the IEEE International Symposium on Wearable Computers (ISWC), unveiling papers on new sensors and hardware capable of connecting online. Mann introduced what many consider the first smartwatch in 1998.

Wearables have become widespread in healthcare and have helped save many lives. The market grew so large for IoT designers that by 2018 the term "Internet of Medical Things" (IoMT) had emerged. This class includes devices that monitor heart rate and blood glucose levels; it is a fast-growing market, projected to reach $176 billion by 2026.

Coining of the Term "IoT"

The term "Internet of Things" was coined in 1999 by Kevin Ashton, who became executive director of the Auto-ID Center at MIT. He used the term in a presentation to Procter & Gamble, combining the ideas of RFID and the internet. His development of a network of academic research labs led to the RFID-based identification system called the Electronic Product Code (EPC).

The term "IoT" started gaining media attention in 2004, but its popularity took about a decade to blossom. Usage began to pick up around 2010, alongside coverage of services like Google's StreetView, but mass awareness didn't arrive until 2014, when Google announced it was acquiring Nest for $3.2 billion.

Another sign IoT was going mainstream was its inclusion at the annual Las Vegas-based Consumer Electronics Show (CES) in 2014. As knowledge of the term grew, related terms such as "Industrial Internet of Things" (IIoT) gained widespread use in tech publications. People now commonly use "IoT" when referring to smart energy meters, home automation, and wearable devices. Any physical object embedded with sensors and linked through wireless networks is considered an IoT device.

Internet-Enabled Appliances

After the release of Bluetooth 1.0 in 1999, the new century opened with an explosion of internet-enabled appliances. In 2000, LG announced the world's first internet-enabled refrigerator, the Internet Digital DIOS. It wasn't a big hit, largely because of its high price, but it's remembered as the beginning of internet-enabled appliances.

Soon many other internet-connected appliances appeared on the market. In 2001, the Sony Ericsson T36 became the first mobile phone with integrated Bluetooth. As the press began using the term "IoT" more in 2004, depicting warehouses full of hundreds of wireless sensors, interest in and demand for IoT devices began to grow. During this time, awareness of RFID also escalated.

IoT is Born

In 2008, the IoT boom began. By 2009, Bluetooth 3.0 allowed high-speed transfers over Wi-Fi, and Google was testing self-driving cars. In 2010, Nest, a company founded by former Apple engineers, began making smart home devices. Its first product was a thermostat that learned user habits to build a heating schedule. Today Google Nest makes smart speakers, smoke detectors, security systems, and various other smart devices.

Using the things/people ratio as a metric, and defining the start of the Internet of things as "simply the point in time when more 'things or objects' were connected to the Internet than people", Cisco Systems has estimated that the IoT was "born" between 2008 and 2009, with the things/people ratio growing from 0.08 in 2003 to 1.84 in 2010.


The Rise of LPWAN

Without LPWAN, the Internet of Things wouldn't be what it is today. A low-power wide-area network (LPWAN) is a type of wireless network designed to provide low-bandwidth, low-power, long-range connectivity for IoT devices and other wireless sensors. LPWAN technologies are optimized for energy efficiency and can support devices that must run on battery power for long periods. They are often used where it is difficult or impractical to supply power, or where devices must be deployed in remote locations. Examples of LPWAN technologies include LoRaWAN, Sigfox, LTE-M, and NB-IoT.

The history of LPWAN goes back to the late 1980s and early 1990s, although such networks were not yet called LPWANs. ADEMCO built AlarmNet, a 900 MHz network for monitoring alarm panels, designed for low data rates. Similarly, Motorola built ARDIS, a low-speed wireless WAN.

Two decades later, interest in LPWAN technologies re-emerged when Sigfox, founded in 2009, built the first modern LPWA network in France. By then, radio technology had become less expensive and the tools for integrating applications much easier to use. As new toolsets and platforms emerged, it became easier to build applications around data from remote devices.

So while much recent technology looks similar to the sensor-driven networks of the past, one major difference is the newly available online integration. That integration enables real-time monitoring insights, which has propelled LPWANs forward.

The Emergence of Satellite IoT

While satellites have been used for asset tracking and remote telemetry applications since the late 1970s, satellite connectivity for machine-to-machine and IoT applications was long considered a last-resort alternative to terrestrial networks, mostly because of its higher cost. Recently, however, there has been renewed interest in satellite-based IoT connectivity.

Satellite IoT is used in a variety of applications, including remote monitoring, asset tracking, and emergency communications. It is often used in remote or rural areas where it is difficult to provide terrestrial connectivity, or in situations where terrestrial networks are unavailable due to natural disasters or other disruptions. The renewed interest in satellite IoT is driven by the growing demand for connectivity and communication in these remote or hard-to-reach locations, as well as the increasing adoption of IoT technology in all industries.

In addition, the development of new satellite technologies and the decreasing cost of satellite hardware and services have also contributed to the growth of satellite IoT. These developments have made it more feasible and cost-effective to use satellite technology for IoT applications, which has helped drive the adoption of satellite IoT.

In the past four years, we’ve seen the launch of various satellite-based low Earth orbit (LEO) networks. Almost two dozen start-ups and existing satellite operators have announced plans to deploy or have commercially launched satellite IoT networks that promise to deliver low-power, low-cost connectivity to IoT devices directly from space.

Unlike terrestrial networks that cover only about one-fifth of the planet’s surface, satellite networks can provide coverage to nearly all of the Earth’s surface and help accommodate the increasing IoT connectivity needs in isolated or inaccessible areas.

Hybrid connectivity, which lets IoT devices use cellular as their primary link and switch to satellite when they move beyond terrestrial coverage, will take satellite IoT to the next level. It is made possible by the emergence of single RF chipsets that support both satcom and cellular bands.
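The cellular-first, satellite-fallback policy amounts to a simple decision rule. Here is a minimal sketch under assumed inputs (the function name `choose_link` and the signal-strength threshold `rssi_floor` are illustrative; real devices also weigh cost, latency, and power budgets):

```python
def choose_link(cellular_rssi, satellite_visible, rssi_floor=-110):
    """Prefer cellular when its received signal strength (dBm) is
    usable; otherwise fall back to a satellite link if one is in
    view. Returns None when no link is available."""
    if cellular_rssi is not None and cellular_rssi >= rssi_floor:
        return "cellular"
    if satellite_visible:
        return "satellite"
    return None


# A tracker in a city picks cellular; the same tracker at sea,
# with no cell signal but a satellite overhead, switches links.
assert choose_link(-95, satellite_visible=True) == "cellular"
assert choose_link(None, satellite_visible=True) == "satellite"
```

A dual-band chipset makes this switch transparent to the application: the device keeps the same logical connection while the radio layer changes.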


These key milestones in IoT technology helped chart the course of business networking technology through today. The more businesses want to learn how to make their operations more efficient, the more demand for IoT will increase.


