We live in a world where even day-to-day activities like checking the weather, scrolling through social media, or taking photos depend on machine learning (ML) models. Traditionally, these models run in the cloud, where they manage and process data gathered from Internet of Things (IoT) devices connected to the network.
However, this traditional cloud-based ML and IoT ecosystem has its own drawbacks and problems, which must be resolved to transform our world into a better-connected place. Let’s take a look at the challenges posed by traditional cloud-based ML models and how ‘tiny’ machine learning (TinyML) can help resolve them.
Data Privacy and Security
The sensitive data collected and transmitted by IoT devices over a cloud network for further processing is prone to cyberattacks and other privacy and security issues. According to IBM’s 2020 Cost of a Data Breach report, the average cost of a data breach is estimated at $3.86 million.
Energy Consumption
The enormous size of machine learning models requires a tremendous amount of energy for training. For instance, the GPT-3 model released in May 2020 boasts a network architecture packed with 175 billion parameters, roughly twice the number of neurons in the human brain. Training this model reportedly cost around $10 million and consumed nearly 3 GWh of electricity. In 2019, a research team at the University of Massachusetts estimated that training a single ML model could generate up to 626,155 pounds of CO2 emissions, approximately the carbon emissions of five cars over their lifetimes.
Latency
When a user instructs an IoT device, the device must transmit that instruction to the server, and after processing, the server must send the result back to the device. Latency is the time lag in sending and receiving data between an IoT device and the network. On slower networks, latency increases, which is undesirable since an IoT device aims to engage users with high efficiency and speed.
A multi-billion-dollar opportunity lies in deploying machine learning applications at the edge. Grandview Research estimates that the edge computing market will be worth $61.14 billion by 2028, growing at a compound annual growth rate (CAGR) of 38.4%.
What is TinyML?
Tiny machine learning, or simply TinyML, is a rapidly growing field at the intersection of machine learning and embedded systems, in which ML models ‘tiny’ enough to operate at the edge are produced and deployed. In other words, TinyML focuses on implementing ML models on small, low-power devices like microcontrollers and microprocessors. As a result, it enables low-latency, low-power, and low-bandwidth model inference on edge devices.
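One common technique for making a model ‘tiny’ enough for a microcontroller is post-training quantization: storing each weight as a 1-byte integer plus a shared scale factor instead of a 4-byte float. The sketch below is a minimal, pure-Python illustration of symmetric 8-bit quantization with made-up toy weights; real TinyML toolchains apply far more sophisticated versions of this idea.

```python
def quantize_int8(weights):
    """Symmetric 8-bit quantization: map each float weight to an
    integer in [-127, 127] plus one shared scale factor. Storing
    1-byte integers instead of 4-byte floats shrinks a model ~4x."""
    scale = max(abs(w) for w in weights) / 127
    return [round(w / scale) for w in weights], scale

def dequantize(quantized, scale):
    """Recover approximate float weights from the integers."""
    return [q * scale for q in quantized]

weights = [0.82, -0.4, 0.05, -1.27, 0.63]   # toy layer weights
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

print(q)      # small integers, each storable in a single byte
print(max(abs(a - b) for a, b in zip(weights, restored)))  # small error
```

The reconstruction error is bounded by half the scale factor, which is why 8-bit models usually lose very little accuracy while needing a quarter of the storage.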
How TinyML Redefines Networking
The development of machine learning models has long been associated with larger, more complex neural networks and faster processing speeds, which drive up power consumption and carbon footprints.
Let’s explore how TinyML can redefine networking, data management, and data processing without compromising its accuracy or reliability.
Better energy efficiency
Generally, a standard central processing unit (CPU) consumes between 65 and 85 watts, and a standard graphics processing unit (GPU) draws 200 to 500 watts. TinyML, by contrast, runs on microcontrollers or microprocessors that consume milliwatts or even microwatts, orders of magnitude less power than CPUs and GPUs. As a result, TinyML devices running ML applications at the edge can operate on batteries for weeks or even months.
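The weeks-to-months battery claim is easy to sanity-check with back-of-envelope arithmetic. The figures below (battery capacity, average power draw) are illustrative assumptions, not measurements of any particular device.

```python
# Back-of-envelope battery life for an always-on TinyML node.
# All figures are illustrative assumptions, not measurements.

BATTERY_MAH = 2500      # roughly two AA cells
BATTERY_VOLTS = 3.0
ENERGY_WH = BATTERY_MAH / 1000 * BATTERY_VOLTS  # 7.5 Wh

def runtime_days(avg_power_mw):
    """Days of operation at a given average power draw."""
    hours = ENERGY_WH * 1000 / avg_power_mw
    return hours / 24

print(f"5 mW microcontroller: {runtime_days(5):.1f} days")       # ~2 months
print(f"65 W (65,000 mW) CPU: {runtime_days(65_000):.4f} days")  # minutes
```

At a milliwatt-scale average draw, the same battery that would power a desktop CPU for minutes lasts a couple of months, which is what makes always-on edge inference practical.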
This better energy efficiency of TinyML devices also brings down carbon emissions, a byproduct of the traditional ML process.
Reduced latency
As mentioned earlier, latency is the time lag in sending and receiving data between an IoT device and the network. If the IoT device has its own data processing power, it doesn’t need to send every instruction to the cloud for data management, processing, and execution. As the device’s dependency on the network decreases, so does the overall latency.
Improved data privacy and security
When data transmission, management, and processing happen over a network, the data’s level of vulnerability increases. TinyML gives IoT devices their own data processing and management capabilities, which greatly reduces this vulnerability: sensitive data is managed and processed on the device rather than in the cloud. Securing an individual IoT device is also much easier and cheaper than providing security features for the whole network. Therefore, TinyML improves data privacy and wards off many cybersecurity issues.
Efficient data collection
Networking, data management, data processing, and data analytics systems tend to generate an enormous amount of data, much of which is redundant. IoT devices empowered by TinyML technology can be programmed to collect only the relevant data.
For instance, consider a CCTV camera installed inside an 11-story building. If the camera records and stores footage 24 hours a day, you can imagine the storage space required on the server, even if the footage is automatically deleted periodically.
A TinyML-enabled CCTV camera records footage only when it detects an unusual event or movement inside the building. This greatly reduces the amount of data generated and brings down the requirement for data storage and processing.
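The event-triggered idea can be sketched in a few lines: keep a frame only when it differs enough from the previous one. This is a toy illustration with frames modeled as flat lists of pixel values; a real TinyML camera would run a small on-device model rather than simple frame differencing.

```python
def motion_score(prev, curr):
    """Mean absolute pixel difference between two frames."""
    return sum(abs(a - b) for a, b in zip(prev, curr)) / len(curr)

def record_on_motion(frames, threshold=10):
    """Keep only frames whose change from the previous frame exceeds
    the threshold -- the event-triggered idea in miniature."""
    kept = []
    prev = frames[0]
    for curr in frames[1:]:
        if motion_score(prev, curr) > threshold:
            kept.append(curr)
        prev = curr
    return kept

# A mostly static scene (identical frames) with one burst of movement
static = [0] * 16
moving = [50] * 16
footage = [static, static, moving, static, static]
print(len(record_on_motion(footage)))  # 2 of 5 frames kept
```

Only the frames around the movement are stored, so storage and upstream bandwidth scale with events rather than with elapsed time.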
The Road Ahead for TinyML
TinyML is one of the major technological advancements in artificial intelligence (AI) and machine learning, acting as the bridge between edge computing and smart IoT devices and promising to make them faster, more efficient, and more affordable. In the future, TinyML will open up greater possibilities for our day-to-day consumer devices, such as TVs, cars, washing machines, and watches, to name a few, giving them intelligent functionalities that today are reserved for computers and smartphones. Moreover, it can also help reduce the financial, environmental, and security issues associated with traditional ML models.
In March 2019, the inaugural TinyML Summit, conducted by the TinyML Foundation, revealed strong interest from the community, with active participation from senior management and technical experts at 90 companies.
The Summit revealed a few promising facts:
- First, TinyML-enabled devices are becoming a crucial element of many futuristic commercial applications, and the rise of new data architectures is imminent.
- Second, significant technological progress has been made on algorithms, networks, and models that need only 100 kB of storage or less.
- Third, low-power TinyML applications are on their way to dominating the audio-visual sphere.
TinyML’s technical progress and ecosystem development show growing momentum. Tech giants such as Qualcomm, ARM, Google, and Microsoft are working to bring AI inference to the very edge of the network and implement it on sensors.