Machine Learning on The Edge


Thursday 15th


15:20 – 16:00


Theatre 19


Keywords defining the session:

- Machine Learning

- IoT

- Edge computing

Takeaway points of the session:

- Understand the power of intelligent cloud plus intelligent edge working together.

- Understand how machine learning models are trained in the cloud but run on-premises, at the edge.


In a few years, the world will be filled with billions of small, connected, intelligent devices. Many of these devices will be embedded in our factories, our cities, our vehicles, and even our homes. The proliferation of small computing devices will disrupt every industrial sector and play a key role in the next evolution of computing.
Most of these devices will be small and mobile. Many of them will have limited memories and weak processors. Almost all of them will use a variety of sensors to monitor their surroundings and interact with their users. Most importantly, many of these devices will rely on machine-learned models to interpret the signals from their sensors, to make accurate inferences and predictions about their environment, and, ultimately, to make intelligent decisions. Offloading this intelligence to the cloud is often impractical, due to latency, bandwidth, privacy, reliability, and connectivity issues. Therefore, we need to execute a significant portion of the intelligent pipeline on the edge devices themselves.
Modern state-of-the-art machine learning techniques are not always the best fit for execution on small, resource-impoverished devices. Today’s machine learning algorithms are designed to run on powerful servers, which are often accelerated with special GPU and FPGA hardware.
Edge intelligence as found in embedded devices is typically supplemented with additional intelligence in the cloud. This makes it important to develop algorithms that dynamically decide when to invoke the intelligence in the cloud, and how to arbitrate between predictions and inferences made in the cloud and those made on the device.
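One common way to realize this kind of arbitration (a minimal sketch, not the specific algorithm presented in the session) is confidence-based fallback: run the small on-device model first, and invoke the cloud model only when the edge model's confidence falls below a threshold. The threshold, the stand-in models, and the function names below are all illustrative assumptions.

```python
import math

# Hypothetical tuning parameter; in practice set per deployment,
# trading off latency and bandwidth against accuracy.
CONFIDENCE_THRESHOLD = 0.8

def softmax(scores):
    """Convert raw scores into a probability distribution."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def edge_model(features):
    # Stand-in for a small, quantized on-device model returning class scores.
    return [sum(features), 1.0]

def cloud_model(features):
    # Stand-in for a large server-side model; in practice a network call.
    return ("cloud_label", 0.99)

def predict(features):
    """Confidence-based arbitration: prefer the edge model, fall back to cloud."""
    probs = softmax(edge_model(features))
    confidence = max(probs)
    if confidence >= CONFIDENCE_THRESHOLD:
        label = probs.index(confidence)
        return ("edge", label, confidence)
    # Edge model is unsure: defer to the cloud (stand-in here).
    label, confidence = cloud_model(features)
    return ("cloud", label, confidence)
```

A clear-cut input is answered locally, while an ambiguous one is escalated, so the expensive round trip to the cloud is paid only when the device genuinely needs help.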
Another aspect of the challenge has to do with making algorithms accessible to non-experts. Most of the intelligent devices of the future will be invented by innovators who don't have formal training in machine learning or statistical signal processing. These innovators are part of an exponentially growing group of entrepreneurs, tech enthusiasts, hobbyists, tinkerers, and makers. Developing world-class edge intelligence algorithms is only half the battle; making these algorithms accessible and usable by their intended audience is the other half.
Additionally, in this session we will explain the rationale for running machine learning models that are created and trained in the cloud as close as possible to where the data is generated.
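The train-in-cloud, run-at-edge split can be sketched as follows (a toy illustration, assuming a one-variable linear model and a JSON export format; real pipelines would typically export to an exchange format such as ONNX or a vendor-specific toolchain): the heavy fitting step happens server-side, only the learned parameters are shipped to the device, and on-device inference is a cheap local computation.

```python
import json

def train_in_cloud(xs, ys):
    """Fit y = w*x + b by ordinary least squares (the heavy, cloud-side step)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - w * mean_x
    return {"w": w, "b": b}

def export_model(model):
    # Ship only the parameters to the device, not the training pipeline.
    return json.dumps(model)

def edge_inference(packed_model, x):
    """Cheap on-device step: deserialize the parameters, predict locally."""
    m = json.loads(packed_model)
    return m["w"] * x + m["b"]

# Cloud side: fit on the full dataset, then export.
packed = export_model(train_in_cloud([0, 1, 2, 3], [1, 3, 5, 7]))

# Edge side: no training data, no training code, just the parameters.
prediction = edge_inference(packed, 10)  # -> 21.0 for this toy data
```

The key point of the design is what crosses the boundary: a few bytes of parameters flow down to the device, raw sensor data never has to flow up, which is exactly the latency, bandwidth, and privacy argument made above.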
Come to this session to understand the power of Machine Learning on the edge.