SCHEDULE - TALK DETAIL


Keynote | Technical

Shipping ML to production: From Data Scientist to End User

Thursday 16th | 17:20 - 17:50 | Theatre 19


Keywords defining the session:

- Machine Learning

- Apache Spark

- Big Data

It has been a long time since the Turing test laid the first foundations of the philosophy of Artificial Intelligence (AI). Only recently, however, has AI reached the mainstream, with applications across industrial and consumer markets. This talk looks at specific techniques and architectures that reduce the time and complexity of iterating on Machine Learning models that deliver value to end users from Big Data.

Because storing data is now cheap, companies tend to accumulate huge volumes of it. Training Machine Learning models on that data to serve clients and the general public is therefore expensive: it takes time and consumes considerable resources. To keep training time and cost under control, companies need a distributed environment and speed-up techniques, for example a distributed processing framework such as Apache Spark.

Once a model has been validated, the requirements for bringing it to production are different. The resources needed to serve predictions no longer depend on the model itself, which has already been trained, but on the company's business needs and the number of users who must receive results. A more appropriate architecture for this stage looks more like a microservice than a distributed processing framework. The two stages therefore call for different solutions: Apache Spark for training, and a microservice for serving. This separation lets companies iterate over models easily and rapidly to keep them up to date.

Keeping models up to date raises a further challenge: as new data arrives continuously, how can the model in production be updated to cover both the historic and the newly gathered data? Training consumes a lot of resources and takes a long time because of the volume of data to be processed, so it cannot be relaunched from scratch every time the model needs refreshing; different solutions are required. The same applies whenever an already trained model can be reused: for example, a model trained to recognize cars in images can serve as a starting point for recognizing other types of vehicles.
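To make the training stage concrete, the sketch below fits a simple Spark ML pipeline and persists the fitted model. It is only an illustration of the idea, not material from the talk: the data path, the feature columns, and the choice of LogisticRegression are all assumptions.

# Illustrative PySpark training job: fit a pipeline on a large dataset and
# persist the fitted model so a separate serving layer can pick it up.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("model-training").getOrCreate()

# Hypothetical training data with numeric feature columns f1..f3 and a binary label.
df = spark.read.parquet("s3://example-bucket/training-data/")

assembler = VectorAssembler(inputCols=["f1", "f2", "f3"], outputCol="features")
lr = LogisticRegression(featuresCol="features", labelCol="label")
model = Pipeline(stages=[assembler, lr]).fit(df)

# Save the fitted pipeline; the serving side only needs this artifact, not the cluster.
model.write().overwrite().save("s3://example-bucket/models/latest")

spark.stop()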
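For the serving stage, a minimal microservice sketch follows. It assumes the validated model has been exported to a portable artifact that can be loaded without a Spark cluster (here a joblib file); the endpoint name, payload shape, and framework choice (Flask) are illustrative assumptions rather than the speaker's stack.

# Illustrative prediction microservice: load a pre-trained model once at startup
# and expose a single HTTP endpoint that returns predictions.
import joblib
from flask import Flask, jsonify, request

app = Flask(__name__)
model = joblib.load("model.joblib")  # hypothetical portable artifact exported after validation

@app.route("/predict", methods=["POST"])
def predict():
    # Expected JSON body: {"features": [[1.0, 2.0, 3.0], ...]}
    features = request.get_json()["features"]
    predictions = model.predict(features).tolist()
    return jsonify({"predictions": predictions})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)

Scaling this service then becomes a matter of running more replicas behind a load balancer, driven by the number of users rather than by the size of the training data.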
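Finally, as one possible way to fold new data into an existing model without retraining from scratch, the sketch below uses incremental learning with scikit-learn's SGDClassifier. The library, model type, and synthetic data are assumptions chosen for brevity; the talk does not prescribe a specific technique.

# Illustrative incremental update: train once on the historic batch, then keep
# updating the same model as new data arrives, instead of retraining from zero.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)

# Historic batch (synthetic): 1000 rows, 3 features, binary label.
historic_X = rng.random((1000, 3))
historic_y = rng.integers(0, 2, size=1000)

model = SGDClassifier()
model.partial_fit(historic_X, historic_y, classes=np.array([0, 1]))

# Later: a small batch of newly gathered data is folded into the existing model.
new_X = rng.random((100, 3))
new_y = rng.integers(0, 2, size=100)
model.partial_fit(new_X, new_y)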
