TALK

Theory and Practice of Distributed Training with TensorFlow

Wednesday 14th

12:35 – 13:15

Theatre 20

Technology

Keywords defining the session:

- TensorFlow

- GCP

- Distributed Training

Takeaway points of the session:

- PFA helps solve a major pain point in taking machine learning to production, particularly within the Apache Spark community.

- Open standards enable true model portability across languages, frameworks and runtimes.

Description:

Because TensorFlow uses model distribution as its default distribution option, the entry barrier to getting started with distributed training is high. Our talk covers the main pain points and the knowledge gaps that deep learning practitioners usually have. After the talk, practitioners should understand why TensorFlow tends to use model distribution, how to switch from model distribution to data distribution, and what the potential implications of such a change are.
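As a rough illustration of the data-distribution side of the talk, the sketch below shows how data-parallel training is commonly expressed in TensorFlow 2.x with `tf.distribute.MirroredStrategy`. This is an assumed, minimal example (the model, data, and hyperparameters are placeholders), not the speakers' own code.

```python
import numpy as np
import tensorflow as tf

# MirroredStrategy replicates the model on each local GPU and keeps the
# replicas in sync by all-reducing gradients after every step
# (data distribution, as opposed to splitting the model across devices).
strategy = tf.distribute.MirroredStrategy()

with strategy.scope():
    # Variables created inside the scope are mirrored across replicas.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(
        optimizer="adam",
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )

# Dummy data standing in for a real input pipeline.
x = np.random.rand(1024, 784).astype("float32")
y = np.random.randint(0, 10, size=(1024,))

# Each global batch is split across the replicas, so the batch size is
# usually scaled with the number of devices.
model.fit(x, y, batch_size=64, epochs=1)
```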

MEDIA

Keynote