TensorFlow Serving with Docker for Model Deployment

4.9 stars (24 ratings)
Offered by Coursera Project Network
3,211 already enrolled
In this Guided Project, you will:

Train and export TensorFlow Models for text classification

Serve and deploy models with TensorFlow Serving and Docker

Perform model inference with gRPC and REST endpoints

1.5 hours
Intermediate level
No download needed
Split-screen video
English
Desktop only

This is a hands-on, guided project on deploying deep learning models using TensorFlow Serving with Docker. In this 1.5-hour project, you will train and export TensorFlow models for text classification, learn how to deploy models with TF Serving and Docker in 90 seconds, and build simple gRPC- and REST-based clients in Python for model inference. As organizations worldwide adopt machine learning and AI, it is becoming increasingly important for data scientists and machine learning engineers to know how to deploy models to production. While DevOps teams are great at scaling applications, they are not experts in ML ecosystems such as TensorFlow and PyTorch. This guided project gives learners a solid, real-world foundation for pushing TensorFlow models from development to production in no time!

Prerequisites: To successfully complete this project, you should be familiar with Python and have prior experience building models with Keras or TensorFlow.

Note: This course works best for learners who are based in the North America region. We're currently working on providing the same experience in other regions.
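As a preview of that workflow, here is a minimal sketch of training, exporting, and serving a text classifier. The architecture, the TF Hub embedding module, and names such as amazon_review and the ./models export path are illustrative assumptions, not the exact code used in the project:

import tensorflow as tf
import tensorflow_hub as hub

# Illustrative text classifier: a pretrained TF Hub sentence embedding
# followed by a small dense head (e.g. binary sentiment on review text).
model = tf.keras.Sequential([
    hub.KerasLayer("https://tfhub.dev/google/nnlm-en-dim50/2",
                   input_shape=[], dtype=tf.string, trainable=True),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
# model.fit(train_texts, train_labels, epochs=5)  # training data not shown here

# Export as a versioned SavedModel (protobuf); TF Serving expects
# <base_path>/<version_number>/ on disk.
export_path = "./models/amazon_review/1"
tf.saved_model.save(model, export_path)

# Serve the model with the official TF Serving Docker image (shell command, shown as a comment):
#   docker run -p 8500:8500 -p 8501:8501 \
#     --mount type=bind,source=$(pwd)/models/amazon_review,target=/models/amazon_review \
#     -e MODEL_NAME=amazon_review -t tensorflow/serving

Port 8500 exposes the gRPC endpoint and port 8501 the REST endpoint; dropping a new version directory (e.g. ./models/amazon_review/2) next to the first is how TF Serving picks up a new model version.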

Skills you will gain

Deep Learning, Docker, TensorFlow Serving, TensorFlow, Model Deployment

Learn step by step

In a video that plays alongside your workspace, your instructor will walk you through these steps:

  1. Introduction and Demo Deployment

  2. Load and Preprocess the Amazon Fine Foods Review Data

  3. Build Text Classification Model using Keras and TensorFlow Hub

  4. Define Training Procedure

  5. Train and Export Model as Protobuf

  6. Test Model

  7. TensorFlow Serving with Docker

  8. Set up a REST Client to Perform Model Predictions (client sketches for this step and the next follow the list)

  9. Set up a gRPC Client to Perform Model Predictions

  10. Versioning with TensorFlow Serving
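For reference, the REST and gRPC clients from steps 8 and 9 can be sketched as follows, assuming the server from the earlier sketch is running locally. The model name amazon_review, the input key input_1, and the sample review are illustrative; the real input key depends on the exported model's signature:

import json
import grpc
import requests
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2, prediction_service_pb2_grpc

review = "These cookies were absolutely delicious!"

# REST client: POST a JSON payload to the model's :predict endpoint (port 8501).
rest_url = "http://localhost:8501/v1/models/amazon_review:predict"
response = requests.post(rest_url, data=json.dumps({"instances": [review]}))
print("REST prediction:", response.json()["predictions"])

# gRPC client: build a PredictRequest and call the PredictionService stub (port 8500).
channel = grpc.insecure_channel("localhost:8500")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

request = predict_pb2.PredictRequest()
request.model_spec.name = "amazon_review"
request.model_spec.signature_name = "serving_default"
# "input_1" is an assumed input key; inspect the real one with:
#   saved_model_cli show --dir models/amazon_review/1 --tag_set serve --signature_def serving_default
request.inputs["input_1"].CopyFrom(tf.make_tensor_proto([review], dtype=tf.string))

result = stub.Predict(request, 10.0)  # 10-second timeout
print("gRPC prediction:", result.outputs)

Both clients query the same SavedModel, so the two predictions should agree; gRPC is typically preferred for low-latency serving, while REST is the simpler one to test by hand.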

How Guided Projects work

Your workspace is a cloud desktop right in your browser. No download is required.

In a split-screen video, your instructor provides step-by-step guidance.

Frequently asked questions


Still have questions? Visit the Learner Help Center.