
Serverless inferencing on Kubernetes

2020-07-14

Clive Cox, Dan Sun, Ellis Tarn, Animesh Singh, Rakesh Kelkar, David Goodwin


Abstract

Organisations are increasingly putting machine learning models into production at scale. The growing popularity of serverless, scale-to-zero paradigms presents an opportunity for deploying machine learning models while mitigating infrastructure costs when many models are not in continuous use. We discuss the KFServing project, which builds on the Knative serverless paradigm to provide a serverless machine learning inference solution with a consistent, simple interface for data scientists to deploy their models. We show how it solves the challenges of autoscaling GPU-based inference and discuss some of the lessons learnt from using it in production.
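To illustrate the kind of consistent, simple interface the abstract describes, the sketch below shows a minimal KFServing `InferenceService` manifest as documented in the project's samples around this paper's publication date. The model name and `storageUri` are illustrative placeholders, and the exact API group and fields depend on the KFServing version in use.

```yaml
# Minimal sketch of a KFServing InferenceService (v1alpha2-era API).
# A data scientist supplies only the model framework and its storage
# location; KFServing and Knative handle serving, routing, and
# scale-to-zero autoscaling.
apiVersion: serving.kubeflow.org/v1alpha2
kind: InferenceService
metadata:
  name: sklearn-iris        # illustrative name
spec:
  default:
    predictor:
      sklearn:
        # illustrative bucket path; point at your own trained model
        storageUri: gs://kfserving-samples/models/sklearn/iris
```

Applied with `kubectl apply -f`, this creates a versioned, autoscaled inference endpoint without the data scientist writing any serving code, which is the deployment workflow the paper advocates.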
