Are you on an internal ML platform team? We'd love to know if self-service (for data scientists), multi-tenant model -> container image builds are a pain point for you.<p>I was lucky enough to be involved in the creation of <a href="https://chassis.ml/" rel="nofollow">https://chassis.ml/</a> - which we're open-sourcing today! Basically, it solves the problem of "how do you get your MLflow models into KFServing" - and in the process we solved some interesting problems around doing rootless image builds in a multi-tenant K8s cluster.<p>In other words, Chassis is an API server that muxes between MLflow models, k8s Jobs + kaniko, and the KFServing + Modzy APIs, with a Python SDK on top for data scientists to drive it.<p>Demo: <a href="https://www.youtube.com/watch?v=d_8OIfQOa3I" rel="nofollow">https://www.youtube.com/watch?v=d_8OIfQOa3I</a><p>Test drive! <a href="https://chassis.ml/#test-drive" rel="nofollow">https://chassis.ml/#test-drive</a><p>More broadly, we're working on a standard for making model serving more portable, so you can "build once, run many", i.e. build container images that run on a variety of different platforms, to avoid lock-in: <a href="https://openmodel.ml/" rel="nofollow">https://openmodel.ml/</a>
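<p>To make the "k8s Jobs + kaniko" part concrete, here's a rough sketch of the kind of Job manifest such an API server might generate per build request. This is a hypothetical illustration, not actual Chassis internals; the function name and manifest details are assumptions, though the kaniko executor image and its `--context`/`--destination` flags are real:

```python
# Hypothetical sketch: turn a "build this model into an image" request
# into a Kubernetes Job manifest that runs kaniko. kaniko builds images
# entirely in userspace (no privileged Docker daemon), which is what
# makes rootless builds in a multi-tenant cluster feasible.

def kaniko_build_job(model_name: str, context_uri: str, image_dest: str) -> dict:
    """Return a k8s Job manifest that builds and pushes an image with kaniko."""
    return {
        "apiVersion": "batch/v1",
        "kind": "Job",
        "metadata": {"name": f"build-{model_name}"},
        "spec": {
            "template": {
                "spec": {
                    "restartPolicy": "Never",
                    "containers": [{
                        "name": "kaniko",
                        "image": "gcr.io/kaniko-project/executor:latest",
                        "args": [
                            # build context: packaged model + generated Dockerfile
                            f"--context={context_uri}",
                            # registry tag the finished image is pushed to
                            f"--destination={image_dest}",
                        ],
                    }],
                }
            }
        },
    }

job = kaniko_build_job(
    "sklearn-digits",
    "s3://models/sklearn-digits.tar.gz",
    "registry.example.com/models/sklearn-digits:v1",
)
print(job["metadata"]["name"])  # build-sklearn-digits
```

The API server would submit a manifest like this via the Kubernetes API and watch the Job for completion, with one Job per tenant build request so builds stay isolated from each other.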