Deploying Snorkel-built models
This page walks through how to deploy a Snorkel-built application, export the deployment, and stand it up in an external production environment for inference.
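As a rough illustration of the inference step, an exported deployment packaged in the MLflow format (see MLflow-compatible application deployments below) can be loaded and queried with the standard MLflow Python API. The local path and input column below are placeholders, not values from the tutorial.

```python
import mlflow.pyfunc
import pandas as pd

# Placeholder path to the deployment package exported from Snorkel Flow.
model = mlflow.pyfunc.load_model("/models/snorkel_deployment")

# The input schema depends on your application; a single text column is assumed here.
batch = pd.DataFrame({"text": ["First document to score.", "Second document to score."]})

predictions = model.predict(batch)
print(predictions)
```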
Deploying to AWS SageMaker
This tutorial walks through the three steps required to deploy a Snorkel-built application to AWS SageMaker.
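For orientation, here is a minimal sketch of what the SageMaker side can look like with the sagemaker Python SDK; it is not the tutorial's exact sequence of steps. It assumes the exported package has already been archived to S3, and the container image URI, bucket, and IAM role are placeholders.

```python
import sagemaker
from sagemaker.model import Model
from sagemaker.predictor import Predictor

session = sagemaker.Session()

# Placeholders: supply your own inference image, exported model archive, and IAM role.
model = Model(
    image_uri="<inference-container-image-uri>",
    model_data="s3://<bucket>/snorkel-deployment/model.tar.gz",
    role="arn:aws:iam::<account-id>:role/<sagemaker-execution-role>",
    predictor_cls=Predictor,
    sagemaker_session=session,
)

# Creates the SageMaker model, endpoint configuration, and endpoint.
predictor = model.deploy(initial_instance_count=1, instance_type="ml.m5.xlarge")
```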
Deploying to Azure Machine Learning
This tutorial walks through two options for deploying a Snorkel-built model on Azure Machine Learning (Azure ML).
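One common shape for such a deployment, sketched here with the azure-ai-ml (v2) SDK and not necessarily matching either of the tutorial's options, is to register the exported package as an MLflow model asset and serve it from a managed online endpoint. The subscription, resource group, workspace, and asset names below are placeholders.

```python
from azure.ai.ml import MLClient
from azure.ai.ml.entities import Model, ManagedOnlineEndpoint, ManagedOnlineDeployment
from azure.identity import DefaultAzureCredential

# Placeholders: point the client at your own Azure ML workspace.
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# Register the exported deployment as an MLflow-format model asset.
model = ml_client.models.create_or_update(
    Model(path="snorkel_deployment/", name="snorkel-model", type="mlflow_model")
)

# Create a managed online endpoint, then attach a deployment that serves the model.
ml_client.online_endpoints.begin_create_or_update(
    ManagedOnlineEndpoint(name="snorkel-endpoint")
).result()

ml_client.online_deployments.begin_create_or_update(
    ManagedOnlineDeployment(
        name="blue",
        endpoint_name="snorkel-endpoint",
        model=model,
        instance_type="Standard_DS3_v2",
        instance_count=1,
    )
).result()
```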
Deploying to Databricks
Snorkel Flow supports integrations with the Databricks Workspace Model Registry and Unity Catalog, a unified governance solution for managing data and AI assets.
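As a hedged sketch of the Unity Catalog side of this integration, MLflow's registry APIs can register an MLflow-format deployment under a three-level catalog.schema.model name. The snippet assumes the environment is already authenticated against a Databricks workspace; the local path and model name are placeholders.

```python
import mlflow

# Target Unity Catalog; use "databricks" instead for the legacy
# Workspace Model Registry.
mlflow.set_registry_uri("databricks-uc")

# Placeholder path to an exported MLflow-format Snorkel deployment.
local_package = "/models/snorkel_deployment"

# Upload the package as run artifacts, then register it under a
# three-level Unity Catalog name (<catalog>.<schema>.<model>).
with mlflow.start_run() as run:
    mlflow.log_artifacts(local_package, artifact_path="model")

mlflow.register_model(
    f"runs:/{run.info.run_id}/model",
    "main.snorkel.document_classifier",
)
```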
Deploying to Vertex AI
This tutorial walks through the four steps required to deploy a Snorkel-built application to Vertex AI.
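For orientation, a rough sketch of the Vertex AI side with the google-cloud-aiplatform SDK is shown below; the project, region, bucket, and serving container image are placeholders, and the tutorial covers the exact steps and container choice.

```python
from google.cloud import aiplatform

# Placeholders: your GCP project and region.
aiplatform.init(project="<gcp-project>", location="us-central1")

# Upload the exported deployment artifacts together with a serving container,
# then deploy the resulting model to an online endpoint.
model = aiplatform.Model.upload(
    display_name="snorkel-deployment",
    artifact_uri="gs://<bucket>/snorkel_deployment/",
    serving_container_image_uri="<serving-container-image-uri>",
)

endpoint = model.deploy(machine_type="n1-standard-4")

# The request format depends on the serving container; a text payload is assumed.
prediction = endpoint.predict(instances=[{"text": "Document to score."}])
```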
MLflow-compatible application deployments
Application deployments (see Deploying Snorkel-built models) can be packaged in the MLflowModel format. The MLflowModel package format is compatible with standard MLflow tooling.
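Because the package follows the standard MLflow layout, it can be inspected and served with ordinary MLflow tooling. A small sketch with a placeholder path:

```python
import mlflow.models

# Placeholder path to an exported MLflowModel package.
model_uri = "/models/snorkel_deployment"

# Read the package's MLmodel metadata: available flavors and the
# input/output signature.
info = mlflow.models.get_model_info(model_uri)
print(list(info.flavors))
print(info.signature)

# The same package can also be served locally with the standard MLflow CLI:
#   mlflow models serve -m /models/snorkel_deployment
```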