How to Share Your Machine Learning Model in 4 Easy Steps

GradioHosted is a place for ML practitioners and developers to share their pretrained models with anyone

Abubakar Abid
4 min read · Feb 22, 2021

If you’ve built a machine learning model that works well, you probably want to share it so that others can try it out. The Gradio library makes it really easy to create a shareable GUI & public link for your model, as long as the model is running on your local computer. But what if you need a long-term hosted solution? That’s what GradioHosted is for!

In this tutorial, I’m going to show you, step by step, how to create and deploy your machine learning model app on GradioHosted.

GradioHosted lets anyone use your model interactively at any time, even if they have no background in machine learning. Even more importantly, GradioHosted lets you avoid having to do any DevOps work, so that you can spend your time building great models.

GradioHosted lets you focus on model building and avoid having to sort out the infrastructure needed to host your machine learning model. Credit: https://xkcd.com/1319/

GradioHosted is used by academic researchers, hobbyists, and developers in industry. You can see the other models that are currently on it, including many from recent academic conferences, and how easy it is to use this density estimation model to count people in crowds:

Pretrained models on www.gradio.app/hub

It’s a quick process to get your model up too, so let’s get started!

The Gradio Piece

If you’re already familiar with the Gradio library (https://github.com/gradio-app/gradio), that’s great! If not, this quick start is a good place to begin. As a refresher, gradio is a Python library that lets you create UIs around your machine learning model by specifying three things: a Python function, an input component, and an output component.
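As a quick example, a complete Gradio app can be as small as this (the greet function here is just a stand-in for a real model's prediction function):

    import gradio as gr

    # The "model": any Python function that maps inputs to outputs.
    def greet(name):
        return "Hello " + name + "!"

    # Wire the function to a text input component and a text output component.
    gr.Interface(fn=greet, inputs="text", outputs="text").launch()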

Step 1: Create a Gradio App File for Your Model

The first step is to create a file that launches the Gradio GUI for your model. Give it a simple name like demo.py. This file should:

  • Load the model
  • Define a prediction function using the model
  • Launch a Gradio interface with the prediction function and appropriate UI components

The Noise2Same repo, which contains a state-of-the-art image denoising model, has a good example of such a file.
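As a rough sketch of that three-part structure (with a trivial stand-in denoiser in place of the actual pretrained Noise2Same network, whose loading code lives in the repo), a demo.py could look something like this:

    # demo.py -- an illustrative sketch, not the actual Noise2Same demo file
    import gradio as gr
    import numpy as np

    # 1. Load the model. A real app would restore the pretrained Noise2Same
    #    weights here; this stand-in just averages each pixel with its neighbors.
    def load_model():
        def denoiser(image):
            blurred = (
                image.astype(np.float32)
                + np.roll(image, 1, axis=0)
                + np.roll(image, -1, axis=0)
                + np.roll(image, 1, axis=1)
                + np.roll(image, -1, axis=1)
            ) / 5.0
            return blurred.astype(np.uint8)
        return denoiser

    model = load_model()

    # 2. Define a prediction function that wraps the model.
    def denoise(image):
        return model(image)

    # 3. Launch a Gradio interface with image input and output components.
    gr.Interface(fn=denoise, inputs="image", outputs="image").launch()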

You should run this file locally (for example, by running python demo.py) to make sure that the Gradio app launches successfully.

If the Gradio app launches successfully, you should see a message similar to this:
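The exact wording varies between Gradio versions, but the console output is along these lines (the port number may differ on your machine):

    Running on local URL: http://127.0.0.1:7860/
    To create a public link, set `share=True` in `launch()`.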

Step 2: Publish your model as a public GitHub repo & include a requirements.txt file

Once your code is working, create a public GitHub repository under your account with all of the necessary files, including the Gradio app file that you just created.

We also need to list all of the Python libraries that the app depends on, along with their versions if necessary, so create a requirements.txt file in the root of your repo (you can generate one with pip freeze).
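The exact packages and pins depend on your model; running pip freeze > requirements.txt in your project's environment captures them. Purely as an illustration (not the Noise2Same repo's actual dependency list), a requirements.txt for an image-denoising app might contain entries like:

    # illustrative only -- list whatever your own model actually imports
    gradio
    numpy
    scikit-image
    tensorflow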

Once you have added these files to your GitHub project, push your updated repo to GitHub and make sure its visibility is set to public.
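If you're working from the command line, pushing the new files is the usual sequence:

    git add demo.py requirements.txt
    git commit -m "Add Gradio demo and dependencies"
    git push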

Step 3: Create a Gradio Premium account

Hosting models permanently requires a Gradio Premium account, which you can sign up for by visiting the Gradio website (www.gradio.app) and clicking “My Account”.

If this is your first time creating an account, you will authenticate with your GitHub credentials. Make sure that you sign in with the same GitHub account that you created the repo under.

You will need to provide your billing information, and then you will see a button that says, “+ Add Repo”. Click on the button to add your repo.

Step 4: Provide your repo information to launch your container

You’ll be presented with a form that looks something like this. Select your repository and branch, and type the name of the Gradio app file. Once you’ve put your information in, you’ll see some messages confirming that the appropriate files have been found in the right places:

When you’re ready, click Launch, and let GradioHosted take it from there!

  • Behind the scenes, GradioHosted clones your repository in a Python3 container and installs all of your dependencies
  • It then launches the Gradio app from your app file and creates a public URL where your model will be accessible
  • Finally, your latest model is added to the list of models that you currently have running, and you have the option to make it visible on GradioHosted for the wider machine learning community
  • From the My Account page, you can also edit the title and profile picture under which your model appears on the GradioHosted page, giving you full control over how your model is presented

If there are any errors in the deployment process, you’ll receive an email explaining what went wrong, so you can address the issue and try again.

And that’s it! If you’ve made it this far, you’ve found a great hosted solution for your machine learning model. GradioHosted will host your model with a GUI, allowing anyone anywhere in the world to use it, regardless of technical background. Try it out with your favorite model and let us know what you think!

If you have any questions, feel free to reach out to us on Twitter @GradioML.
