Deploying Gradio with Docker
Gradio is a powerful and intuitive Python library for creating web apps that showcase machine learning models. These apps can be run locally, deployed for free on Hugging Face Spaces, or deployed on your own servers in Docker containers. Dockerizing a Gradio app gives you a consistent, reproducible environment, easy portability across machines, and a straightforward path to scaling.
Let's go through a simple example to understand how to containerize a Gradio app using Docker.
First, we need a simple Gradio app. Let's create a Python file named `app.py` with the following content:
```python
import gradio as gr

def greet(name):
    return f"Hello {name}!"

iface = gr.Interface(fn=greet, inputs="text", outputs="text")
iface.launch()
```
This app creates a simple interface that greets the user by name.
Next, we'll create a Dockerfile to specify how our app should be built and run in a Docker container. Create a file named `Dockerfile` in the same directory as your app with the following content:
```dockerfile
FROM python:3.8-slim
WORKDIR /usr/src/app
COPY . .
RUN pip install --no-cache-dir gradio
EXPOSE 7860
ENV GRADIO_SERVER_NAME="0.0.0.0"
CMD ["python", "app.py"]
```
This Dockerfile performs the following steps:

- Starts from the lightweight `python:3.8-slim` base image.
- Sets the working directory inside the container.
- Copies the app code into the container.
- Installs Gradio (install any other dependencies your app needs here as well).
- Exposes port 7860, Gradio's default port.
- Sets the `GRADIO_SERVER_NAME` environment variable to ensure Gradio listens on all network interfaces.
- Runs `app.py` when the container starts.

With the Dockerfile in place, you can build and run your container:
```bash
docker build -t gradio-app .
docker run -p 7860:7860 gradio-app
```
Your Gradio app should now be accessible at `http://localhost:7860`.
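As an optional sanity check that isn't part of the guide itself, you can call the containerized app from the host with the `gradio_client` package; this sketch assumes `gradio_client` is installed on the host and relies on the default `/predict` endpoint that `gr.Interface` exposes.

```python
# A minimal sketch for verifying the containerized app responds.
# Assumes `pip install gradio_client` on the host and that the
# container from the previous step is running on port 7860.
from gradio_client import Client

client = Client("http://localhost:7860")
# gr.Interface exposes the wrapped function under /predict by default.
result = client.predict("World", api_name="/predict")
print(result)  # expected: "Hello World!"
```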
When running Gradio applications in Docker, there are a few important things to keep in mind:
"0.0.0.0"
and exposing port 7860In the Docker environment, setting GRADIO_SERVER_NAME="0.0.0.0"
as an environment variable (or directly in your Gradio app's launch()
function) is crucial for allowing connections from outside the container. And the EXPOSE 7860
directive in the Dockerfile tells Docker to expose Gradio's default port on the container to enable external access to the Gradio app.
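As a point of reference, here is a minimal sketch of the in-code alternative mentioned above: passing the bind address and port directly to `launch()` instead of relying on the environment variable. The `server_name` and `server_port` parameters are standard `launch()` arguments.

```python
# A sketch of setting the bind address and port directly in launch(),
# instead of relying on the GRADIO_SERVER_NAME environment variable.
import gradio as gr

def greet(name):
    return f"Hello {name}!"

iface = gr.Interface(fn=greet, inputs="text", outputs="text")
# 0.0.0.0 makes the server reachable from outside the container;
# 7860 matches the port exposed in the Dockerfile.
iface.launch(server_name="0.0.0.0", server_port=7860)
```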
Enable stickiness for multiple replicas

When deploying Gradio apps with multiple replicas, such as on AWS ECS, it's important to enable stickiness with `sessionAffinity: ClientIP`. This ensures that all requests from the same user are routed to the same instance, which matters because Gradio's communication protocol requires multiple separate connections from the frontend to the backend for events to be processed correctly. (If you use Terraform, you'll want to add a stickiness block to your target group definition.)
Deploying behind a proxy

If you're deploying your Gradio app behind a proxy, such as Nginx, it's essential to configure the proxy correctly. Gradio provides a guide that walks through the necessary steps. This setup ensures your app is accessible and performs well in production environments.
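As a rough illustration of the Gradio-side piece of that setup, the sketch below uses `launch()`'s `root_path` parameter, which tells Gradio the public path prefix the proxy serves it under. The `/gradio-demo` prefix is a made-up example, and the proxy itself still needs its own configuration as described in that guide.

```python
# A sketch of the Gradio-side setting when a reverse proxy serves the
# app under a path prefix (the proxy itself still needs its own config).
import gradio as gr

def greet(name):
    return f"Hello {name}!"

iface = gr.Interface(fn=greet, inputs="text", outputs="text")
# root_path tells Gradio the public path prefix the proxy uses,
# so generated URLs and routes resolve correctly behind the proxy.
iface.launch(server_name="0.0.0.0", server_port=7860, root_path="/gradio-demo")
```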