Run a model from Python
Learn how to run a model on Replicate from within your Python code. It could be an app, a notebook, an evaluation script, or anywhere else you want to use machine learning.
👋 Check out an interactive version of this tutorial on Google Colab.
Install the Python library
We maintain an open-source Python client for the API. Install it with pip:
pip install replicate
Authenticate
Authenticate by setting your token in an environment variable:
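The Python client reads your API token from the `REPLICATE_API_TOKEN` environment variable. The value below is a placeholder; use your own token from your Replicate account settings:

```shell
# Replace the placeholder with your own API token
export REPLICATE_API_TOKEN=r8_xxxxxxxxxxxx
```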
Run predictions
You can run any public model on Replicate from your Python code. The following example runs stability-ai/sdxl:
import replicate
output = replicate.run(
    "stability-ai/sdxl:39ed52f2a78e934b3ba6e2a89f5b1c712de7dfea535525255b1aa35c5565e08b",
    input={"prompt": "an iguana on the beach, pointillism"}
)
# ['https://replicate.delivery/pbxt/VJyWBjIYgqqCCBEhpkCqdevTgAJbl4fg62aO4o9A0x85CgNSA/out-0.png']
Some models, like replicate/resnet in the following example, take image files as input. To pass a file as input, use a file handle or a URL:
image = open("mystery.jpg", "rb")
# or...
image = "https://example.com/mystery.jpg"
replicate.run(
    "replicate/resnet:dd782a3d531b61af491d1026434392e8afb40bfb53b8af35f727e80661489767",
    input={"image": image}
)
A URL is more efficient when your file is already hosted somewhere, or when it's large. Model outputs that are files are returned as URLs.
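Since file outputs arrive as URLs, you'll often want to save them locally. As a minimal sketch (the URL below is a made-up stand-in for a real output, and `filename_from_url` is a hypothetical helper), you can derive a filename from the URL using the standard library:

```python
from urllib.parse import urlparse
from pathlib import PurePosixPath

def filename_from_url(url: str) -> str:
    # Take the last path segment of the output URL as the local filename
    return PurePosixPath(urlparse(url).path).name

# Hypothetical output URL, shaped like the SDXL result above
url = "https://replicate.delivery/pbxt/abc123/out-0.png"
print(filename_from_url(url))  # → out-0.png

# To actually download it, you could then call:
#   urllib.request.urlretrieve(url, filename_from_url(url))
```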
Some models stream output as they run. These models return an iterator, and you can consume the output as it's generated:
iterator = replicate.run(
    "mistralai/mixtral-8x7b-instruct-v0.1",
    input={"prompt": "Who was Dolly the sheep?"},
)
for text in iterator:
    print(text)
🐑
D
olly
the
sheep
was
the
first
mamm
al
to
be
successfully
cl
oned
from
an
adult
cell
...
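If you want the full response rather than incremental chunks, collect the pieces as they stream and join them at the end. A minimal sketch, using a stand-in list in place of the live iterator above:

```python
# Stand-in for the iterator returned by replicate.run(...)
iterator = ["Dolly", " the", " sheep", " was", " the", " first", " mammal"]

chunks = []
for text in iterator:
    chunks.append(text)

full_text = "".join(chunks)
print(full_text)  # → Dolly the sheep was the first mammal
```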
Next steps
Read the full Python client documentation on GitHub.
You can also run models with the raw HTTP API. Refer to the HTTP API reference for more details.