Clean Text from Manhwa/Manhua
Inputs:
image_path: the image to clean
fill: type of text fill ("blur", "white", or "color"). Default: "white"
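As a quick sketch, the input payload sent in the examples below looks like this for each fill mode (parameter names are taken from those examples; the image URL here is just a placeholder):

# Only "fill" changes between the three modes; the image URL is a placeholder.
input_white = {"image_path": "https://example.com/page.jpg", "fill": "white"}  # default
input_blur = {"image_path": "https://example.com/page.jpg", "fill": "blur"}
input_color = {"image_path": "https://example.com/page.jpg", "fill": "color"}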
Run this model in Node.js with one line of code. First, install Replicate's Node.js client library:
npm install replicate
Then set the REPLICATE_API_TOKEN environment variable:
export REPLICATE_API_TOKEN=<paste-your-token-here>
Find your API token in your account settings.
Import and set up the client:

import Replicate from "replicate";

const replicate = new Replicate({
  auth: process.env.REPLICATE_API_TOKEN,
});
Run clsandoval/magi-cleaner using Replicate’s API. Check out the model's schema for an overview of inputs and outputs.
import fs from "node:fs/promises";

const output = await replicate.run(
  "clsandoval/magi-cleaner:849ebb56f2ff68b925c64e68e0be6aa261e2a90d1d700c600264a1bae8cfce88",
  {
    input: {
      fill: "white",
      image_path: "https://replicate.delivery/pbxt/LxuEZnZURckPi3qFHh8bMQNr5eNTJjOBc86vg4fSSQiMQOka/32.jpg"
    }
  }
);

// To access the file URL:
console.log(output.url()); //=> "http://example.com"

// To write the file to disk:
await fs.writeFile("my-image.png", output);
To learn more, take a look at the guide on getting started with Node.js.
You can also run the model with Python. Install Replicate's Python client library:
pip install replicate
import replicate
output = replicate.run(
    "clsandoval/magi-cleaner:849ebb56f2ff68b925c64e68e0be6aa261e2a90d1d700c600264a1bae8cfce88",
    input={
        "fill": "white",
        "image_path": "https://replicate.delivery/pbxt/LxuEZnZURckPi3qFHh8bMQNr5eNTJjOBc86vg4fSSQiMQOka/32.jpg"
    }
)
print(output)
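print(output) gives you the delivery URL of the cleaned image (see the example response further down). If you also want the file on disk, here is a minimal follow-up sketch using only the standard library; it assumes output is that plain URL string (newer client versions may return a file output object instead, in which case use its documented URL or read accessors):

import urllib.request

# 'output' is assumed to be the https://replicate.delivery/... URL string
# returned by replicate.run above.
urllib.request.urlretrieve(output, "cleaned.jpg")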
To learn more, take a look at the guide on getting started with Python.
Or call the model directly over Replicate's HTTP API with cURL:

curl -s -X POST \
  -H "Authorization: Bearer $REPLICATE_API_TOKEN" \
  -H "Content-Type: application/json" \
  -H "Prefer: wait" \
  -d $'{
    "version": "clsandoval/magi-cleaner:849ebb56f2ff68b925c64e68e0be6aa261e2a90d1d700c600264a1bae8cfce88",
    "input": {
      "fill": "white",
      "image_path": "https://replicate.delivery/pbxt/LxuEZnZURckPi3qFHh8bMQNr5eNTJjOBc86vg4fSSQiMQOka/32.jpg"
    }
  }' \
  https://api.replicate.com/v1/predictions
To learn more, take a look at Replicate’s HTTP API reference docs.
{ "completed_at": "2024-11-13T12:48:23.395565Z", "created_at": "2024-11-13T12:48:22.623000Z", "data_removed": false, "error": null, "id": "8jgs3harbxrge0ck4p8s3zh70r", "input": { "fill": "white", "image_path": "https://replicate.delivery/pbxt/LxuEZnZURckPi3qFHh8bMQNr5eNTJjOBc86vg4fSSQiMQOka/32.jpg" }, "logs": null, "metrics": { "predict_time": 0.759051199, "total_time": 0.772565 }, "output": "https://replicate.delivery/czjl/HII3XPfReHrFIEAUSeafeuXmNubiFLdXeruVgInPoLk5FSJ8E/cleaned.jpg", "started_at": "2024-11-13T12:48:22.636514Z", "status": "succeeded", "urls": { "stream": "https://stream.replicate.com/v1/files/fddq-cpqnwxfemtmchfwh64ry4mjdg5mkziv6ncvuapyq64ghzcy2giiq", "get": "https://api.replicate.com/v1/predictions/8jgs3harbxrge0ck4p8s3zh70r", "cancel": "https://api.replicate.com/v1/predictions/8jgs3harbxrge0ck4p8s3zh70r/cancel" }, "version": "849ebb56f2ff68b925c64e68e0be6aa261e2a90d1d700c600264a1bae8cfce88" }
This model runs on Nvidia T4 GPU hardware. We don't yet have enough runs of this model to provide performance information.
This model doesn't have a readme.
This model is cold. You'll get a fast response if the model is warm and already running, and a slower response if the model is cold and starting up.
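One way to see what a cold or warm start cost you after the fact is to compare metrics.predict_time with metrics.total_time in the prediction response; the difference is roughly queue and startup overhead. A small sketch using the figures from the example response above (field names as shown there):

# Metrics copied from the example prediction response above.
metrics = {"predict_time": 0.759051199, "total_time": 0.772565}

overhead = metrics["total_time"] - metrics["predict_time"]
print(f"model ran for {metrics['predict_time']:.3f}s "
      f"with {overhead:.3f}s of queue/startup overhead")

In that example the gap is only about 0.014 s, which suggests the run hit an already warm instance.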