jigsawstack / nsfw

Quickly detect nudity, violence, hentai, porn, and other NSFW content in images.


🚫 JigsawStack NSFW Image Validator – Replicate Wrapper

This Replicate model wraps the JigsawStack NSFW-detection API to classify inappropriate content in any image input.

Detect inappropriate content in images including nudity, explicit content, hentai, violence, and other NSFW categories using JigsawStack’s powerful moderation API.


🧠 What It Does

This model checks whether an image is Not Safe For Work (NSFW) and returns a response indicating if the image violates content policies.

It supports real-time image moderation for:

- Nudity & sexual content
- Hentai & explicit art
- Violence
- Unsafe or disturbing visuals
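
For reference, here is a minimal sketch of calling the underlying JigsawStack NSFW endpoint directly. The endpoint path and response fields are assumptions based on JigsawStack's public validate API; verify them against the official docs before relying on this.

```python
import requests

# Assumption: JigsawStack's NSFW validation endpoint, per their validate API.
JIGSAWSTACK_NSFW_URL = "https://api.jigsawstack.com/v1/validate/nsfw"

def check_image(image_url: str, api_key: str) -> dict:
    """Send a public image URL for moderation and return the raw JSON result."""
    resp = requests.post(
        JIGSAWSTACK_NSFW_URL,
        headers={"x-api-key": api_key, "Content-Type": "application/json"},
        json={"url": image_url},
        timeout=30,
    )
    resp.raise_for_status()
    # The response is expected to carry NSFW flags (e.g. nudity, gore);
    # exact field names depend on the JigsawStack API docs.
    return resp.json()

if __name__ == "__main__":
    result = check_image("https://example.com/image.jpg", "your-jigsawstack-api-key")
    print(result)
```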


🔑 Inputs

| Name    | Type   | Required | Description                              |
|---------|--------|----------|------------------------------------------|
| url     | string | ✅ Yes   | Public URL of the image to be validated  |
| api_key | string | ✅ Yes   | Your JigsawStack API key                 |
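
A sketch of running the wrapper through the Replicate Python client, assuming the `jigsawstack/nsfw` slug from this page (pin a specific version hash in production) and a `REPLICATE_API_TOKEN` set in the environment:

```python
import replicate

# Run the NSFW validator via Replicate. Both inputs come from the table above.
output = replicate.run(
    "jigsawstack/nsfw",
    input={
        "url": "https://example.com/image.jpg",  # public image URL to validate
        "api_key": "your-jigsawstack-api-key",   # your JigsawStack API key
    },
)
print(output)  # moderation verdict returned by the wrapper
```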