NSFWDetector

Refine Image Moderation with CoreML's NSFW Content Detector

Product Description

The NSFWDetector is a streamlined 17 kB CoreML model that identifies and differentiates between NSFW and safe content in images, making it well suited to environments that need image moderation. It is available via Swift Package Manager or CocoaPods and keeps the app footprint minimal compared to larger models. The model supports iOS 12.0 and later and was trained with CreateML, so building it requires Xcode 10 or higher. Sensitivity can be tuned through a confidence threshold, allowing detection to be tailored to each app's needs. The model can also be downloaded directly from the latest release for standalone use.
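Because the threshold is chosen by the integrating app, a moderation check typically reduces to a few lines of Swift. The sketch below is a minimal example assuming the library exposes a shared detector with a check(image:completion:) method that reports an NSFW confidence; the exact type and method names are assumptions and should be verified against the library's documentation.

```swift
import UIKit
import NSFWDetector

/// Flags an image as NSFW or safe using an app-defined confidence threshold.
/// Assumes NSFWDetector.shared.check(image:completion:) and a result carrying
/// `nsfwConfidence`; adjust to the actual API if it differs.
func moderate(_ image: UIImage, threshold: Float = 0.9) {
    NSFWDetector.shared.check(image: image) { result in
        switch result {
        case let .success(nsfwConfidence: confidence):
            // The threshold is the app's moderation policy:
            // raise it to flag fewer images, lower it to flag more.
            if confidence > threshold {
                print("Image flagged as NSFW (confidence: \(confidence))")
            } else {
                print("Image accepted (confidence: \(confidence))")
            }
        case let .error(error):
            print("Detection failed: \(error)")
        }
    }
}
```

In practice the threshold is the main tuning knob: a stricter community platform might lower it, while an app that only wants to catch obvious cases can keep it high.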
Project Details