NSFWDetector Project Introduction
Overview
NSFWDetector is a compact, efficient CoreML model for scanning images for nudity. Trained with CreateML, it distinguishes potentially explicit content, such as pornography, from acceptable images, with a particular focus on telling pictures of Instagram models apart from explicit pornographic content. The entire model is remarkably lightweight at just 17 kB.
Usage
Integrating NSFWDetector into a project is straightforward, particularly for Swift developers. The model runs on devices with iOS 12.0 or later. Here's a simple usage example in Swift:
guard #available(iOS 12.0, *), let detector = NSFWDetector.shared else {
    return
}

detector.check(image: image, completion: { result in
    switch result {
    case let .success(nsfwConfidence: confidence):
        if confidence > 0.9 {
            // 😱🙈😏
        } else {
            // ¯\_(ツ)_/¯
        }
    default:
        break
    }
})
Developers can tune the model's sensitivity by adjusting the confidence threshold: lowering it filters content more aggressively, while raising it is more permissive, as in the sketch below.
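As a rough illustration, a stricter gate might look like the following sketch. The helper name isAcceptable and the 0.5 default threshold are illustrative choices, not part of the library's API; only NSFWDetector.shared and check(image:completion:) come from the library itself.

import UIKit
import NSFWDetector

// A minimal sketch of a tunable moderation helper (hypothetical).
@available(iOS 12.0, *)
func isAcceptable(_ image: UIImage, threshold: Float = 0.5, completion: @escaping (Bool) -> Void) {
    guard let detector = NSFWDetector.shared else {
        completion(true) // assumption: let images pass if the detector is unavailable
        return
    }
    detector.check(image: image, completion: { result in
        switch result {
        case let .success(nsfwConfidence: confidence):
            // A lower threshold rejects more images; raise it to be more permissive.
            completion(confidence < threshold)
        default:
            completion(true)
        }
    })
}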
Installation
NSFWDetector can be added to a project using either Swift Package Manager or CocoaPods:
- Swift Package Manager:

  dependencies: [
      .package(url: "https://github.com/lovoo/NSFWDetector.git", .upToNextMajor(from: "1.1.2"))
  ]

- CocoaPods:

  pod 'NSFWDetector'
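For context, here is a minimal sketch of where the Swift Package Manager dependency sits in a full Package.swift. The package and target names are placeholders, and the product name "NSFWDetector" is an assumption based on the repository name.

// swift-tools-version:5.3
import PackageDescription

let package = Package(
    name: "MyApp", // placeholder package name
    platforms: [
        .iOS(.v12) // the detector requires iOS 12.0 or later
    ],
    dependencies: [
        .package(url: "https://github.com/lovoo/NSFWDetector.git", .upToNextMajor(from: "1.1.2"))
    ],
    targets: [
        // "NSFWDetector" as the product name is assumed, not confirmed by this document.
        .target(name: "MyApp", dependencies: ["NSFWDetector"])
    ]
)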
Note that because the model was created with CreateML, Xcode 10 or later is required to compile the project.
App Size Considerations
One of the distinct advantages of NSFWDetector is its minimal impact on app size: at just 17 kB, it is significantly smaller than comparable libraries, such as those built on the Yahoo model for the same purpose.
Standalone Model Usage
For those interested in using just the model without the accompanying detection code, the NSFWDetector MLModel file is available for direct download from the latest releases on its GitHub repository.
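If you take that route, the raw model can be driven with Apple's Vision framework. The sketch below assumes the downloaded .mlmodel file has been added to an Xcode project so that a class named NSFW is generated, and that the model emits a classification label called "NSFW"; both names are assumptions, not documented API.

import CoreGraphics
import Vision

@available(iOS 12.0, *)
func nsfwConfidence(for image: CGImage, completion: @escaping (Float?) -> Void) {
    // `NSFW` is the class Xcode is assumed to generate from the downloaded .mlmodel file.
    guard let visionModel = try? VNCoreMLModel(for: NSFW().model) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Assumption: the model returns classification observations labeled "NSFW"/"SFW".
        let observations = request.results as? [VNClassificationObservation]
        completion(observations?.first(where: { $0.identifier == "NSFW" })?.confidence)
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}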
Feedback and Support
The NSFWDetector project openly encourages feedback. Users who encounter issues or have suggestions are invited to reach out via email at [email protected] or on Twitter via the LOVOO Engineering account.
Author and License
NSFWDetector was created by Michael Berg. The project is released under the BSD license, which permits use in a wide range of applications; detailed licensing terms can be found in the LICENSE file in the project repository.
NSFWDetector offers an effective, lightweight way to integrate nudity detection into an application, balancing detection quality against minimal resource use.