Hi, all! I have a favor to ask. I have an alpha version of my app, Transitions, that can classify images as NSFW with 80% or higher accuracy most of the time (top-1 accuracy, as it's known in the ML world). It uses MobileNet V1, and the predictions are made entirely on-device, so your data never leaves your phone when an image is checked.
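For anyone unfamiliar with the term, here's a minimal sketch of what "top-1 accuracy" means: the model scores every class, and a prediction counts as correct only when the single highest-scoring class matches the true label. The function and toy numbers below are purely illustrative, not from the app.

```python
def top1_accuracy(predictions, labels):
    """predictions: per-image lists of class scores; labels: true class indices."""
    correct = sum(
        1 for scores, label in zip(predictions, labels)
        if max(range(len(scores)), key=scores.__getitem__) == label
    )
    return correct / len(labels)

# Toy example: 2 classes (0 = safe, 1 = NSFW), 5 images, 4 correct top picks.
preds = [[0.9, 0.1], [0.2, 0.8], [0.6, 0.4], [0.3, 0.7], [0.55, 0.45]]
truth = [0, 1, 1, 1, 0]
print(top1_accuracy(preds, truth))  # → 0.8
```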
I was wondering if any of you would be interested in signing up for an Internal Test Track? I already run an Open Beta program, but before releasing a beta I'd like to test it with a closed group of users. Sorry, my app is Android-only; you'll need a device running Android 6.0 or higher. Please PM me if you're interested. Once you're part of the testing program, I'm hoping you'll also be OK with submitting feedback so I can improve the model's accuracy.
Why NSFW classification?
Transitions serves user-generated content from Flickr, 500px, and Unsplash. Even with category filtering, inappropriate images show up all the time, which is why I currently have an ESRB rating of Teen on Google Play. I can't change that rating, even with automated classification and dynamic blurring of such images. But I want to give my users the ability to browse high-quality photos on Transitions without the uncertainty of what they'll see next.
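To illustrate the classify-then-blur idea, here's a minimal sketch of how a gate like this could look. The threshold, function name, and use of Pillow's GaussianBlur are my assumptions for the example, not details of the app's actual implementation.

```python
from PIL import Image, ImageFilter

NSFW_THRESHOLD = 0.5  # assumed cutoff; the app's real threshold is unknown


def maybe_blur(image: Image.Image, nsfw_score: float) -> Image.Image:
    """Return a heavily blurred copy when the classifier flags the image,
    otherwise return the original untouched."""
    if nsfw_score >= NSFW_THRESHOLD:
        return image.filter(ImageFilter.GaussianBlur(radius=24))
    return image


# Toy usage with a generated image standing in for a real photo.
img = Image.new("RGB", (64, 64), "black")
img.putpixel((32, 32), (255, 255, 255))  # a bright spot the blur will smear
safe = maybe_blur(img, 0.1)      # returned as-is
flagged = maybe_blur(img, 0.92)  # blurred copy
```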
What do you (as a tester) get from this?
Uh, the joy of helping me? Early access? Take your pick. 😀