Avoid Nightmares — NSFW JS

By Gant Laborde

You can use NSFW JS to identify indecent content without having files ever leave the client’s machine, even defensively if you can’t control the content being delivered. Try the demo: https://nsfwjs.com/ & 🌟 the repo https://github.com/infinitered/nsfwjs

We now return you to your normally scheduled blog post 😄

Ever see something that, when you go to bed, is still there when you close your eyes? And I’m not talking about one of those good dreams, like Nicolas Cage or something, but the kind that, if it’s on your screen when the boss walks by, will have you looking for a new job.

User input can be disgusting. One of my friends made their own online store back in the day, and it allowed for negative quantities. Malicious users would buy $50 of shirts, and then add MINUS $40 in shirts, effectively discounting their own orders!

Fixing user input with numbers is easy, but with images? Impossible!
Until now…

Machine Learning is doing amazing things, and now that it’s starting to find its way into JavaScript, those awesome things are going everywhere.

I could type an entire chapter on the innards of NSFW JS, but for now, let’s just focus on what it does.

Give NSFW JS an image element or canvas, then simply call classify. You’ll get back probabilities for the following five classes:

  • Drawing — Harmless art, or picture of art
  • Hentai — Pornographic art, unsuitable for most work environments
  • Neutral — General, inoffensive content
  • Porn — Indecent content and actions, often involving genitalia
  • Sexy — Sexually provocative content, can include nipples

Each classification comes with a probability! Using those numbers and classifications, you can then take action… or just be amazed.

Did you know I’m 5.58% sexy?

If the application isn’t clear, allow me to elaborate.

Big companies have entire teams dedicated to removing offensive content, a luxury we can’t all share. Just like client-side form validation causes less work for your server, client-side content checking can cause less work for your team.

Scenario 1:
Imagine that a user is about to upload an obscene image and immediately gets a message like this: “Sorry! Something about this image has tripped some content warnings. You can still upload this image, but it will not be immediately available and will only go live after manual review.” They might just tone it down.

Scenario 2:
The same check works for user-to-user protection. Someone receiving an image from another user can get a warning that what they are about to view may be inappropriate, and acknowledge that it’s OK before it’s shown. All of that can be done without ANY server processing whatsoever!

As the legal landscape around user-uploaded content gets more complex, we need bigger, better tools to run quality websites without headaches.

It’s easy. There are essentially three steps:

  1. Get the code into place
  2. Load the model on the client
  3. Classify an image

Nothing mindblowing here. Let’s get started and see it in action.

I’ll show you the Node-style usage. First, we bring in NSFW JS, and since TensorFlow.js is a peer dependency, be sure to grab it if it’s not already in your project.
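The install might look like this (assuming npm; yarn works just as well):

```shell
# nsfwjs plus its TensorFlow.js peer dependency
npm install nsfwjs @tensorflow/tfjs
```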

Now we can import our node module in the JS file we want:
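A sketch of that import, assuming an ES-module setup (a CommonJS require works too):

```javascript
// Pull in the whole nsfwjs module under one namespace
import * as nsfwjs from 'nsfwjs'
```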

The next thing we need to do is load the model. The “model” is the function that evaluates the images. I highly recommend you download and host these files like any other asset. Download them here. These files are 4MB shards for easy caching on the client machine. In my demo, I put them in the public/model/ folder.

If yours are set up the same way, you can load the model with that path.
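As a sketch, assuming the shards from public/model/ are served at /model/ on your site:

```javascript
// Load the model from your own hosted copy of the shard files.
// The trailing slash matters: nsfwjs.load expects a base path.
nsfwjs.load('/model/').then((model) => {
  // model is now in memory and ready to classify images
})
```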

Now that the model is in-memory with the client machine, we can classify image elements on the page.
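A minimal sketch, assuming a `model` from nsfwjs.load and an image element with a hypothetical id already on the page:

```javascript
// Running inside an async function, with `model` already loaded
const img = document.getElementById('questionable-img') // hypothetical id
const predictions = await model.classify(img)
// predictions is an array of { className, probability }, most likely first
console.log(predictions)
```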

That’s it? …That’s It!

The predictions (by default) come back with all five categories, ordered from most likely to least, e.g.:

[
  {className: "Drawing", probability: 0.9195643663406372},
  {className: "Hentai", probability: 0.07729756087064743},
  {className: "Porn", probability: 0.0019258428364992142},
  {className: "Neutral", probability: 0.0011005623964592814},
  {className: "Sexy", probability: 0.00011146911856485531}
]

The probabilities all add up to 1, or 100%. Now do what you please with that data! Flag things that are over 60% sure, or just take the top item and forget the rest. The strictness of the check can be calibrated to your specific implementation!
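That kind of thresholding is plain JavaScript. A sketch, using a hypothetical predictions array shaped like the NSFW JS output above:

```javascript
// Hypothetical predictions, shaped like NSFW JS classify() output
const predictions = [
  { className: 'Porn', probability: 0.82 },
  { className: 'Sexy', probability: 0.11 },
  { className: 'Neutral', probability: 0.05 },
  { className: 'Hentai', probability: 0.015 },
  { className: 'Drawing', probability: 0.005 }
]

// Flag anything unsafe that crosses a 60% threshold
const UNSAFE = ['Porn', 'Hentai', 'Sexy']
const flagged = predictions.some(
  (p) => UNSAFE.includes(p.className) && p.probability > 0.6
)

// ...or just take the top prediction and forget the rest
const top = predictions.reduce((a, b) => (a.probability > b.probability ? a : b))

console.log(flagged) // true
console.log(top.className) // 'Porn'
```

Which classes count as "unsafe" (is Sexy acceptable on your site?) is exactly the kind of calibration left to your implementation.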

As a human, you have been training on image recognition for decades, so it’s safe to say you’ll see some false positives that seem obvious to you. Though these are often funny, they’re small amounts of data bias coming through in the results. As the data scraper gets improved, these biases will shrink. It’s a slow but good process.

For something like NSFW, I feel more false positives are better than letting something more dangerous slip through.

What JS library would be complete without a live demo? It’s important to note that false positives will happen. For something like NSFW detection, recall is more important than precision. That being said, the model is being improved every day, and since it’s open source, I’d love your help in making it better!

Expect about 10% error. I’m working on making the model better, and more people are joining in to help!
Try it for yourself here — https://nsfwjs.com/
& Star on GitHub — https://github.com/infinitered/nsfwjs

This is just one use-case of useful futuristic stuff. It’s fun to come up with places you can use NSFW JS, or even contribute! I’d love to see your ideas on GitHub issues.

BUT… I’m going to leave you with a broader question. What other libraries SHOULD be out there that use Tensorflow JS? Machine Learning can do all kinds of cool things. Here’s a slide from my presentation earlier this year at Amazon:

Your imagination is the limiting factor with this tech. We at Infinite Red build a lot of cool things for web and mobile. If you’ve got an idea of how this tech can work for your company, let us know at hello@infinite.red

UPDATE — FEB 26, 2019
Twitter user @JustTenDads ran Jeff Goldblum’s face through the NSFW JS app, and it was spectacular. Check it here