If the company wanted to bury the search results for involuntary porn, it could, and it has done that with other kinds of content, like mug shots.
This spring, one of my students at Yale Law School, Sopen Shah, wrote a paper about Section 230.
The first is the shaming of the subjects of nude photos, which is still all too common.
Yet knowing all that, Reddit and 4chan can continue to host the photos, or link after link to them, with apparent impunity.
This is deplorable, or useful, or both.