I don't have any sympathy for Pornhub here. Their entire business is built on stolen content uploaded by anonymous users, and it wasn't until they hollowed out the rest of the industry through theft that they started looking beyond it.
I would go so far as to say that the reason Pornhub and other tube sites haven't taken problems like child porn and revenge porn seriously is because doing so would undermine the foundation of their business: stolen content.
On the other hand, this might ultimately work in their favor if it has the effect of raising the bridge they've just crossed. If it gets harder to host stolen content, then Pornhub, now that it's switching to paid content, has to worry less about someone else doing to them what they've done to others.
> Their entire business is built on stolen content uploaded by anonymous users,
Not disagreeing with you, but, to be fair, that was exactly Youtube's recipe for success, and still is to some extent. I never understood how Youtube could get away with that.
My understanding is that copyrighted content on Youtube is often uploaded under the guise of anonymous users, but this is illusory. Unauthorized videos are taken down with vigilance; we see this happening to content creators all the time for no good reason. If these videos get the thumbs up to stay up, they're in cahoots with copyright holders somehow (most likely) or are deliberately ignored for some other reason. One thing you'll notice is that it's not just corporate-sponsored content that is region-locked; sometimes "anonymous" user videos get region-locked. That is not a coincidence.
All of which is to say, I expect the same to be the case on PH. There's the usual whack-a-mole game of unauthorized content being taken down, but much of what stays up from unauthorized sources probably does so with permission that is not apparent on the surface.
I think streaming companies place a premium on the appearance of content coming from users, and this is the result. If I'm right, then these changes may not significantly alter the level of traffic to PH, which is what everyone is soothsaying about: that users will just go elsewhere. But I'm curious what's going to happen.
Like you said, PH's model looks to ape YouTube's, and the latter does not appear to be entirely reliant on "stolen" content.
> Unauthorized videos are taken down with vigilance; we see this happening to content creators all the time for no good reason. If these videos get the thumbs up to stay up, they're in cahoots with copyright holders somehow (most likely) or are deliberately ignored for some other reason. One thing you'll notice is that it's not just corporate-sponsored content that is region-locked; sometimes "anonymous" user videos get region-locked. That is not a coincidence.
This is all much less nefarious than you make it sound.
When a rights-holder finds their IP in a video, they have the option of how to proceed. They can choose to take the video down completely, run ads and receive all or a share of the revenue, etc.
If they choose to leave it up, it's almost always because they are receiving all of the revenue from the video now so they have no reason to take it down. There's no smokey back room deals going on here, this is just the rights enforcement tools provided by YouTube.
Rights-holders do not always hold global rights to content and if the video contains content from multiple rights-holders, there are several competing claims that all need to be resolved.
The reason the videos uploaded by users eventually end up resembling the videos uploaded by the rights-holder is because YouTube provides a mechanism for rights-holders to enforce the same restrictions they operate under on their content even when they weren't the ones to upload it.
Source: I work in a company that does this, though not directly on that side of the business.
There is an account penalization system, i.e. a strike system, associated with frequent infringement complaints, and it doesn't take many. While it's certainly the case that copyright holders will simply take over the revenue from unauthorized videos, my expectation is that most of these uploaders are not unwitting anonymous types: they actually represent the company. I read an article years ago detailing that this can be the case but I can't be arsed to find it right now.
It's not a huge leap when you consider it's also been revealed that a good chunk of torrents at one time had been seeded by IPs belonging to corporations.
Youtube has broad 'covenant not to sue' contracts with various rights-holding organizations, which allow them to safely leave up infringing works while the rights holders avoid paying residuals to artists whose contracts lacked the foresight to anticipate this kind of fraud.
AFAIK porn performers don't generally get paid residuals, so there's probably less interest in this sort of skulduggery.
This most certainly wasn't the case for YouTube in the early days either. Once Google took over they were WAY more strict about locking down infringing works.
> Unauthorized videos are taken down with vigilance
This is not universally true. For example, you can go on YouTube right now and probably find all sorts of music videos not hosted by the actual artists, so long as it’s older or more obscure music. Plus at this point it’s kind of moot; they already gained their user base from the first several years of illegal new content.
If youtube user bigbubba420 uploads a music video for an 80s synthpop song he doesn't own, the copyright owner for that video informs youtube of their ownership claim and revenue from the video an anonymous user uploaded is redirected into their pockets.
Also if travel_vlogger happens to walk by a store playing the radio, the ownership of their video and revenue is given to the music company and you can't do anything about it. Just easier to steal from the little guy.
This is true, though now bigbubba420 gets a strike on his account for the complaint and inches closer to a ban. And music uploaders throw on a ton of content, so I don't see how it can merely be a case of ad revenues taken over and continuing to do the same with impunity.
Be aware you're comparing Google to a much smaller company. Of course they had the ability to do full DRM search much sooner. I'd look through the lens of technical hurdles much before I'd look through the lens of moralism.
This is a good point. I don't think MindGeek is as sophisticated in detecting and taking down infringing content. However, complaints do get filed and videos do get taken down for infringing.
After a lot of suffering. I believe there was an incident recently in which Pornhub refused to take down videos of a minor in what was an obvious situation of revenge porn.
Yeah, well that means they got their priorities reversed badly, for this kind of business.
I think the current model should have been the default from the start - if verified, paid content creators were the only ones allowed, they would get a much bigger share from their videos.
> Unauthorized videos are taken down with vigilance,
Yeah, right. Tell that to the flood of channels which cut single scenes out of TV series and Twitch recordings and upload them to YT monetized.
I would expect it applies to isolated cases, but not to an account which exists only for that purpose. There are cases where whole episodes / movies are split into pieces and posted in a playlist.
Despite people complaining about how draconian DMCA take down notices are, the DMCA safe harbor is incredibly generous. Service providers have no affirmative duty to police user uploaded content unless they have actual knowledge of infringing materials or "facts or circumstances from which infringing activity is apparent."
It's basically a don't ask/don't tell system.
Courts have been reluctant to require any active investigation even when faced with very large number of valid take-down requests.
Youtube is more proactive than most (ContentID) due to settlements and agreements with its major partners.
I mean, they were facing down a billion-dollar lawsuit and only avoided that on a technicality. Then they spent a huge amount of money (and reputation) on ContentID.
Is that even really true? Movies etc. are aggressively removed and don't stay on YouTube for long, if at all. Maybe it used to be more lenient, but I can't remember the last time I saw something like that on YouTube.
That's the story for every publisher or platform. Even the United States didn't take copyright seriously until it was big enough and had something worth protecting.
It probably didn't have much to do with the law, clearly. I don't know why non-lawyer HN commenters are so obsessed about litigating the law.
People just really like it, including lawmakers themselves.
It's probably like with Uber - its riders ended up enacting the law that made the cheap prices possible. Or Airbnb. Or a lot of other stuff. Operating in a legal gray area for years.
Or maybe it's more comparable to the difference in punishment between doing cocaine and doing crack. Although that Toronto mayor did do crack, so I don't know if cocaine is strictly the drug of choice among lawmakers. It's certainly preferred by rich people, including rich politicians.
The really twisted thing is, what does that say about porn? Sure, Nicholas Kristoff might be reprinting talking points by a few purity crusaders. But for some things, like the unverified content being not just pirated, but sometimes abusive, they might be right?
I believe YouTube could get away with it due to a lucky right time purchase by Google which made content owners put back the swords they were starting to pull because "scary money monster that sends us traffic might get bad".
Not to mention, that likely it has an overall positive effect on the old school "content industry".
After all, if there are old songs on YT then people can send them to others (see RickRoll), listen to them (playlists), and keep them in the collective zeitgeist. So copyright holders have avenues to profit from these activities. If YT purged this content, those tracks would likely just disappear fast from the minds of folks who mostly listen to music on YT.
Absolutely. In the olden days, the industry would make direct revenue off of this discovery/remembrance through sampler or collection albums/CDs. Usually cheaper than regular media, but not free. What they lost in revenue with YouTube, they gained in the frictionless process of a potential customer tasting or revisiting new material, and the social networking gains through instant sharing to new audiences. I assume the industry has seen this as a net win, and a win-win with their customers. The formal legal protections get in the way, and so they are in practice ignored unless the releases get out of hand. Seems to be working...
Same dynamic with archive.org and used book sellers, although there everything is PD. Would be nice if the not-yet-PD but impossible-to-monetize book content from, say, 1930-1980 had the same scheme in place for online content. I suppose the books-to-borrow function on archive.org or the online lending platforms in local libraries provide that function.
Pinterest is even worse because it hijacks image search with its stolen content, pretending to be both the only available source and the original source, while at the same time forcing browsing users to register.
I don't think Pornhub behaves the same way as either YT or the more comparable Imgur.
The relevant law here is the DMCA, which gives safe harbor to anyone hosting 3rd party content if they respond to timely takedown requests, which YouTube does.
My understanding is YouTube technically doesn't deal with many DMCA takedowns. Their reporting system is technically a voluntary system that sits in front of the DMCA.
Large IP holders are happy to use it because it heavily favors them. Large IP holders also don't have to worry about repercussions for false or incorrect DMCA claims.
I've been in a few discussions on Reddit where people said this is a half-truth. Perhaps someone can confirm?
From Wikipedia[0]: § 512(c) also requires that the OSP: 1) not receive a financial benefit directly attributable to the infringing activity, 2) not be aware of the presence of infringing material or know any facts or circumstances that would make infringing material apparent, and 3) upon receiving notice from copyright owners or their agents, act expeditiously to remove the purported infringing material.
This is an 'and' which I take to mean safe harbour takedown (the 3rd clause) only applies if the other two are also true. So the host must not know it exists (plausible for YouTube) and not receive a direct financial benefit (does Ad revenue count?).
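The conjunctive reading can be made explicit with a small sketch. This is an illustration of the three-prong logic as described above, not legal advice; the field names are my own, not statutory terminology:

```python
from dataclasses import dataclass

@dataclass
class HostingRecord:
    direct_financial_benefit: bool  # prong 1: revenue directly attributable to the infringement
    knew_or_red_flag: bool          # prong 2: actual knowledge, or facts making infringement apparent
    removed_expeditiously: bool     # prong 3: acted promptly on the owner's notice

def safe_harbor_applies(r: HostingRecord) -> bool:
    # All three prongs must hold; failing any single one forfeits the harbor.
    return (not r.direct_financial_benefit
            and not r.knew_or_red_flag
            and r.removed_expeditiously)
```

Under this reading, expeditious takedowns alone don't save a host that is found to profit directly from, or knowingly tolerate, the infringing material.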
DMCA safe harbor (copyright) instead of section 230 (moderation for content), but yeah, that sounds right. Before Content ID existed, they could convincingly say they didn't have automatic means and they were taking all commercially reasonable efforts to respond to manual reports, and meanwhile their staff couldn't keep up with people uploading new infringing content.
Oh come on, don't make me spell it out for you. Revenge porn preys on women, child porn preys on children. Having these videos up, and allowing people to upload them, actively hurts people. Nobody is getting hurt if people watch Family Guy on youtube.
Individual humans are not directly demeaned and violated by copying TV shows.
They are by non-consensual porn.
I personally believe strongly that porn demeans, dehumanizes, and violates the people in it, even when it's consensual (and my understanding is that consensuality and non-abusive working conditions are much rarer than most porn consumers think).
I realize that opinion would be quite unpopular around here.
Your comment on "non-consensual porn" may be what the GP meant, but s/he used the term "stolen porn", which I took to mean simply a copy that the user was unauthorized to upload (i.e. I download a movie then share it), not so-called revenge porn. I'd say in the first case they are roughly equivalent wrongs, but in the case of revenge porn, distributing it publicly is a much higher moral wrong.
Disagree. ContentID was a proprietary tech for content recognition that made YouTube not get inundated with copyrighted material. YT is successful because they handled the issue, not because they didn't.
YouTube was successful long before ContentID. Perhaps they managed to stay successful by handling the issue, but their initial success was certainly not due to handling the issue.
From the article, Pornhub only had 118 incidents of child porn over the last 3 years, whereas Facebook had 84 million incidents during the same time. So it sounds like Pornhub is taking child porn seriously, and Facebook isn't.
Two more likely possibilities: (1) Pornhub wasn't really checking, which is an easy way to get good stats.* (2) Many of Facebook's 84 million incidents took place on private groups or Messenger chats with small numbers of participants, whereas a single PornHub upload (counted as "1 incident") could easily reach hundreds of thousands of people. Fundamentally difficult to compare private communications networks with broadcast systems, they provide different opportunities for problematic content.
* ETA: Facebook is an industry leader on CSAM scanning, and has developed both its own algorithms and media databases. So your comment that "Facebook isn't taking the issue seriously" seems like exactly the wrong diagnosis. Facebook's high numbers are a consequence of putting serious effort into this area. (Whether scanning of private messages is actually effective in stopping distribution of the content, that's something I'm more skeptical of.)
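The scanning that produces those Facebook numbers is, at its core, matching uploads against a database of hashes of known-bad media maintained by clearinghouses. A minimal sketch of the matching flow, with the caveat that real systems use perceptual hashes (Microsoft's PhotoDNA, Facebook's open-sourced PDQ) that survive re-encoding and resizing; the cryptographic hash and the database contents here are stand-ins for illustration only:

```python
import hashlib

# Hypothetical hash database of known-bad media (this entry is just the
# SHA-256 of the bytes b"test", used as a placeholder).
KNOWN_BAD_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def media_hash(data: bytes) -> str:
    """Hash an upload's bytes. Real scanners use a perceptual hash here."""
    return hashlib.sha256(data).hexdigest()

def should_flag(upload: bytes) -> bool:
    """Queue an upload for review/reporting if it matches the database."""
    return media_hash(upload) in KNOWN_BAD_HASHES
```

Note this design only catches *known* material; it says nothing about novel content, which is one reason raw match counts are a poor proxy for how seriously a platform takes the problem.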
> (1) Pornhub wasn't really checking, which is an easy way to get good stats.*
Those were external checks, not self-reports. So the real question is whether those checks were insufficient and just found the tip of the iceberg, or whether Pornhub was actually doing serious checks for this stuff and doing quite well at it.
In the latter case it also opens the question of whether the reports were really child porn, or just some disputable case which would be acceptable in some countries or times and problematic in others. Or whether they just missed them. Or whether there is some darker reason behind them.
I think it's very likely that Pornhub really had a system to prevent the nasty stuff; after all, they were big and old and not stupid enough to ignore those easy targets that would kill them fast. Their fast reaction this time shows that they are very focused on surviving.
I've personally taken dozens of reports where people were 100% sure and disgusted, even from some moderators who were supposed to know more about this, and it turned out the people were all over 18.
I'm sure we can find stories on both sides of this argument.
I lived in South East Asia for a while, and the cell phone footage of sexual interactions between middle school students is what I reported. These are girls wearing middle school uniforms embroidered with the school's name on the shirt [1], which indicates that the girl is under 18.
Nobody would bother to replicate the uniform for 2 minutes of shaky cell phone footage. It was underage material. Plain and simple.
My own ex-girlfriend's photographs were uploaded to XHamster by her ex-boyfriend along with a copy of her passport. She's 17 in the images. Took us over a year to get them taken down.
I feel for your situation, and I'm sorry you and those around you have had such issues.
Indeed your mention of the uniform, along with it being shaky cell phone imagery, would cause me to moderate such an upload if it was brought to my attention - I would stop it from public availability and notify the uploader that they could respond with proof of age and rights if they'd like to prove the claim wrong. (99% of the time that does not happen)
Sadly, I can't say that you are 100% right on the uniform thing - especially in certain places, copying such uniforms or acquiring exact copies and using such for videos is kind of common.
Fake-revenge porn is actually a thing, and has been for a long time. There are some slick web sites out there that try to make it look like it's 'stolen / hidden video' of girls next door - but it's actually now budget / average girls selling vids cheap and it's packaged in a way to make it seem real.
With that being said, there are also some 'parasite sites' that take imagery of people out of context and use it for sexualized viewing without the person's consent - and I abhor that.
There are also people who do revenge porn, and blackmail porn I would say it's likely even more prevalent. That kind of thing really ticks me off too.
I don't know a lot about xhams, but I'm pretty sure they play with different jurisdictions and rules that pornhub and some others - so I'm not going to comment on the way they do things.
I will mention that I do get quite a few 'takedown requests' from cam models, as many do not understand how affiliate promos, partner promotion agreements, APIs and such work. Sometimes they also push DMCA complaints to Google and others in an attempt to rush a pic removal... I have also seen some people working with these models get 'creative' in their removal requests, putting in emails to me (and likely to our hosting companies, Google, etc.) that "not only are these pics not authorized for use, but they were under 17 when they were made".
While I am sure this works to speed up the deletion process for some, it's a shitty thing to do for many reasons, but it is done a fair amount. With so many of these fake takedown notices, it's not easy for a moderating team to sort out what is legit and what is not.
It's obvious to me that sometimes this is done by a new boyfriend / husband who is trying to remove past history of their lover's porn work.
It also clogs up our system of support emails. So there is much more to these issues than most people who would file a legit request would know.
Here's the original press statement from Pornhub's blog:
>Over the last three years, Facebook self-reported 84 million instances of child sexual abuse material. During that same period, the independent, third-party Internet Watch Foundation reported 118 incidents on Pornhub.
Organizations like the IWF rely on reports, whether self-reported or not. Most people report CSAM on the website they're visiting instead of going to a third party (at least I do unless nothing is being done about it). Therefore it's not a surprise that the IWF only reported 118 incidents on Pornhub. It would be interesting to know how many instances Pornhub self-reported during the same time period since a high number is actually a good thing. I'd imagine that a sizeable chunk of the Facebook reports were quickly taken down due to being recognized by CSAM detection algorithms.
The PornHub count was an external study; the Facebook count was internal. So if either is affected by internal standards not counting things, it's the Facebook number.
Isn't Pornhub's content always public, whereas Facebook's can be segregated to smaller groups of people? If that's the case it seems that illegal content could remain undetected for far longer on FB rather than Pornhub.
I’m not an expert on PH, but from what I understand from other sources they have a mix of public content, open-to-all-paid-customers content, and private content that is available only to subscribers to particular channels.
I don’t know if they have any content owner (well, submitter, given the level of piracy) controls on who can see content at the specific user level (whether by restricting allowed subscriptions to channels with an owner-approval process or by having content restricted by something other than subscription status.)
A former co-worker was telling me about a porn streaming company she worked for in Spain. They stole content from other sites, the other sites stole from them and their competitors. No one wanted to go to courts because no one had clean hands so they started breaking into each other's buildings (self hosting on-site was the norm at the time) and smashing up servers. Eventually they all hired security made up of less than reputable guards who probably were the same guys who had been breaking in to beat up servers. She left around that time to work for more mainstream employers but it does point to the shady beginnings of many of these companies.
Both Google and YouTube contain stolen and illegal content. There are lots of full length pirate movies on YT for instance. It doesn’t make it right but I just mean all companies should be equally judged by rules and regulations
What you may not realize is that if that "stolen" media is logged in ContentID, both Google and the proper rightsholders are making money by running ads off every view (if they decide not to block it outright.)
I believe many folks don't understand this and therefore wonder why random YT accounts can post clearly copyrighted material. Content owners can fairly easily capture the monetization of these videos even if its not under their account. Oftentimes it's best to just leave a video up especially if it has many views (and strong search visibility) and collect the $$ from the ads on it.
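The leave-it-up-and-take-the-money flow described above can be sketched as a simple policy dispatch. The names and the dict-based video record are illustrative only, not YouTube's actual API; this just shows why a claim doesn't have to mean a takedown:

```python
from enum import Enum, auto

class ClaimPolicy(Enum):
    BLOCK = auto()      # take the video down (and strike the uploader)
    MONETIZE = auto()   # leave it up; redirect ad revenue to the claimant
    TRACK = auto()      # leave it up; just collect viewership analytics

def resolve_claim(policy: ClaimPolicy, video: dict) -> dict:
    """Apply the rights-holder's chosen policy to a matched upload."""
    if policy is ClaimPolicy.BLOCK:
        return {**video, "visible": False}
    if policy is ClaimPolicy.MONETIZE:
        return {**video, "visible": True, "revenue_to": "claimant"}
    return {**video, "visible": True, "revenue_to": video.get("uploader")}
```

For a video with many views and strong search visibility, MONETIZE is usually the rational choice for the claimant, which is exactly why so much "clearly copyrighted" material stays up.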
Nope. YouTube just takes independent content down. On the other hand, YouTube allows original creators to monetize that pirated content by transferring all ad proceeds to them.
On the other hand, there's naked yoga on YouTube so....
As I pointed out the other day in the comments of an article about Crunchyroll, they're actually a much better example than Pornhub. Much like YouTube, sites like Pornhub have the excuse that they ostensibly exist to let people upload their own, often amateur, content - and there was a lot of really good amateur content on there. Which, naturally, the commercial porn industry wanted to eradicate just as much as the pirated stuff, because it was their direct competition. Part of the problem was that there's an inherent asymmetry to this - it only takes a few really keen exhibitionists wanting to show off to entertain a much larger group of fans across the globe. Another part is that there's all kinds of niche content that, despite being both consensual and legal, isn't well served by mainstream porn (and in some cases can't be, due to Mastercard and Visa).
Crunchyroll at least needed a team of translators to create subtitles for Naruto (and other anime circa the mid 00s). That is to say: Crunchyroll may have copyright-infringed upon anime, but they added value. Naruto and Shippuden never would have found a subtitled audience without that team of translators. (And the English dub was relatively low quality back then, believe it or not.)
I'm not sure how Youtube or Pornhub adds value to the content.
> Crunchyroll at least needed a team of translators to create subtitles for Naruto (and other anime circa the mid 00s). That is to say: Crunchyroll may have copyright-infringed upon anime, but they added value.
Most of the content on Crunchyroll was fansubbed by various IRC and torrent groups and used without permissions of the original copyright holders or of the fansub groups and without credit to the fansubbers. So, actually, no, they didn't need a team of translators, and there's a lot of anime fans who refuse to do business with Crunchyroll because of their betrayal of the fansub community.
I remember Dattebayo fansubs, one of those torrent fansub groups you're talking about. I remember when they shut down and announced support for Crunchyroll, suggesting that legitimate paid streaming was the future for anyone who actually wanted to support the anime subtitle community.
Perhaps I wasn't remembering the full politics of mid 00s subtitle culture... its been a very long time since then. But for Dattebayo at least, they were quite clear on throwing their support behind Crunchyroll, to the point where it was rumored that Dattebayo was probably working with Crunchyroll. (Though I'm not sure if that was ever confirmed)
With hundreds of thousands of torrent hits per week, Dattebayo fansubs probably was one of the largest torrent fansubbers of that time. I tried to search for a history of those groups, but alas, it seems lost to time. I just got my personal memories to guide me.
---------
You're right in that those early fansubber / torrent / IRC groups were all separate groups with different politics. But from my perspective, the big groups (ie: Dattebayo) were definitely all in on supporting Crunchyroll.
I think your retrospective is very Dattebayo centric and distorted.
Most big fansubbing groups hated CR back in the days when CR started, since CR took their fansubs to fuel its business.
Once CR started to make/pay for their own translations (also hiring translators from some groups in the background) the situation got better, since then a clear, legal path was visible for the community.
I don't know anything about any of these groups (I do like anime), but they had to know from the beginning that voluntary subtitling would never make any money. Even if copyright law were intended to help regular people rather than giant corporations, their work was entirely based on material they had questionable "right" to use in the first place. They scratched their own itch, and later CR had an itch too. It could be that Dattebayo always had a more complete understanding of the situation than other fansub groups had.
> but they had to know from the beginning that voluntary subtitling would never make any money.
Exactly zero of the fansubbing groups ever did it to make money. For most people, they were in it for the community, which included things like the prestige of having your name associated with a popular/successful project. The reason the fansub community was upset at CrunchyRoll was because not only did they take the content, they edited it to remove the intros and other forms of attribution to the original fansubbers.
If they had left in attribution, nobody would have cared, and CrunchyRoll would have been treated like any number of other torrent to streaming sites that existed.
> Most big fansubbing groups hated CR back in the days when CR started, since CR took their fansubs to fuel its business.
I mean, Dattebayo definitely criticized CR at first. But by the time Dattebayo closed shop, they were highly supportive of CR. That's why its important to remember the names of these groups. Do you remember the large-fansub group and/or which anime you're talking about?
It's hard for me to think of a bigger group than Dattebayo back then. Dattebayo covered Naruto and Bleach subs, as well as a bunch of lesser-known anime.
During the illegal content days, Crunchyroll wasn't subbing the videos. Those were uploaded by dedicated fansub teams who did the actual work and CR was just another anime tube site.
I’m not saying it’s a direct comparison, just that bootstrapping off of content theft is pretty par for the course for the internet, be it individual creators or entire websites. I also don’t see it as a good thing, but at least with anime fansubbers it is kind of a begrudgingly accepted reality that this kind of activity helps create the market that allows the legitimate player to exist.
It seems somewhat arbitrary where the line is drawn between "growth hack" and "clearly undermining copyrighted content"
Thinking specifically about Napster and MegaUpload. Is it because there was no value beyond simply sharing files? If there was a presentation layer + user accounts on top, would they have been spared?
There was value beyond sharing files, music sharing services had decent search and music previewing.
I think a music sharing site I used let you see the other music files a user was sharing, so if you thought they had good taste you could check out other things too.
Not that I'm defending the theft of IP or technology, but it is interesting how some things we consider stealing, others homage, and some we simply ignore. No one would complain if spies from your own country stole from a rival nation during a war (hot or cold). The US used to steal technology from Britain and the rest of Europe.[0]
The closer you look at any country's history, the more shady dealings come up. This goes doubly so for superpowers. It seems that at some point in time, everyone has done unscrupulous things to get ahead, and we often forget to be introspective.
On a personal anecdote, I dabble in fashion design and it continually baffles me as to what is considered "homage" when large, established companies blatantly rip off designs. Large companies like H&M or Zara can literally steal pieces from famous designers, make it worse and cheaper and sell it in malls worldwide. Obviously, when an Asian brand does the same thing, it's a knockoff. Peeves me to no end.
I always wondered why YouTube was required to be quite strict with policing such matters but many pornography websites somehow remain quite operational despite the vast amount of obviously infringing material.
YouTube is much stricter than what the law requires. I think the general consensus is that YouTube tries to stay on MPAA's and RIAA's good side to keep them from lobbying for stricter laws. As long as it's easier to talk to YouTube than to write a new law everyone profits (except the independent content creators).
Paid porn doesn't have a strong lobby, so there is no similar pressure on PornHub/MindGeek.
I would not be at all surprised if American juries were fairly biased against plaintiffs connected in any way to pornography, given American moral values.
Well at issue is why, hypothetically, YT or PH might find themselves before the bar.
YouTube would be afraid of being sued for copyright infringement. PornHub is legitimately afraid of being indicted in a criminal conspiracy to distribute child pornography.
With respect to just the copyright claim, I don't think there would necessarily be any bias, as it would be pitting pornography producers vs. pornography distributers. Both pretty slimy if you feel that way.
Hah! This comment made me think about how I've personally come to really hate advertising and advertising companies, and to find them more morally questionable than porn. If someone told me they worked for Google or some other advertising network, I would silently judge them for their life choices; but if someone told me they worked for a porn company, I'd probably be like "oh, what's that like?". Right or wrong, advertising leaves a very sour taste for me, to the point where it seems to have overtaken other things, like porn.
Surely you must mean the targeted advertisements and data collection necessary for that. I don't see anything inherently wrong with an advertisement in and of itself, and it certainly isn't morally questionable. Companies need consumers to be aware of their products in order to sell them.
> if someone told me they worked for google or some other advertising network, I would silently be judging them for their life choices
I strongly agree with Banksy's view on advertisement:
People are taking the piss out of you everyday. They butt into your life, take a cheap shot at you and then disappear. They leer at you from tall buildings and make you feel small. They make flippant comments from buses that imply you’re not sexy enough and that all the fun is happening somewhere else. They are on TV making your girlfriend feel inadequate. They have access to the most sophisticated technology the world has ever seen and they bully you with it. They are The Advertisers and they are laughing at you.
You, however, are forbidden to touch them. Trademarks, intellectual property rights and copyright law mean advertisers can say what they like wherever they like with total impunity.
Fuck that. Any advert in a public space that gives you no choice whether you see it or not is yours. It’s yours to take, re-arrange and re-use. You can do whatever you like with it. Asking for permission is like asking to keep a rock someone just threw at your head.
You owe the companies nothing. Less than nothing, you especially don’t owe them any courtesy. They owe you. They have re-arranged the world to put themselves in front of you. They never asked for your permission, don’t even start asking for theirs.
– Banksy
While I think there are some forms of advertisement that are acceptable or even desirable, what we see on the internet is not that, mostly due to the tracking and targeting, which almost every advertisement company is doing, but also because adverts are inherently trying to trick, pressure or encourage you to spend money on something even if you don't really want or need it.
> Maybe try some introspection.
I have. I've come to the conclusion that I judge people who work in industries I find morally questionable.
You don't see any hint of irony in that statement from someone who leaves artwork in public spaces without asking the general public for permission first? The same artist who destroys his own art, which he left in public, when someone tries to "own" it?
Claiming that everyone who works at Google works on Ads is like saying everyone who works at Amazon moves boxes in a warehouse.
I’m not making a value judgement on Banksy, just that I agree with that particular statement of his.
Re: the second paragraph, I didn't claim that everyone at Google works on ads; you made that up. You don't have to work on an immoral thing, just for an immoral employer, for me to question your morals.
YouTube made its deals with the various industries pretty early on, IIRC. I'm not sure there was a decision about who did or didn't have to make these deals so much as YouTube's visibility, and its own willingness to negotiate, simply resulted in earlier deals.
you can say this same thing about Airbnb. how many listings are illegal? and yet they don’t proactively purge illegal listings unless some city aggressively goes after them in court.
Likely none of AirBnb's listings were illegal in the early days, because what they were doing was not regulated.
Also, I’d say there’s a big difference between the homeowner choosing to list their home against a city regulation that the homeowner is choosing to violate, versus users uploading content that they don’t own in the first place.
Even a tenant who is AirBnb’ing their unit against their lease terms is at least listing their own unit / choosing to violate an agreement that they themselves have signed.
AirBnb to me is unique in that their regulations came afterward to combat the monumental rise of short term rentals, whereas Section 230 / DMCA regulations are what enabled the other sites to ultimately growth hack on the back of infringement.
I mean, it depends where you were, I suppose. In many places, Airbnbs were illegal from the start (at least for the whole-home rental type; the rent a room type are much fuzzier), as they breached planning rules that only allowed residential use for particular buildings etc.
it’s not a big difference. both platforms are looking the other way when it comes to policing illegal content on their platforms because they make money off it whether it’s legal or not. they are choosing to offload the responsibility to the users as much as possible.
I don't think the majority was stolen content, but a lot of amateur content was on there, where people uploaded their own videos without risk of being identified. It's a hobby to some people. Like the article states, the majority of the content wasn't even porn. But yeah, open upload platforms will end up being abused; e.g., look at how unlimited Amazon cloud storage got abused.
To some extent, education in schools about consent, relationships, and how issues like this affect people could help reduce the number of incidents of revenge porn. It isn't just harmless fun; it has very serious consequences.
Sadly, it'll be impossible to stop it entirely with the unverified upload model.
>Their entire business is built on stolen content uploaded by anonymous users
I don't know, whenever the topic of Kim Dotcom/Megaupload comes up people on hackernews act like he has done nothing wrong and is a victim. Now the community is against a business being built on uploading stolen content by anonymous users?
> I don't know, whenever the topic of Kim Dotcom/Megaupload comes up people on hackernews act like he has done nothing wrong and is a victim
Of course he has done wrong, but the reason people stand up for him is that he did not do it while in the US. It is one thing if the US asks for extradition of a criminal who committed the crime in the US, but an entirely different thing if the crime was committed in another country - and that doesn't even answer the question of whether what he did is actually a crime in that country.
It's already a legal stretch to arrest and punish pedophiles who fly out to Asia to rape children (because enforcing the laws against child rape should be something for the destination countries!), but the Dotcom and Assange sagas should not result in extradition.
Especially since Kim Dotcom isn't an American citizen either. So a third-country citizen did something in another third country that the US didn't like. Quite a thing to ask extradition for, especially from a country that doesn't recognize the court in The Hague.
I am still not sure what to think about Assange. WikiLeaks did great things in the early days, and I was half convinced the affair in Sweden was a set-up. If it was not, Sweden would have been the place to put him on trial - and not to use it as an excuse to arrest him in Britain and have him extradited to the US.
That being said, it seems WikiLeaks was at least complicit in Russian propaganda operations. And we have no way to tell whether that would have happened without the arrest. Complicated story. And still a trial I hate to see: if Assange gets convicted, there is no way to be sure the same doesn't happen to other journalists as well.
it's a reasonable mistake to make, but it's probably not correct.
dang made a good post about this recently. I'm having a hard time finding it now (he posts a lot), so I'll just paraphrase. a lot of (sometimes vile) positions are strongly advocated for by a minority of users, but only weakly opposed by the majority. that vocal minority will upvote any friendly post, while the majority will just shake their heads but not necessarily downvote every single post.
so it's actually quite common for the top comment to be against the aggregate opinion of the userbase.
Edit: Surprised it took almost twenty minutes for the downvotes to start. It also proves my point perfectly. For what it's worth I'm not saying bicycle licenses are good or should be implemented. But it's the most well-understood example of this phenomenon.
This is the argument about why "bicycle licenses" aren't a thing. The number of drivers who aren't cyclists dwarfs the number of cyclists by many orders of magnitude in most US states. In many municipalities, some cyclists regularly break traffic laws and cause (at best) frustration or (at worst) deadly accidents. This leads to drivers-who-aren't-cyclists making the seems-reasonable comment that you should need a license to drive a bike the same way you do for a vehicle.
So let's say a politician supports bicycle licenses. This isn't enough to win the vote of anyone who supports a bike license, because it's not a top-ten issue for anyone on that side. But not only will every cyclist in their district vote against them, they'll campaign and donate to any opponent of that bill. If cyclists even think you support such legislation, they're writing checks to your opponent.
I'm not making an argument on bike licenses in particular but I can see a similar effect where a minority percentage of HN feels strongly enough about a given topic that they upvote the articles based on title alone. When an issue is particularly divisive, the odds increase that people feel strongly on both sides so you end up with the #1 posts on two different days expressing diametrically opposed positions, but both very heavily upvoted.
Well, your example also isn't up to democratic vote. There may be more compelling reasons why bike licenses don't exist: it's not worth the trade-off of liberty, it wouldn't achieve anything, you can already be prosecuted for causing an accident, and the opinion of car owners against cyclists is usually just irrational hate rather than deliberated policy ideals, etc.
It's not that consistent which stances get upvoted. Different stories attract different readers, and comment upvotes depend on the quality of the message as well as the content.
With all this talk about various corporations policing copyright, censoring political content, and whatnot, I think it's only a matter of time until someone makes an anonymous blockchain content aggregator that's impossible to censor.
Then when it becomes the modern equivalent of Usenet/IRC/Napster/PirateBay, people will just shrug and say "don't blame the protocol".
The demand exists and the technology exists, but nobody has put them together yet.
What if it was only a directory of torrents? Not hosting the content itself, but indexing it.
(I've never developed anything with blockchain or torrents, so my apologies if there's a technical reason why this sucks.)
If I understand Pirate Bay and other torrent sites correctly, they don't host any content. They just index what's available P2P.
I'm not sure how large that index would become, but I could imagine chunking the index itself out so that it doesn't take much room on any one user's machine.
The chunking of the index could theoretically be done in such a way that no individual chunk would have any useable information, so no individual user could be accused of linking to illegal content. (I think that's conceivable? Again, not sure.)
(Please note: This is pure hypotheticals. I have zero interest in seeing such a system come about.)
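To make the "no individual chunk has any usable information" idea concrete, here is a toy sketch using XOR secret sharing. This is purely my own illustration of one scheme that has the property the commenter describes, not how Pirate Bay, BitTorrent, or any real network actually distributes its index: each holder stores one chunk that is indistinguishable from random bytes, and only combining every chunk reconstructs the index.

```python
import secrets

def split_index(data: bytes, n: int) -> list[bytes]:
    """XOR-split `data` into n chunks. Any n-1 chunks together are
    still statistically indistinguishable from random noise."""
    assert n >= 2
    # n-1 chunks of pure random bytes, same length as the data
    chunks = [secrets.token_bytes(len(data)) for _ in range(n - 1)]
    # final chunk = data XOR all the random chunks
    last = bytes(data)
    for c in chunks:
        last = bytes(a ^ b for a, b in zip(last, c))
    chunks.append(last)
    return chunks

def join_index(chunks: list[bytes]) -> bytes:
    """XOR all chunks back together to recover the original data."""
    out = chunks[0]
    for c in chunks[1:]:
        out = bytes(a ^ b for a, b in zip(out, c))
    return out

# hypothetical index entry, just for illustration
entry = b"magnet:?xt=urn:btih:0123456789abcdef"
chunks = split_index(entry, 5)
assert join_index(chunks) == entry
```

The obvious catch, which is why this stays a toy: every chunk holder must be online to reconstruct anything, and whoever coordinates the split and the lookup is exactly the kind of central party that can be targeted.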
Indexing has always been the easy part. It’s the hosting that’s the real problem.
With hosting, you can either have it centralised so users are anonymised from each other (but it’s easier to take down centralised repositories) or decentralised but where users can see each other (but then it’s easier for copyright holders to go after individuals). You can’t really have it both decentralised and anonymous.
There's no such thing as 'impossible to censor'. It may be 'hard to censor', perhaps even 'impractical to censor', but a powerful enough government will be able to censor anything it wants. Do you think that, if people started somehow publishing 'bad' stories about the CCP in the Bitcoin blockchain, the CCP wouldn't act to remove that content or censor any access to the blockchain itself?
The CCP might be able to ban access to the blockchain, or some part of it, but they wouldn't be able to replace it with their own version that was accepted outside China.
A blockchain doesn't make it impossible to censor.
To be honest, I wonder why western governments don't ban cryptocurrencies.
They make tax collection harder and help criminals collect money.
Plus, that would be a blow for countries that are investing tons of money in it, e.g. China, which is destroying the rest of the world, economy-wise.
There is the possibility that western governments are already under Chinese influence, I guess, but I definitely wasn't expecting BTC to survive this long.
Especially because governments are not even benefiting from cryptos.
Torrent aggregator sites/discords have small groups of people behind them to arrest or sue. A blockchain would have a large impractical group of people to arrest or sue, it would be like when the RIAA tried to sue Kazaa users.
It’s unclear how this is true. You would still need tools to search the blockchain for the content you’re looking for. Someone would also need to curate and coordinate the format for embedding this information. Anyone involved with either of those steps could be targeted. Those are just off the top of my head. There are likely more holes than that.
It’s not a tech issue. You can target anyone that’s part of a disaggregated network and broadcasts it. That’s true for torrents, TOR nodes, or blockchains.
What happens when someone uploads their illegal porn to the blockchain aggregator and it can't be taken down? It seems like there's a strong need to censor things like video and photo sharing.
What you describe sounds a lot like YouTube. Twelve years ago, some friends of mine did their entire home music playback with a playlist of stolen music uploads. I'd even go as far as to claim that YT Music wouldn't exist if people hadn't been trained to expect that YouTube has all the music content they are looking for.
And then there are channels like "Netzkino" who have around a million subscribers and a few thousand full movies online. The claim is all that content is supposedly legal. Well, maybe. But it doesn't look like it...
This is interesting considering that issues with CSEM plague social media websites far worse than porn websites. Credit card companies in the United States are super eager to drop porn and sex-related vendors. The stated reasoning is that this kind of business leads to a lot of chargebacks/disputes/fraud, but I suspect this is not the full story, and this news doesn't do much to quell that concern.
I’m not saying there is no problem whatsoever and nothing should be done. However, Pornhub has come under increasing scrutiny despite an underwhelming amount of evidence that it has a serious endemic problem. Such illegal material could easily end up on any website if it is mistaken to be legal, which unfortunately does happen; it’s not very easy to tell whether someone is 18, and you are likely underestimating how much the human body can vary from person to person if you think it is. But, while it’s not a good thing, the bigger picture is that Pornhub is largely not being used to intentionally distribute illegal material, whereas social media and chat programs actually are - sometimes surprisingly close to out in the open.
Something feels disconcerting here. It’s not that I care personally about this particular action; in truth I have almost only ever seen meme videos on Pornhub. But it feels like it could be a precursor to another push to censor pornography and the internet. Somehow, when eroticism is involved, people either cheer on or simply are ambivalent about censorship. I think this is irrational. The edges of free speech are the only parts that really matter that much. This action here doesn’t really hinder anyone’s expression directly, but it certainly seems like it will have chilling effects that justify future actions that will.
Of course I hope this is mostly nothing and it ends up being purely positive. However, I at least implore some thought here as this didn’t just occur in a vacuum.
>Something feels disconcerting here. It’s not that I care personally about this particular action; in truth I have almost only ever seen meme videos on Pornhub. But it feels like it could be a precursor to another push to censor pornography and the internet.
I would go even farther to worry that this might begin to push the Overton window towards demonizing any content posted online that isn't "verified" by a real identity. MindGeek (the company behind PornHub) was also backing the UK's law requiring an ID to watch porn online. If users get comfortable with the idea of having even their porn tied to a real ID, wouldn't they become comfortable with less "private" content also tied to their legal identity?
I don't want to read this headline from the future: "Pornhub requires an ID, should Reddit, Twitter, and 4chan do the same?"
And I also don't want to read the future headline: "Pornhub's ID database hacked, thousands of models names and addresses exposed"
Anonymity is important (imo) and it would be a shame to see people use porn as the excuse to curtail it. Surely it's possible to remove illegal content without stripping away privacy and anonymity.
> Pornhub's announcement also cites a report by third-party Internet Watch Foundation, which found 118 instances of child sexual abuse material on Pornhub in the last three years, and notes that in the same period, Facebook's own transparency report found 84 million instances of child sexual abuse material on the social media platform.
There was also such an issue with it cropping up on Tumblr and not being removed in short enough order that Tumblr was briefly removed from the iOS app store.[1]
Yeah, I read that, but I don't think that single statistic necessarily tells us a whole lot; the extreme disparity also makes me think there might be something unexpected going on underneath the tabulation of those numbers. Intuitively, I'd expect user-upload porn sites to have a lot more illegal content, since it's much easier for it to blend in there than most other places on the internet. The problem is greatly magnified at scale, since distinguishing legal pornographic imagery from illegal pornographic imagery is not a trivial problem.
Well my intuition says the opposite and that those numbers seem to track with my expectations.
A large bulk of users likely never sign up for PH, making it harder to vet individuals to make sure they won't report the content. People show up with pretty much one thing in mind when going to PH, and it's not really to interact with communities. Facebook has so much reach and such a large userbase that it's harder to moderate, and it's used for exactly what these groups want to do.
If we were only talking about close-to-18-year-old CSAM, sure, I could believe that might be more common on PH. But when looking at the entirety of CSAM, I just find it hard to believe that people would risk uploading a video that clearly has a child in it to a service that actively moderates video content, primarily serving a community that, en masse, doesn't like the content.
There could be some tabulation issues, but I find it hard to believe it's so poorly counted that it would make up a meaningful difference between the two numbers provided.
- Most popular social media websites allow adult content in some forms.
- It is no easier to tell illicit content from legal content on one site versus another.
- There is likely not just far more content, but far more adult content on Twitter and Facebook than dedicated porn websites.
- Sites like Twitter and Facebook better facilitate private conversation (presumably).
I don’t think you can make money on Pornhub without verification, so there’s probably not a money incentive to intentionally post illicit content. In fact I’m not entirely sure why you would at all, but my best guess is in hopes of trading with others. In that case, social media is actually perfect: draw people in with carefully constructed fronts, do dealings in private groups and try to fly as much under the radar as possible.
It’s hard to say and probably silly to speculate but I just don’t see the point of using a pornography website for this.
Most popular social media sites do not allow adult content, nudity or otherwise: Instagram, TikTok, Facebook, the app stores. Others like YouTube may allow some suggestive content if you're logged in, but many would not classify that as pornographic, more content geared towards adults. Whatever pornographic material can be found is in the process of being removed.
I was talking about Facebook and Twitter. Admittedly, I am used to the world where these services basically define social media. I think I am a bit too old to be in the demographic TikTok seems to be appealing to.
Still, Facebook and Twitter just have way more content and users than porn websites, and it’s going to be easier to fly under the radar. Twitter accounts with fewer than 500 followers are basically invisible.
Of course, banning pornography really does roughly solve this problem, but it’s a pretty nuclear solution... (I would prefer a world where we’re less weird about erotic content, not more. I get why it’s complicated, but I still think we can do better.)
But what sense does it make to blend anything in? If you are a pedophile, you want pedophilic content, not blended-in, difficult-to-tell-apart, perhaps-illegal content. And if the uploader is a pedophile, why would he upload to such a public and traceable place instead of using some private group or some kind of anonymizing software like Tor? It doesn't make any sense. Pedophiles are not out there begging to be arrested.
The numbers that are reported are reported by the company. When Pornhub says "118", they're telling you they really didn't try particularly hard. When FB says 84 million, that tells you there's a whole team writing algorithms to detect and report. It's nearly impossible to determine the actual numbers.
Those numbers actually prove that PornHub is doing next to nothing to police its content. And the average video posted to PornHub is probably far, far more likely to contain CP/Revenge Porn than the average social media post.
It's surprising to me (perhaps shouldn't be) how many people here are discussing this issue in the abstract using terms like "broadly harmless." To these commenters: are you unaware of the issue of child porn, revenge porn, stolen content* and videos of actual rapes being uploaded to Pornhub, and Pornhub making money off of this content?
If you are aware of this (the core issue), you should be grappling with these issues rather than ignoring them. If you're not aware of these issues (the core issue here & the motivation for the actions of the CC companies), please read[1] & build that complicated issue into your view of the topic before commenting. These "what's the harm?" comments are really stupid, to put it bluntly.
* and I don't care about DMCA & intellectual property, I'm talking about someone having their phone stolen & private pictures taken, or sending pics to a lover in confidence and having him humiliate her by sharing it online & telling friends and she can't get the pictures taken down. This isn't about IP it's about peoples' lives.
1) How wide do you cast the net for accountability?
Bad content is a subset of unverified content, unverified content is a subset of Pornhub, Pornhub is a subset of an ISP's traffic and a subset of CC companies' revenue. How much good do you want to ban with the bad?
2) Why not actual law & order?
I would guess a lot of fans of unverified content also dislike bad actors, and would like to see them investigated/prosecuted, rather than 'deplatformed'. 200 OKs for the good actors and gaoltime for the bad actors, rather than 401s for everyone.
Pornhub and the company formerly named Manwin grew their business mostly on pirated content, only buying large porn studios later with the gains. It surely helped that the industry wasn't too litigious, and judges were surely reluctant to support cases where the whole industry may be perceived as morally objectionable and indecent by parts of society. Der Spiegel has an eight-year-old article on Pornhub and its German founder; it's lengthy but worth a read:
To these commenters: are you unaware of the issue of child porn, revenge porn, stolen content* and videos of actual rapes being uploaded to Pornhub, and Pornhub making money off of this content?
If you are aware of this (the core issue), you should be grappling with these issues rather than ignoring them.
Sure I’ll answer your questions, but you go first & answer mine :)
Agree that jail is better, if that’s an option I’ll take it. But if the options are “let a company make money off rape/cp/etc” or don’t let them do that, I’ll take the latter.
I don't think you're wrong - I think you're misunderstanding the scale of what's about to come.
We can already generate realistic video/audio content based on a very small sample.
We're going to get custom porn in the vein of celebrity fakes, but for EVERYONE. It was limited because it used to take time/skill to make. That's changing incredibly quickly.
I'm fairly conflicted about how I feel about it, too.
I'd wager a non-trivial chunk of my income you'll be able to upload a photograph, pick a category, and have custom generated porn some time in the next 10-15 years.
How does this have anything to do with REAL rape and REAL stolen video and images being uploaded NOW? Just because you can theoretically, in the future, make a fake video of someone having sex doesn't mean it's in any way justifiable to allow actual video of real violent crimes on your platform now. And even when this future does happen, do people no longer have a right to their own image and likeness? Is it okay simply because it's fake?
Is it okay to imagine? To think whatever you feel like thinking at the moment? Rhetorical. The future that GP describes simply helps with it. Not questioning real harm to somebody, of course (as in physical/personal harm).
People never had a right to their image, because it is neither unique nor really theirs, with all the enhancements that go into what we call an "image". Nobody can dictate what you think. One little step from here and we'll quite seriously be discussing thought crimes.
There is a world of difference between what you imagine and creating a digital likeness of someone else, which will almost certainly then be distributed. Comparing the two, and comparing banning them to thought crimes, is incredibly dishonest. And yes, people do have a right to their image and likeness; those rights are protected by law.
What's your point? What does this have to do with PH hosting (and dragging their feet about taking down) and making money off of real rape/revenge/child porn today? And out of curiosity, what makes you think I am "misunderstanding" this? You reckon I wouldn't be mad about PH making $ off rape videos if only I knew that in 15 years we could make deepfake porn for cheap?
"In the future, everyone will be world-famous for 15 minutes" great but what's that got to do with the price of tea in China?
I mean that all the things you're upset about are going to happen at scale.
Upset about revenge porn? What if it's just a really close copy of your face/body generated by machine, stitched together in a very believable fashion? You could maybe claim ownership of your likeness, but I don't think it's going to be possible to really enforce - Many people look near identical already. Or they add a mole here or there and claim - look - clearly not you!
And it won't be you, but does it really matter? How do you prove it's not you to other people without stripping down?
Upset about rape videos? I am too when it's a real person. I find it much harder to argue against when it's a performance (current porn) or when it's not any person at all. Who has standing in court? No one was harmed. Are you upset that someone was raped, or are you upset that someone is making money off of rape videos? Because there will be generated videos of this content, and it will absolutely be customizable to look like people want it to. Does it matter if the video is real? And removing this content won't stop rapes at all. So is it somehow worse if a real rape victim has fake rape porn made of them? If so, why?
---
Basically - My point is that you're making an appeal based on a moral system that I don't think is going to survive the coming wave of generated imagery.
I too have serious issues with all of the real crimes you dislike, but I don't think you can genuinely equate video content with the crime itself - They happen to be tightly coupled now, but that's going to change.
This is NOT an open and shut victory here. I'm not really sure there's a victory to be had.
Maybe the best we'll get is a world where even when real revenge porn is uploaded, no one will believe it's real...
> Upset about revenge porn? What if it's just a really close copy of your face/body generated by machine, stitched together in a very believable fashion? You could maybe claim ownership of your likeness, but I don't think it's going to be possible to really enforce - Many people look near identical already. Or they add a mole here or there and claim - look - clearly not you!
People make all kinds of claims all the time, including transparently self-serving ones like your example, but that doesn't mean anyone buys them.
I'm pretty sure what's going to happen is the law will make it clear that porn generated from your likeness and distributed without your consent is as illegal as revenge porn. Then a bunch of people who did it anyway will go to jail and that will put a strong chilling effect on the whole thing. In most cases, the people with the motive and ability to make machine-generated porn of a particular person will be relatively easy to find.
Because we've seen this so clearly with fake celebrity porn?
And those are folks with the actual resources to go after it in courts.
> In most cases, the people with the motive and ability to make machine-generated porn of a particular person will be relatively easy to find.
I just find this incredibly difficult to believe, unless this "ban on user content" extends to a ban on general computing. And maybe that's where we'll end up, but I think that's a much more malicious result than generated porn. Or maybe anonymity will disappear online, which I find fairly scary, but is another clear thrust we're seeing.
Also - What about the absolutely bonkers industry that will be consent based generated porn? We're already headed that route now with 3d scanning of actors/actresses, and lord knows - I think a lot of folks in the porn industry would be happy to get digitally imaged and then not have to actually engage in the physical process of making porn.
---
I think this is a real and genuine ethics question that we don't have good answers to right now.
I don't really have a problem with PH pulling down anonymous user content (hell - private company, they can make their own decisions) but I also don't really think it's going to stop this kind of behavior, and it certainly doesn't untangle complex social questions around what's right and wrong in this space.
I will say - I think we're undergoing another sexual revolution right now, it's just more subtle because it's less about what folks do or wear in public, and more about what folks do in private.
I'm not entirely sure how you legislate on this topic either - What's art? What's porn? Can you really "know it when you see it"? What if you unintentionally make imagery that just resembles a real person?
Those are all questions that depend on social context at the time they're legislated on, and they set a tone going forward.
Basically - My whole point is that this is not some clear win, and again - I'm not entirely sure there's a win anywhere in this discussion. I think this is just PH skirting around the issue for monetary reasons because they happen to be big enough at this point that they can do without anonymous submissions.
I will say - I'm fairly sure any "it's simple and this will solve it!" response is utter bullshit.
>> I'm pretty sure what's going to happen is the law will make it clear that porn generated from your likeness and distributed without your consent is as illegal as revenge porn. Then a bunch of people who did it anyway will go to jail and that will put a strong chilling effect on the whole thing. In most cases, the people with the motive and ability to make machine-generated porn of a particular person will be relatively easy to find.
> I just find this incredibly difficult to believe, unless this "ban on user content" extends to a ban on general computing. And maybe that's where we'll end up, but I think that's a much more malicious result than generated porn. Or maybe anonymity will disappear online, which I find fairly scary, but is another clear thrust we're seeing.
You're not thinking about this clearly, and are veering off into overreaction and catastrophizing.
Machine-generated porn of celebrities is far harder to police, because they're celebrities. There are thousands, maybe millions, of people who have the desire and technical means to make porn of them.
For regular people, the story is different. For the most part, no one's going to want to make porn of some random person they don't know. The only people with an interest in making or consuming machine-generated porn of a particular normal person are people who have some connection to them, which is a far smaller group. That makes enforcement possible with regular investigative tools. Enforcement doesn't even need to be 100% to be effective at creating a chilling effect, just make examples of some bitter and angry exes and creepy stalkers.
Celebrities are also a pretty rarefied group. Once this problem starts affecting everyday people, there will be more electoral pressure for dealing with it legislatively, just like with revenge porn. At least some cases of "leaked celebrity sex tapes" would probably count as illegal revenge porn today.
Reminds me of the movie The Congress, loosely based on The Futurological Congress (it goes its own direction). In it, actors sign their image rights over to the movie studios; the studios no longer need the actors, only the rights to put them in movies. They can create them in anything they want, always young, always available. This gets extrapolated out further as the film progresses.
How common is child porn on there? There have certainly been some horrifying cases. But how common is it? How fast is it being taken down? And how long ago are the incidents being referenced? PornHub has done a lot in the past year to crack down on problematic content, even before the purge.
He focuses a lot on "things promoted towards pedophiles". But he admits, on many occasions, that the vast majority of the content (even under the problematic keywords) isn't problematic, and therefore might just be simulated.
He hasn't said how common the child porn is. He assumes there is a lot of it, although all indicators show it is low (there may be a lot more which hasn't been detected, but he seems to have dug really deep into the site).
This isn't to excuse someone posting child porn or running a site which is likely to get such content with insufficient moderation. However, how big is the problem on there? Is there anyone who has quantified it? I am very curious to know.
* I am not familiar with PornHub or those keywords. I am just going by what other people have said about this issue.
Given PornHub's actions, it seems not even they could give a confident answer to that question. PornHub may have stepped up its efforts, but it has been mostly a reactive response to mounting media pressure, not a proactive one.
Even the core business of PornHub isn't exactly harmless. I don't mean in a religious/moralistic way but for the same reason eating at McDonald's every day is probably not going to be great for you.
Exactly my point. Submit take-down requests and, if the site owner does not comply, enforce penalties. But that is not the case we are discussing here. Here the site was sentenced to death regardless of its compliance.
Revenge porn, rapes, and other bad stuff are being uploaded to Facebook and Google too, and they make money out of it.
Pornhub's official position, like Facebook and Google, is to have AI algos and teams of moderators to manually remove such content.
(According to the article, Pornhub had only 118 incidents of child porn over the last 3 years, whereas Facebook had 84 million incidents during the same time.)
I'd say google/youtube is much better at detecting known content actually, though they have a notoriously high false positive rate. Pornhub has a notoriously high false negative rate.
In my mind the question is whether all of the relevant sites are making reasonable efforts in getting rid of illegal material. "Reasonable" here is probably best determined by courts, and I would dismiss any comparisons between sites abilities. If Google, a world leader in AI technology, has a better filter that doesn't mean anyone else's efforts are unacceptable.
The question I have is whether such systems ever truly existed at pornhub. They're obviously shady as hell and I wouldn't put it past them to straight out lie. Google's systems are known to exist because they have a high false positive rate. How do we know pornhub gave a single shit about any of this before the NYTimes oped?
I agree, that is a very good question. I would also suggest that we should not take for granted the word of a media organization who wishes to get clicks or even a prosecutor who wants to villainize their target for political points.
You shouldn't arbitrarily target only some companies. My point is that Facebook/Google are the worst when it comes to revenge porn, as it's easier to destroy someone's life there.
But for some reason, instead of focusing on Facebook, they focus on Pornhub because it is an easy target and an easy way to say we've done something.
> The Internet Watch Foundation couldn’t explain why its figure for Pornhub is so low. Perhaps it’s because people on Pornhub are inured to the material and unlikely to report it. But if you know what to look for, it’s possible to find hundreds of apparent child sexual abuse videos on Pornhub in 30 minutes. Pornhub has recently offered playlists with names including “less than 18,” “the best collection of young boys” and “underage.”
Is there any information that IWF has done more sophisticated audit than relying on user reports? Have they used advanced "fuzzy" searching algorithms that Facebook uses to identify known videos that have been modified in some way to get past filters?
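For context on what "fuzzy" matching means here, systems like Facebook's are widely reported to use perceptual hashing: hash a downscaled frame into a compact bit string, then compare hashes by Hamming distance so re-encoded or lightly edited copies of known material still match. The toy sketch below is an illustrative assumption only (the 8x8 grid and the threshold of 10 are made up for demonstration), not any vendor's actual algorithm:

```python
# Toy average-hash (aHash): a minimal stand-in for perceptual image
# hashing. Real systems (e.g. PhotoDNA, PDQ) are far more robust.

def average_hash(pixels):
    """pixels: 8x8 grid of grayscale values (0-255) -> 64-bit int."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        # Each pixel above the mean contributes a 1 bit.
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_known(candidate_hash, known_hashes, threshold=10):
    # A small Hamming distance suggests the same image even after
    # re-encoding, resizing, or minor edits (threshold is a guess).
    return any(hamming(candidate_hash, k) <= threshold for k in known_hashes)
```

In a video setting, a frame-sampling step would run such a hash over keyframes and escalate matches against a known-abuse database to human review; relying on user reports alone, as the IWF figure might, would miss all of that.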
Then maybe Facebook and Google should be making more of an effort to prevent this content from making it onto their platforms. How is it happening more often elsewhere a justification for it happening here?
The amazing thing to me is how the far left and far right seem to converge on the same censorship solutions, despite having completely opposite ideologies.
I was laughing at the UK when it tried to ban pornography sites on the grounds of "morality". But it was the NYT who achieved it first.
The result is the same: I don't like some of your ideas, so I will shut you completely, just in case.
For the particular case of PH, we should fine it if it did not comply with takedown requests. But if it did, what is the accusation exactly? That it missed 200 videos out of 18M?
What is next? Shut down the ISPs that profit from this illegal content in their networks? Hard disk companies that store the content? Where does this end?
Horseshoe-theory is indeed a thing. As for where it stops; we've already seen Mastercard/Visa force platforms like Patreon to blacklist users, Cloudflare pulling routing to sites, registrars willing to essentially dox owners of 'bad sites', social media blacklisting links to stories hosted on gov websites, Chase-bank closing the accounts of people associated with controversial political groups..
Kind of makes you miss the good old days when companies were just mindlessly greedy rather than pushing morality/ideologies.
Since you know the phrase "horseshoe theory" I'm a bit surprised you don't know that no one actually on the "far left" in USA proposes censorship. Instead it is a particular flavor of centrist identitarians who couldn't care less about freedom or class consciousness. They're just striving for a bit of attention like everyone else. If you accused them of leftism they would probably take offense.
I'm not saying the far left don't have it in them, because obviously they did USSR etc. But that was in that context, in which by some miracle they controlled the murderous tools of a nation-state. In USA they know very well that they're in line for censorship (or whatever, really) immediately behind the pornographers. No one who appears on cable news is part of the "far left", no matter how loudly they shriek.
I have seen seemingly infinite demands from liberal Democrats that big tech shut down covid disinformation and election disinformation. Surely you are aware of this. Is it your position that this isn’t censorship?
Please read more carefully. You have seen that infinitude of demands on American cable and network TV news. I specifically identify that medium as antithetical to the far left. American conservatives seem not to be able to differentiate e.g. Adam Schiff from Michael Parenti. No human capable of respiration is actually that stupid, so this tragic misunderstanding must be due to the ignorance that is carefully fostered by American news media firms.
The people who advocate censorship vote for all the same wars and inhuman abuses that you vote for. They're closer to your position than to mine. If they have anything to their credit, it is that they are less explicitly racist.
The policies of the True Scottish leftists you hypothesize make a great deal of sense to me. I'm not sure they can wriggle out from under the twentieth century. Not all of the excesses of USSR and communist China can be blamed on Western imperialism.
How is the far left _or_ the far right relevant in this? And how does the UK government _or_ NYT belong to either? Mind you that this is MasterCard and Visa who have forced this move on PH. Now, of course it's still a political issue, but internet censorship transcends the political spectrum
This is undoubtedly a positive move, though I do not believe it will be enough to satisfy the likes of the credit card companies.
They are requiring ID verification (i.e. that you're 18 or over) to upload content. As much as I am privacy-minded, the idea that someone could upload an explicit video to Pornhub without verifying that they were 18 is mind-boggling.
This is also a net positive for content producers on Pornhub, who often see their content pirated on the very same platform.
There are other venues for this: cam sites, onlyfans, deviantart, fetlife, etc... which by and large put the subjects in the driver's seat.
But a lot of those "casual exhibitionists" are unverified. Many of the photos may have been taken with consent, but are distributed without. Sometimes that's revenge porn, sometimes it's hacked/jacked content, etc. Distribution of questionably sourced content can be quite damaging to the subjects.
If the "casual exhibitionist" is harmed here, it's dwarfed by the harm already being done to subjects without consent. And there are still plenty of suitable venues out there. Including verification with pornhub.
Pornhub actively, and knowingly or negligently, facilitates it. They're accessories to the act. It doesn't matter that it could happen anywhere. It's been happening on pornhub and until now they've been reactive (responding to takedown requests) instead of proactive (requiring verification of all content).
Newspapers, magazines and broadcast news have been actively, negligently and often knowingly defaming people and organizations for decades (centuries?), sometimes with very serious consequences for the offended parties. I've not heard anyone calling for Visa/MC to cut them off.
If there is a legal case to be made against Pornhub, then the authorities should make it. Otherwise they would do better to go after those who made (or uploaded) the offending material in the first place.
Is it still a thing? What malware can a pornsite contain that cannot be uploaded to or linked to from a regular smaller niche or bigger broader site? Do you download and run hotgirls.exe dialers or what?
And spinning up a new video host (that you intend to make money somehow) isn't cheap or easy.
Nor is it terribly safe right now legally speaking. Considering the child porn castoffs from pornhub will be out looking for a new place to upload their trash. Eventually, not only child pornographers, but so many other undesirables are on your site, that it's just dangerous to continue running it from a liability perspective.
> Countless lives have been destroyed because of revenge porn uploaded on these websites.
I suspect you're getting a lot of downvotes for this.
I don't know how one's life could get destroyed just by uploading a sex tape to a porn website. It's more the sending of it to friends and acquaintances that causes the hurt (and for the vast, vast majority this would hardly "destroy their lives"; very few people are that fragile), which can be accomplished easily enough without the porn website.
> I don't know how one's life could get destroyed just by uploading a sex tape to a porn website
You people have zero empathy provided you get your porn.
You don't think that a victim of revenge porn feels like she is being raped every time a stranger jerks off to her intimate image/video uploaded without her consent, and that it takes a toll on her psyche, thus destroying her life? Then shame on you.
Absolutely this. The benefits of "casual exhibitionists" absolutely do not outweigh the risks and costs of having unverified and possibly illegal porn on your site.
Maybe I am misunderstanding something, but I feel this could be a major privacy negative if this means all platforms require identification verification (i.e all sites with an account require a cell phone number or address etc.). Currently, I can post to reddit or hackernews without my profile saying who I am or who I work for.
Of course anonymity provides an environment prone to misinformation and exploitation, at the same time, so having verification on a case by case (site by site) basis would be my vote.
>As much as I am privacy-minded, the idea that someone could upload an explicit video to Pornhub without verifying that they were 18 is mind-boggling.
You make it sound like the statement requires no evidence. In reality, it makes no difference whether the uploader is 18 or not; what matters is that the _recording_ was made consensually by people 18 or older.
It is not a net positive for content producers, because viewers of legal but unverified porn (i.e. the vast majority of the content) will go to other sites.
I suspect xvideos viewership will go up, and they probably won't ally with the credit card agencies.
A service that would allow users to upload their KYC info, and then provide them with unique tokens per site that can be used on 3rd party sites as nonces would be a decent way around having to verify on each site.
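One hedged sketch of what such a verify-once scheme could look like: a hypothetical verification service holds the KYC result server-side and derives a distinct per-site token with an HMAC, so each site sees a different opaque identifier and sites cannot correlate the same user across services. All the names and the key handling below are assumptions for illustration, not a real API:

```python
import hashlib
import hmac
import secrets

class AgeVerificationService:
    """Hypothetical verify-once service: the user proves age once,
    then receives a distinct, unlinkable token for each site."""

    def __init__(self):
        self._secret = secrets.token_bytes(32)   # service-side HMAC key
        self._verified = {}                      # user_id -> per-user salt

    def register_verified_user(self, user_id):
        # Called only after real-world KYC succeeds (out of scope here).
        self._verified[user_id] = secrets.token_bytes(16)

    def token_for_site(self, user_id, site):
        if user_id not in self._verified:
            raise PermissionError("user not age-verified")
        # Keying on per-user salt + site name yields a different token
        # per site, which limits cross-site tracking.
        msg = self._verified[user_id] + site.encode()
        return hmac.new(self._secret, msg, hashlib.sha256).hexdigest()

    def validate(self, token, user_id, site):
        # A relying site asks the service to confirm the token.
        expected = self.token_for_site(user_id, site)
        return hmac.compare_digest(token, expected)
```

Under this sketch, "example-tube.com" and "example-cams.com" would receive different tokens for the same person, while each could still confirm with the service that verification actually happened.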
Or just let the whole middleman as a service industry keep rotting, and direct your technical energy towards rebuilding services on distributed technologies that won't succumb to the whims of the status quo power structure. What made the Internet great was being an unregulated frontier, and we can see that everywhere the legacy system jams its tentacles ends up turning to commercialized milquetoast shit.
Power is important, but without economics, it's just an academic research project/network.
Yeah, that made the Internet great in the beginning, and there are projects that don't depend on being "monetized", such as the OG hardcore F/OSS ones (GNU, Debian, KDE/Gnome, CCC, Wikipedia) and their ideological offspring (peertube, mastodon, matrix.org, blender).
But the average user doesn't care and doesn't understand the importance of having a free as in "as independent from the powers that be as possible" platforms.
The "average user" of 20 years ago was watching TV news, TV entertainment, and choosing their goods from what their local stores decided to carry. Similar arguments were made about how they wouldn't want to do a bunch of technical stuff, yet today digital technology is just part of everyday life. If the middleman as a service industry could have held itself to ~2010 levels of intrusiveness, then I'd say perhaps there might not be enough of an advantage to pull people into newer technologies. Alas, every bit of top-down meddling seems to beget even more top-down meddling, as the powermongers discover that the nouveau middlemen can be pressured just as easily as the old set.
> ... yet today digital technology is just part of everyday life.
In the dumbed down form. TV 2.0 (Netflix), grocery 2.0 (Amazon), news 2.0 (whatever YouTube/FB/twitter recommends).
The same people who did not care about stepping outside their comfort zone to get a bit more out of those aspects of life still don't do. (And sure, on one hand it's completely understandable, many people don't have the need, ability, resources and time (the good old privilege) to even do so; and ... there's no other hand, even if we know long term being closed in a walled garden will lead to problems, it's hard to make people understand it when they are inside.)
If anyone should be holding them to account, it should be the courts and / or society. Credit card companies shouldn't be able to play judge, jury and executioner. If our laws are insufficient to deal with the problem of major tube sites, we can enact new ones.
This might not be so bad for mindgeek. They just cut their storage costs in half, probably mostly cutting out the long tail end of videos that they don't really need. They just made a strong argument for increased regulation that will hurt any newcomers and hurt social media competitors more than them. They just got a whole lot of free press.
I'm not so sure about cutting the storage costs at this point. They likely soft deleted the content in order to preserve it for potential law enforcement needs or whatever else. Moving forward they will absolutely see a decrease in storage needs since uploads are so much more restricted
It's not even soft-deleted, it's just a "this content has been flagged for verification" UI message. The content is still available for download and/or streaming with other frontend clients, such as youtube-dl.
Hot-store is more expensive than cold-store. For law enforcement purposes, they can have it shipped out to magnetic tape and dropped in a dry cave somewhere.
Yeah, my first thought was "What will become the next leading porn site on the Internet" when I found out about this. It's like direct download sites. If a direct download sites cracks down on illegal content, people start uploading files somewhere else and suddenly all of the visitors, some of whom would have paid for premium membership, are gone.
The real reason this is good for mindgeek is that they own a bunch of the porn production companies too. Making pirating and amateur uploading harder will lead to more people paying both pornhub and the subsidiary porn companies.
Absolutely not, this is an existential crisis for Pornhub. They need the credit card companies now since they're a huge operation that needs to diversify revenue, but a leaner upstart could just survive off of ad network and affiliate payments.
It would be interesting to know how many of these vehement anti-child-pornography activists are secretly pedophiles themselves, just as was shown to be true for some homophobes.
Posing as an "anti-pedophile" would be a good way for a pedophile to deflect attention away from themselves, or as a way for them to project the negative feelings they receive outwards.
There have been cases where child rights advocates have been arrested for multiple counts of child rape. I don't think every activist is one, or even most. They probably genuinely feel for the victims, and their suffering.
I think there is much more hatred for "child" pornography because it is completely natural for humans to be attracted to post-pubescent humans, because biology. But they are also classified as "children" arbitrarily and this "wisdom" is widely accepted and hence the natural attraction shadowed. It is then projected on to celebrities or criminals when they are caught.
When you talk to Americans about guns or fast food or healthcare or 101 other issues, Freedom is a word you hear a lot. But for porn and alcohol, despite both being broadly harmless, Americans are all about bans and prohibitions...
America is an incredibly diverse country, painting all of us with broad brushes is almost always doomed to be incorrect.
For what it's worth, I for one absolutely support the inalienable right to keep and bear arms while also supporting the right to create and watch pornography or drink alcohol, even though I understand none of these might be considered "harmless".
The root cause is usually a thing that creates these circumstances and bans things for coping/breaking up with it. I doubt that we can eliminate it, because it helped with species survival for too long to be forgotten in a moment. At least not in our current form.
That all depends what you compare it to. 40% of the US population has obesity. I doubt 40% have a drinking problem. Yet any discussion of banning junk food is verboten, and no one seems to care that the US drinking age is way higher than most similar countries...
I agree that child porn is not harmless. It's likely the most harmful segment of the pornography industry. But it's also disingenuous to characterize all pornography as child pornography.
I'm surprised at how much impact the NYT's exposé (I would personally consider it more of a hit piece) has had. I naively assumed that PornHub, as a top-10 site on the web, would be immune to such a piece, but I guess I was wrong. How many of the members of the NYT and the Visa and Mastercard boards who brought about this outcome watched the videos that are now down? Probably a majority of them, but that wasn't enough to stop the moral panic, I guess.
As a side note, I think Reddit should now tread very carefully. PornHub had a lot more verification in place for people uploading porn, and even they've now been neutered by a hit piece. Hell, Reddit is somehow still in the iOS App Store despite having copious amounts of porn (and yes, underage porn too even if Reddit will deny it). They're just an opinion columnist's penstroke away from the mob coming to roost.
These seem like very different critiques. The Nick Kristof article is based on the work of Laila Mickelwait, who has for the last couple of years worked tirelessly on her campaign to shut down Pornhub. It was she who surfaced all the victims in the article.
Though I doubt she will succeed now, she is very clear that the end goal is not to monitor Pornhub but to criminally charge its owners. This is because a criminal case against them, similar to Backpage, would essentially make all tube-porn sites legally unviable.
I really quite loathe this line of argument that the source or even the intentions behind a probably factual statement should be taken into account when evaluating the truthfulness of the statement. It's textbook ad hominem, literally argument to the person.
PH has a demonstrable problem with child porn on its service. I don't care if it's the US government, a different government, a religious fundamentalist, a competitor, a lobbying group, or some random anonymous internet user surfacing that problem so long as it's verifiably true.
Intention factors into evaluation of the speaker, not the content. To do otherwise is to discard important, actionable facts because you dislike the speaker. In other words, it discards logic.
Comparing incidents and takedown policies across social media sites of this type is not meaningless.
If someone emailed you illegal material, should your mailbox, you, or all email be taken away from society? The reality is that user-generated content breaking a site's ToS is all too common.
No, instead we have simply proven the horseshoe theory of politics.
Liberty and freedom are under assault from both the extreme right and the extreme left, who, it seems, end up supporting similar policies for very different reasons.
Kristof's end goal (at least as he states it, and based on the reporting) is not to shut down Pornhub, and he tried to balance the piece by talking to porn stars.
I think you are attacking the person instead of the issue. It doesn't matter what her background is as long as there are legitimate issues. And that was certainly the case with backpage, and seems like it was true for Pornhub as well.
I would say Pornhub either is more serious or learned the lesson from Backpage, and is moving very fast to shut illegal things down. I don't think the outcome right now looks like they will be declared illegal.
It is a hit piece in the sense that it focused heavily on PornHub while, by the standards the article itself stated, other players are arguably worse.
Was it well deserved? Certainly; they could have been doing a lot more, a lot earlier. However, there are far worse players, and the focus has been much more on Pornhub than on the industry in general.
Self-regulation by Pornhub is a good step forward, but a model in which change comes from only one player, and only after pressure from the payment providers, is not a reliable one. There has to be regulatory change to really protect the victims.
Jon Ronson's podcast "The Butterfly Effect" also does a great job of breaking down how damaging MindGeek's practices have been to workers and legitimate players in the industry:
You seem to admit that the accusations are valid, but are also claiming that this type of reporting is a hit piece. How do you square those two?
Also Reddit has been on the receiving end of this reporting before. That has often resulted in incremental improvements as the worst offending subreddits are banned (there used to be an /r/jailbait for example), but their model often doesn't kill that content or the demand for it and can result in that same content being dispersed among smaller and more niche communities that are able to glide under the radar. Reddit has still thrived throughout all that controversy so I don't think it is as big an existential threat as you claim.
The entire comment borders on nonsensical. Something about how board members should watch videos of alleged sexual abuse, some of it involving minors, before their companies make decisions that could impact PornHub? What??
I think that is an uncharitable interpretation of that comment. I read it as OP saying that those people likely watched unverified content that is now down, not necessarily illegal content. It is basically an accusation of puritanical hypocrisy. However the problem with that accusation is that it is extremely difficult to distinguish the unverified and illegal content, especially at scale, and it is why Pornhub went with this extreme measure to remove all unverified content.
I might be missing something here, but if that kind of stuff really is on Reddit then I would want them to be made to take responsibility for having it on their site and suffer the consequences if it is not removed and reported to the authorities.
Genuine question: how do you scale content moderation to millions of posts without hiring millions of workers at astronomical cost? Those workers can make mistakes, as we've seen with DMCA takedown notices, unjustified YouTube censorship, App Store censorship, even GitHub... and some of these mistakes ruin people's livelihoods.
Is there a “right way” to do this that protects everyone?
If a business can't run profitably and legally at scale, then that business model is a bad fit for that scale and shouldn't exist at that scale (presuming the laws in question are reasonable, which I think 'banning child porn' qualifies as.)
It's super weird that people seem to constantly start from the premise of, "But their business couldn't possibly [comply with law] profitably at scale! Oh well. I guess until someone comes up with a feasible solution, we'll just have to exempt them from being responsible!"
Imagine if pharma companies went "gosh, but hiring quality inspectors to make sure all of our drugs are untainted just doesn't scale!" I have trouble imagining you saying, "Well, they obviously can't sacrifice scale, so let's just wait and see if a better quality inspection solution comes up."
> Imagine if pharma companies went "gosh, but hiring quality inspectors to make sure all of our drugs are untainted just doesn't scale!" I have trouble imagining you saying, "Well, they obviously can't sacrifice scale, so let's just wait and see if a better quality inspection solution comes up."
Every drug has acceptable levels of impurities. There is a non-zero acceptable level of rat feces in your food. Your bank rounds the value of your account to some decimal place. The various parts of your car have some mean time between failures.
In the real world we place bounds on how much bad stuff is acceptable, and these bounds are determined mostly by practicality and economics. "Ban all childporn" is certainly a reasonable philosophy, but it is no better a law than "ban all impurities" which would shut down every drug manufacturer in the world. A real law takes the form "perform these steps or reasonable equivalents to limit X to a tolerable level." You can claim that reddit is not taking a step it is required to by law, but it's not breaking the law just because it hasn't developed a perfect solution to a problem that likely can only be solved approximately.
The only way this comparison to drug companies makes sense is if the drug companies had community sourced supply lines and were selling drugs a third party contributor had made in their bathtub alongside their normal drugs.
> "Ban all childporn" is certainly a reasonable philosophy, but it is no better a law than "ban all impurities" which would shut down every drug manufacturer in the world.
This would be a reasonable argument for a company like YouTube which has spent tens (hundreds?) of millions of dollars putting controls in place. If PornHub had iron tight controls and one of the actors was caught using a fake ID, then it might be reasonable. But they didn't have iron tight controls... or apparently any controls. There was zero verification that participants had signed releases or were of age.
PornHub's controls were complete shit.
Maybe some people want drugs from Pfizer's "Community Sourced" drug program... I sure don't.
This is an extremely weak argument. At a certain scale, policy becomes about statistics rather than absolutes. This applies to absolutely everything, including the laws themselves.
Companies in all industries have to occasionally issue recalls on dangerous or faulty products, just like PornHub or Reddit have to occasionally delete stolen or illegal content.
Terrible example. What if that were true? Do you think we would just shut down the companies that make essential drugs rather than change the regulation?
People are going to watch pirated porn. If they can't find it on American sites they'll go to sites that aren't affected by American laws/regulations/journalism. That would probably make trafficking worse, as the new site has no reason to take down any videos; its value-add is a refusal to censor.
What you're effectively saying - just like all arguments of this form, which seem to be quite popular lately - is that ordinary people should not be able to share content they create online, because it's too hard to 100% police everything that they do. (Presumably including sending things directly and privately to people that they know, given how much better a channel that is for child porn than somewhere like Pornhub.) That the internet should be a consume-only place for the output of big corporate content factories whose content can be verified. Except framed in such a way as to make this seem like an anti-corporate stance, rather than the massive pro-corporate grab it is.
You do realize the success of these platforms is the result of finding legal loopholes?
That's the innovation. What if we were a media publisher, but without worker rights and unions? YouTube.
What if we start a hotel franchise but without any regulation or oversight or liability? Airbnb.
What if we start a cab company but make all our employees independent contractors, ignore all regulations and put all liability and legal risk on the independent contractors?
Silicon Valley is nothing but a bunch of young spoiled boys asking themselves the most American business question of all: have you tried slavery yet?
Business model innovations where you make money while someone pays all the costs.
>Is there a “right way” to do this that protects everyone
There is no investment if you do that, because the competitive advantage comes from the stealing. Nobody is investing in technology. Nobody is doing actual R&D. It's all about the viability of the scam. How profitable is this loophole?
We go out of our way to engineer protocols and technology and platforms that make it impossible to do the right thing (by design). From p2p networks in the '90s to cryptocurrencies. From YouTube to Spotify.
It's a dog eat dog world and I'm not here judging any individual. But let's not be naive.
Theft is the plan. Abuse is the strategy. Fortunes are not built on sweat. The people who work hard are the ones you hire to clean your house, the ones who pick your vegetables.
But we at least show the respect to others to not lie about the true nature of the successes of our industry.
>> We go out of our way to engineer protocols and technology and platforms that make it impossible to do the right thing (by design). From p2p networks in the '90s to cryptocurrencies. From YouTube to Spotify.
I don't understand this bit. So are you saying crypto currencies are making it impossible to do the right thing?
>So are you saying crypto currencies are making it impossible to do the right thing?
Cryptocurrencies are designed to subvert regulation and oversight. That's the selling point. They subvert the societal infrastructure that protects innocent (sometimes gullible) people from everything from the weapons trade to pyramid schemes.
If you want to 'make it do the right thing', that already exists: it's your wallet. It's called money. A ledger with a tax identity. A bank account that requires a legal ID. There is no point in burning all this electricity to verify those hashes, except to circumvent regulation and law.
I can't start a pyramid scheme and steal money from gullible idiots. Meanwhile, an ICO is completely legal (for now). If you want to sell bread to hungry people, use money. If you want to sell guns to a hitman, use a cryptocurrency. That's why cryptocurrencies are economically viable compared to money: because they make it harder to apply the laws we voted on democratically, like how you can't sell children for sex.
The reason any money out there is investing in cryptocurrency technology is the expectation of moral subversion of society. It's not a side effect. It's the business model.
Why do you think venture capital considers some startups worth so much money? Because they see the potential of price dumping and becoming a monopolist. If markets were actually properly regulated to always keep competition alive, people wouldn't be investing millions into apps and websites; they would be investing in actual R&D and hard science.
But doing the real work can never compete with just plain cheating. You always make more money being the bad guy. Even if a lot of startups have sincere intentions (Facebook's mission was to connect people, not destroy democracy), the money will eventually drag the decisions down that path.
I think this is a point that tech folks like to sweep under the rug all too often.
All of these sound wrong:
- Walmarts are huge! It's impossible to build fire suppression at that scale.
- Ford builds an insane number of cars every year, it's not possible to ensure each car is safe at that scale.
- The United States burns 20 million barrels of oil PER DAY. It's simply not possible to ensure a "safe" product at that scale.
- This buffet serves 20,000 plates per day, it's impossible to ensure food safety at that scale.
Each of those problems has been ~solved, or those businesses wouldn't be operating. You only need to moderate a million messages if your customers are posting that many. If you can't afford to hire moderators to handle those million messages, maybe it's your business model that's the problem, not moderation.
Fire suppression systems fail, cars get recalled or malfunction, people get food poisoning in restaurants or from food bought at a supermarket. Any product, not just tech, runs into problems when manufactured at scale. Tech is more visible because it is newer and in the spotlight. Nobody says "stop manufacturing cars" when one manufacturer issues a safety recall because cars have been around so long. The entire premise of producing 100X fewer cars so we can hand-craft cars and hand-check them is ridiculous - we simply accept the occasional manufacturing defect or product recall as the rates at which they happen are so small. These are all rounding errors, and the benefits of scale far outweigh the cons.
The same applies to tech. Content moderation has been ~solved as well. The overwhelming majority of content on social media platforms, websites, etc is regular, legal content. The illegal content that makes it through the filters and stays up for a significant period of time is a rounding error.
> I strongly disagree. Pornhub just purged 10 million videos because they couldn't moderate out child porn and rape videos.
PornHub is not a good example here. Reactive content moderation doesn't work for porn, because it's too hard to distinguish a positive from a negative. Content moderation for porn has probably been solved... by using a pre-clearance model coupled with strong identity verification.
> There's terrible content all over Facebook
That's probably true, but the worst appears to be much harder to find. It's a lot easier to detect and ban all porn (especially if you can ban all nudity) than just certain heinous kinds of porn specifically.
Facebook seems to have more problems with trickier things, like hate speech, holocaust denial, and mis/disinformation. We have pretty good technology now for recognizing things in images, but we don't have technology nearly that good for understanding the meaning of speech and text.
This is the error. The problem isn't scale. If YouTube has 500 hours of video uploaded per minute and some other site has only one minute of video uploaded per minute, the smaller site is no better able to handle it, because it doesn't have proportionally more staff than YouTube does.
The actual problem is marginal cost.
If you have a Walmart store, it probably costs over a million dollars to construct it, and takes in several million dollars a year in revenue. A $10,000 fire suppression system may be expensive, but it's manageable. And if there are a thousand Walmart stores all over the country that each need a $10,000 fire suppression system, that scales just fine. You have a choice between an unsafe Walmart and a safe Walmart whose proprietors make slightly less profit.
The marginal profit from hosting a given video is pennies. That means any solution that costs more than pennies isn't a choice between unsafe and safe video hosting, it's a choice between having video hosting and not having it. Scale has nothing to do with it; greater or lesser scale doesn't save you from a cost that exceeds your entire margin.
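The arithmetic behind this argument can be sketched with some back-of-envelope numbers. Every figure below is an illustrative assumption, not a real financial from Walmart or any video host:

```python
# Illustrative back-of-envelope numbers only -- none of these figures
# come from Walmart's or any video host's actual financials.

store_revenue = 5_000_000   # assumed annual revenue of one store ($)
fire_system = 10_000        # assumed one-time safety-system cost ($)
safety_share = fire_system / store_revenue
print(f"safety cost vs. store revenue: {safety_share:.2%}")   # 0.20%

video_margin = 0.02         # assumed marginal profit per hosted video ($)
human_review = 1.00         # assumed cost of one manual review ($)
print(f"review cost vs. per-video margin: {human_review / video_margin:.0f}x")  # 50x
```

At a fraction of a percent of revenue, the safety system is just an expense; at tens of times the per-unit margin, manual review isn't an expense, it's a different business model.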
> Is there a “right way” to do this that protects everyone?
You do it the way the porn industry has for years. You get consenting actors. You get releases from the participants. You keep track of who is in the videos.
If you are hosting third party content, you verify they do the above.
This isn't like hosting a Disney movie on YouTube. If you screw this up, there are real victims here. There should be some weight and repercussions to posting child porn, revenge porn, or any kind of porn where some of the actors have not signed off on it.
Not saying it is a "right" way, but this video from the ActivityPub conference contrasts content moderation on the big centralized platforms with how moderation is conducted on the decentralized fediverse (e.g. Mastodon, Pleroma, PeerTube, etc).
Spoiler: the latter makes it far more manageable, while for the former it is nearly undoable.
I don't think Reddit is hosting underage porn. I'm sure there is some subreddit somewhere that has questionable content (just as there is copyrighted material somewhere deep on YouTube), but I don't think it is that widespread.
Minors are 100% uploading to popular porn subreddits. There might not be a specific r/jailbait any more (though there probably is), but it's definitely still happening. They just put "18" in the title now.
Every platform that grows large or successful will eventually "hate its users", as it needs to be profitable. Currently that means, at least in the solved version, invading privacy, showing ads, and selling user information.
I believe the point they are trying to make is Reddit is tying access to pornography to installing the app, thus making an explicit connection between porn and their app, while Apple bans apps featuring adult content from the app store completely.
That ban has been leaky for a while -- HBO & Cinemax have carried pornographic content for decades, yet both have apps in the App Store. On top of that, you could buy HBO & Cinemax directly from Apple before the launch of HBO MAX earlier this year.
Cinemax at least shows content that is softcore erotica by any definition. Though from what my [dead] sibling comment says, maybe they no longer have this sort of programming.
From looking over their Wikipedia entry, it looks like Cinemax has backed away from adult content, but the last time I was a subscriber (around 2016/2017), their app had a less-than-obvious "late night" section buried in its navigation.
Yes, unfortunately Reddit pays engineers to obfuscate use cases so they can push notifications and steal your attention. The mobile app didn't even have (last I checked) an intuitive way to get to r/friends short of actually clicking on a link to that feed that a user has posted. The lights are on but it seems no one is home.
> I'm surprised at how much impact the NYT's expose (I would personally consider it more of a hit piece) has had. I naively assumed that PornHub, as a top-10 site on the web would be immune to such a piece, but I guess I was wrong.
The NYT is among a small subset of papers that is read by elites and captains of industry. Most of the people at the top of large credit card companies heard this from the NYT, not from customer complaints or lower-level employees, and they immediately thought about the PR implications.
> How many of the members of the NYT and the Visa and Mastercard boards who cheered on this outcome watched the videos that are now down? Probably a majority of them, but that wasn't enough to stop the moral panic, I guess.
Who knows? But that's not why they are cutting off payments. They are cutting off payments because associating with a pornography site that was called out in an NYT op-ed is a bad look.
Reddit's strategy is to shift blame to the moderators of the subreddit in question. This is really their biggest innovation, delegating the burden of moderation to the users. Then the Reddit administrators effectively need to manage a much narrower set of users. They don't need to manage the hundreds of thousands of users in /r/news, just the few active moderators.
If the NYT or another big outlet publishes a hit piece, Reddit will just ban or quarantine that particular subreddit. We already saw this play out with The_Donald and a few other subreddits (one about hating fat people, and one with a derogatory slur based on raccoons, come to mind).
In the case of criminal liability, that won't work.
That's almost the first lesson you learn as a first-year in law school: you can't contract away criminal liability. If child porn were found on a subreddit, for instance, the executives are liable. So you have to work overtime to make sure that stuff isn't on your servers, or, if it is, that you are reporting it in a timely fashion. A video with 10 views that was on your server for one day before you reported it to law enforcement will be looked at very differently than dozens of videos, with thousands of views, that were on your site for an average of three months before being reported.
No, this is why Section 230 exists. Sites are not liable for user-generated content. Nobody would ever risk assuming liability for the millions or even billions of users on a web platform.
Illegal content is on Facebook, Twitter, Reddit. The question is if and when it is removed. There are people accusing all of those sites of irresponsibility when it comes to moderating. But roughly, there seems to be no strong evidence that Pornhub did not remove illegal content they became aware of.
More generally, the question is whether a porn site that allows anyone to upload stuff should be legally viable. You may feel that it should not. But make no mistake, Laila Mickelwait wants to shut down the porn industry, not improve Pornhub.
This is clearly false; they've claimed to be using a lot of technologies to preventively detect these issues. Visa has threatened to destroy not only Pornhub, but the whole parent company now. They'll just shut down Pornhub before destroying their company. Looking forward to the next Russian-hosted site. It will surely be better, right?
> How many of the members of the NYT and the Visa and Mastercard boards who cheered on this outcome watched the videos that are now down?
Probably fewer than have financial interests in Facebook. Probably the most damaging thing PornHub did in response was pointing to the nearly 6 orders of magnitude difference in instances of child sex abuse between themselves and Facebook. Had they not been willing to point out the hypocrisy and how much worse a far more mainstream (and more American) company like Facebook is, they wouldn't need to be destroyed, just mildly symbolically punished as a distraction from the real problem areas.
I think a lot of readers and visitors to Pornhub assumed that Pornhub was reputable (as reputable as a porn site could be), and were shocked by the expose. They could also have legitimate reason to worry that they might be culpable for inadvertently viewing illegal material, and would be angry at Pornhub for exposing them to that risk.
And let's be frank: most people are pretty uncomfortable talking about their porn habits. Talking knowledgeably about what is on the site reveals them as a visitor. This is something that is difficult for people to discuss, and they will only do so from the most defensible position possible.
I am angry at Pornhub too. Stolen content is one thing. But rape and pedophilia is another.
> They're just an opinion columnist's penstroke away from the mob coming to roost.
If Reddit is not doing anything about content with underaged actors, rape, and non-consenting participants, then they should face scrutiny.
If someone opened a video store in our city selling videos of people's underage kids, portraying rape, or using footage released without consent, you can bet the local PD would eventually be knocking on their door. There is no reason a corporation should be exempt here.
> If someone opened a video store in our city with videos of people's underaged kids
Comparing to Reddit though, I think it's fairer to say something like 'if someone opened a cafe where paedophiles liked to hang out and chat'.
Yes you'd absolutely expect police might visit in plain clothes to try to eavesdrop, or even raid the cafe, or whatever, but generally assuming there's no reason to suspect the cafe owner's involvement they'd do so minimising disruption to the business so far as reasonable/practical.
> Comparing to Reddit though, I think it's fairer to say something like 'if someone opened a cafe where paedophiles liked to hang out and chat'.
The difference here is Reddit content is persistent and public. If you are in a cafe, your conversations are transient and can only be overheard by your nearest neighbors. Your cafe would have to have a bulletin board or file cabinets where descriptions of content are posted along with where you can get it.
I'm pretty sure that wouldn't last too long either.
Hmm.. Or a 'graffiti wall' where customers are allowed to contribute to the cafe's eclectic art, but it's not exactly curated?
What I'm getting at is that Reddit wasn't started with the idea that it had to vet content. If you put the responsibility for content on the website owner/operator... I'm not saying it's wrong, but I think it's hard to draw a parallel on the high street, just because of the scale of the internet, I suppose.
A debate society perhaps, that keeps rigorous records - in parliament for example, if a Member said something egregious they'd be personally called out on it, apologies (or resignation) demanded, etc. - but it wouldn't be struck from record. People might also think the Speaker ought to have stepped in sooner, etc., but then I suppose that's the problem again - there is and always has been the expectation of moderation there, the record keeping isn't some passive web form -> database, it's human presence.
(I'm honestly not sure where I stand on it, I'm not arguing one way or the other; just thinking - and putting those thoughts into a passive web form -> database that has only some light retrospective moderation. ;))
> What I'm getting at is Reddit wasn't started with the idea that it had to vet content, if you put the responsibility for content on the website owner/operator.. I'm not saying it's wrong, but I think it's hard to draw a parallel on the high street, just because of the scale of the internet I suppose.
Scale seems to be the recurring theme of defense of illicit content.
I think that's backwards. The scale of these platforms means they have an obligation to be more vigilant.
In the spirit of the internet being past its peak and rapidly getting worse, we are now entering the second serious dark age in human history, in which the amount of porn available is shrinking. The first was Tumblr's porn ban; before it, one could find specialized Tumblrs with archives spanning thousands of pictures, as narrow as "blonde guys with visible legs on beaches" or "redhead girls dressed up as clowns sitting on balls". Reddit is a weak replacement. Who knows how long they will allow porn?
Yes, yes, I know: Won‘t anybody think of the children!
So the question is - why don't Visa and Mastercard take issue with those other platforms? Are they going after Pornhub simply because it sounds "bad"? Is it because they are caving to pressure from religious groups or feminist groups or some other organization? Or is it because other offenders may espouse political views that Visa/Mastercard are aligned to? Or is it that Pornhub is smaller and they can afford to lose them? Either way, I am not a fan of duopolists picking winners and losers, instead of the law picking them.
It would help to reproduce the claims in question so we don't have to read the comments there. It looks like you're only seeing the Internet Watch Foundation statistic, which has been covered elsewhere in this thread.
Pretty silly what they have to do in order to make the keyholders happy. Should MC and Visa really dictate how people run their businesses? I don't think so.
Sure, CP is very, very bad, but I have been using Pornhub for many years and I have never seen such content on the platform. This is one big reason why I think decentralized currencies like bitcoin could one day succeed.
IMO, this is the golden quote:
> Pornhub's announcement also cites a report by third-party Internet Watch Foundation, which found 118 instances of child sexual abuse material on Pornhub in the last three years, and notes that in the same period, Facebook's own transparency report found 84 million instances of child sexual abuse material on the social media platform.
Despite the nature of the content on Pornhub and the volume uploaded, there are very few instances of CP on the platform, which I think speaks to their good work in preventing it.
Now I don't know about piracy but that also seems like something that is.. uncommon?
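For a sense of scale, the gap between the two figures quoted above works out to nearly six orders of magnitude, though note this compares raw counts, not rates per upload or per user:

```python
import math

# Figures quoted above: the Internet Watch Foundation found 118 instances
# on Pornhub over three years, vs. 84 million instances in Facebook's own
# transparency reporting over the same period.
pornhub_instances = 118
facebook_instances = 84_000_000

ratio = facebook_instances / pornhub_instances
print(f"ratio: {ratio:,.0f}")                           # roughly 712,000x
print(f"orders of magnitude: {math.log10(ratio):.1f}")  # about 5.9
```

Neither platform publishes comparable denominators (total uploads, total users, detection effort), so the ratio says as much about how hard each company looks as about how much abuse actually exists.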
Presumably you may have seen CP and just not been aware of it. Once you reach a certain point, you can't reliably tell someone's age by looking, even when you can literally see every part of them.
As illustrated in porn by the famous case of Lupe Fuentes, and at the opposite end, Traci Lords.
Third-party foundations finding legitimate instances of sexual abuse of children is going to be incredibly rare because how would you ever tell 18 and 17-year-olds apart? Facebook has much more information available to them and they are aggressively checking their platform for it, that's why they find so many instances.
Also Facebook is much much bigger than PornHub. This cannot be stated enough.
For many people Facebook is the internet. Facebook handles commercial transactions, conversations between family members and is the sole source of contact between old friends.
Other than Facebook, I'd guess the biggest source/storage spot for CP would be Gmail.
Unlike PornHub, Facebook has to contend with actual children sending pictures back and forth to each other, just like our generation sexted on AIM and IRC. Last of all, PornHub content is mostly public, so people will flag content and cause it to be taken down fairly quickly, while Facebook can have content hidden away for ages inside some private chat or private group.
So the recap:
1. Facebook is bigger
2. Facebook has more source material (i.e. under 18 year olds use the platform)
3. Facebook has less crowd-sourced moderation (so illegal content builds up like sediment)
That helps for the cases where someone is abusing a child and the police were involved. Does it help when a 15/16/17 year old uploads their own pictures/videos without anyone knowing their actual age? Or when someone's boyfriend/girlfriend records them and uploads it without the victim knowing?
It's like how YouTube's filter will catch me uploading a video with a Taylor Swift song in it, but not someone uploading a video using one of my independently produced/distributed songs, because their filter doesn't have my music in it.
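A toy version of that point: a fingerprint filter can only flag content whose fingerprint is already in its database. (YouTube's real Content ID uses perceptual audio/video fingerprints that survive re-encoding; this hypothetical sketch uses an exact SHA-256 hash only to illustrate the database-membership logic.)

```python
import hashlib

def fingerprint(content: bytes) -> str:
    # Stand-in for a real perceptual fingerprint.
    return hashlib.sha256(content).hexdigest()

# Only content whose fingerprint rightsholders have registered gets caught.
registered = {fingerprint(b"major-label track")}

def filter_catches(upload: bytes) -> bool:
    return fingerprint(upload) in registered

print(filter_catches(b"major-label track"))   # True: known content is flagged
print(filter_catches(b"independent track"))   # False: unregistered content passes
```

The independent artist's music passes not because the filter is broken, but because nobody registered it, which is exactly the asymmetry the comment above describes.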
Because viewing/storing this material is illegal for everyone except the minor in question. If a minor posts a nude picture on Reddit and I view/save it, I'm guilty of possession of child pornography. Maybe I could get the charges dropped if I can argue/prove I didn't know they were underage, but lives are ruined over mere allegations of child sexual misconduct and that reputation would follow me everywhere.
>There's really no way to prevent this but to abandon privacy/force verification.
Which is why official producers of pornography are required by law to verify age and consent of all of their actors (18 U.S.C. § 2257). The problem is when they allow producers to distribute pornographic materials on their platform without verification that the producer is following the law, like PornHub and Reddit and Imgur and Tumblr etc. Forcing verification is already in the law for producing pornographic materials.
>It's not a problem exclusive to pornhub
Correct, but the fact that other companies are violating the law does not absolve PornHub of their responsibilities.
I understand the legality, but imo the legality itself should be challenged.
If you legit can't tell something is illegal why can you be prosecuted for it? There was no intent to harm.
Also why is it illegal? I understand child exploitation or if someone is taking advantage/not considered old enough to make that decision, but I've never really understood why it's illegal for 17 year olds to distribute nudes.
And even then, would you rather have a few false positives or more freedom and privacy?
Is a 17-year-old a "legitimate instance of sexual abuse of children" while an 18-year-old is "perfectly okay"?
I don't think such a strong border makes sense. People grow up gradually, they don't magically transform from irresponsible children to responsible adults the very day they reach their 18th birthday.
I would be interested in seeing those numbers multiplied by view count. Facebook will probably still come out on top, but I expect the difference to narrow. I'm not sure if that makes it much more harmful, but it is important for the comparison.
Payment processors that large should be considered utilities at this point.
They are not official police, but they act as a sort of moral police toward businesses they don't like. Their antics and policies are mostly unclear and seem random.
Today they block a porn site; tomorrow they block you - and what can you do?
Just as you cannot do business without a bank account, you cannot host a site without a payment processor. If the business is legal, then the processor should be forced to support it. If the business is illegal, then it is the prosecutors' and police's job to close it.
As usual this censorship is explained "as made for the good of children".
The action of MasterCard here could be a blessing in disguise. To the extent that the major credit card platforms drop porn, alternative payment methods will become mainstream, and the credit cards will gradually cease to be gatekeepers.
The demand for porn won't be affected by such bans, fappers worldwide will interpret it as damage and route around it.
The fact that there are only two significant payment processors, and they act in lockstep to deny service to minority groups is a way bigger monopoly threat than Facebook owning both Instagram and WhatsApp. How Visa and MasterCard have escaped anti-trust scrutiny is beyond me.
I wonder how much original content has been lost; not that it's comparable to book burning in the humanities. Are there enough porn sites in the ecosystem to redundantly "archive" content? There are certainly more enthusiastic users out there than for any other subject. Pornhub's parent runs a lot of websites, and the overlap of local laws and technical requirements means not many countries can support large online porn platforms. That said, I'm sure there are private "enthusiasts" who data-hoard; 50 years from now we'll find abandoned racks of HDDs and tape drives full of "vintage" porn.
"This means every piece of Pornhub content is from verified uploaders, a requirement that platforms like Facebook, Instagram, TikTok, YouTube, Snapchat and Twitter have yet to institute."
The entire world's solution for fixing a fucked-up society seems to be moderation and not education.
Education is hard, has a really long lead time, no one can agree on what the curriculum should be, and you can't force people to learn anyway. You're right that we should be trying, but moderation will remain necessary forever.
Now that Visa/MasterCard have dropped them, how are they going to get paid? Are they going to have to implement ACH and equivalents around the world? Just switch to different payment processors? Move billing to an international Visa subsidiary? Appeal to Visa/MasterCard now that they have policies in place? Or does advertising cover their costs, and charging users was just an add-on?
What I find crazy is that you can't just start your own payment processor. That space is gatekept to the extreme, and it is what makes me uneasy about the whole electronic-money thing. If we want to part with paper money, we should be allowed to issue our own cards tied to our own money reserves.
No one's stopping you from issuing your own card. The barrier is in getting your card accepted by zillions of retailers and online merchants. Having built that network is what entrenches Visa and MC.
PayPal has attacked this in the online space by offering merchants a credit card checkout flow that also allows you to pay straight from your linked-directly-to-bank-account PayPal account. Square is attacking it in a similar way in the physical domain. Companies like WeChat and Grab are also working on alternative ways to pay for common things.
It's definitely a duopoly, but the solution here probably looks like some kind of government-overseen/mandated standard which allows a reasonable hurdle for newcomers to participate on both the consumer and merchant side.
> The barrier is in getting your card accepted by zillions of retailers and online merchants. Having built that network is what entrenches Visa and MC.
Open Banking is interesting in that regard, in that it allows for account to account transactions.
The amount of government regulation has led to this payment-processing duopoly. Not saying it's a bad thing, just that it's the major reason we got here.
Most of what I'm aware of in terms of banking/CC regulations is oriented around consumer protection (without which you don't have acceptance of the system). What are the regulations which have reinforced the duopoly? (honest question)
The EU should require these operators to spin off the infrastructure part of the business, and that new company should be required to accept any payment provider that meets legal requirements, on the same terms as any other provider (including Visa or MC). It is the only way to bring a free market to that space (someone may take that as irony, but a true free market doesn't exist, for exactly reasons like this).
I'm not against extreme regulation of payment processors. After all, they are dealing with money, and there are huge risks. But on the other hand it should be an open market, and anyone who meets the regulatory burden should have equal rights to paid access to existing players like banks. And I don't even think it should be cheap access, but access that can't be revoked.
My first thought was to disagree with you and say they're a private company and they can do what they want with their platform, but you're right. There is hardly any competition in that space; MasterCard and Visa are effectively a duopoly.
I wonder if it has something to do with Wirecard being gone. They supposedly processed a lot of money for porn sites, because no other banks would work with them. Now that Wirecard is gone, I guess these companies have to play by new rules.
The web is absolutely full of “don't ask; don't tell” child pornography.
Spaces such as Reddit's r/gonewild are no doubt also full of technically child pornography, but who is to tell? 4chan has a strong stance against clear child pornography, and it has led to many an arrest of those who attempt to post it, but I very often see images posted that plausibly could be; there is no way to tell, and age estimation is not that reliable.
I don't feel strongly either way. I think problems such as revenge porn will never actually be solved unless you improve society from the ground up rather than adding some rule to a website but I'm not against the rule either.
I hope it's okay to ask this question here: does this move mean all the now-disabled videos can no longer be accessed?! All purged to oblivion forever? This move will destroy the platform, no doubt!
I'm glad they're doing it, but it kind of feels like we've been in this cycle for a while now.
I mean, the price of freedom is constant diligence, so I guess that's to be expected. But I think the ultimate issue with all of these content hosting sites is that almost no real effort has been put into making moderation a community wide responsibility (and also difficult to abuse).
I really feel like improvements to moderation systems are the thing that keeps failing to happen and eventually kills every form of social media.
I read most of the Nicholas Kristof editorial in the NY Times (don't confuse editorial with actual reporting, in the NYT or anywhere - editorial has 'license' to do hit pieces). While I agree that online porn is a horror show, Kristof oddly omits what looks like the biggest problem to me:
(This might be upsetting. Read at your own risk.)
What I've seen of online porn sites is a strong emphasis on the real physical and sexual abuse of women. It's advertised brazenly in video titles: I am confident that if you search for something as horrible as 'woman gets the crap beaten out of her', you'd find lots of videos, and I'm not using more stomach-turning and sexual examples that I'm confident would find equal success. It's shown in the videos (at least the still frames that are part of the video title): Women expressing pain and showing bruises and even bleeding.
I understand sexual fantasy, and that to widely varying degrees and frequencies people enjoy these fantasies - PornHub has many customers - and that a few people even consensually engage in them in a safe, controlled manner. That is all people's private business, not mine or the public's, and I have no objection to it. But these are not fantasies - these are real people, real human beings, getting hurt and abused, physically and emotionally, for others' entertainment. It's on an enormous scale.
We can argue to ourselves that they are actors and it's all made up, like a Hollywood movie. We can say the women choose to do these things. And I will tell you what I told myself: We are full of shit and we know it.
Almost everything we see and read tells us otherwise, and yes, many of these people are assaulted on camera by any definition of the word; many are beaten and tortured. Also, if you think Amazon workers are vulnerable, imagine the situation of these women. Are they going to take the video producers to court? Call the National Labor Relations Board and file a complaint? Has it ever happened? If it has, it's a drop in the ocean of porn online. From what I understand (and see), these are among the most vulnerable people in communities with limited resources. The world and the system tell them (wrongly): You work in porn, you are the lowest of the low, we don't care. Imagine Amazon treating employees this way - and then selling the videos for profit.
I don't want to speak for the people in the videos or define them; I encourage you to find them online, speaking for themselves. They are all different people with different responses to different experiences, but a lot of it is much worse than this HN comment. Also, imagine seeing someone you know or love being hurt like that; imagine how you would feel if it was you, beyond the pain. These are people just like us and our loved ones; they feel the same things.
It's straightforward: It should be illegal to hurt people, and to distribute videos of it, and to do it for profit. It's not hard to understand. It is happening on a wide scale and publicly. It's a horror show, and it's real.
I’d like to see some kind of platform penalties that are redistributed toward victims. Pornhub has made millions off things like pirated sex work and revenge uploads, while wiping out legitimate players in the industry and driving down wages and working conditions [0]
Now, having wrought all that destruction, they’re returning to what? Exactly what a regulated, verified industry looks like. It’s unconscionable that we’ve built an economy that equates to “do a lot of crime until you’re a monopoly and then behave as you should have in the first place.”
That's pretty much the MO of a lot of SV companies. I remember when Uber came to my city and there were news articles about police officers ordering a ride and ticketing the driver afterwards because unlicensed taxis were against the law. The city claims Uber never petitioned them to change the law, Uber just knowingly broke the law and paid the fines on behalf of their drivers until the city changed the law. Airbnb has done similar things in the past in cities where subleasing and short-term rentals are regulated, but they skipped the regulation and just broke the law until the law changed.
Youtube does the same, Facebook does the same. Laws and regulations are seen as things to disrupt rather than rules to follow or petition for changing. And if you have a billion dollars of VC money in your pocket, who cares about a $100 ticket?
If the government changed the law rather than stepping up enforcement, that means that public support was not behind the law. This is essentially civil disobedience, and I say that is a completely legitimate strategy.
Neither has for myspace. Until facebook. And now facebook is losing people (luckily for them, a lot have gone to their own instagram). Snapchat was all the rage for a year or two... and now... pretty much silent.
Youtube just has a high barrier to entry.... a few more "optimizations", screw up a few more content creators, and a new billion dollar startup will come and take over where youtube stopped.
To my memory, myspace got left for Facebook because FB provided a utility more along the lines of what people were searching for. I don't remember the FB exodus being driven by a MySpace content clampdown.
Joseph Kennedy did not make his fortune from bootlegging. That's an urban legend, presumably begun by contemporary detractors. He did make some moves in the liquor business when it became evident prohibition was about to end, but that wasn't where most of his money came from either. He was an investor in stocks, commodities, real estate, and movie studios.
One of the many benefits I have received from porn and sex positivity in general. Imagine thinking you're supposed to "go after" a woman because of some requirement to start a family? Absolutely dystopian.
The OP didn't phrase this elegantly for sure, but the general idea that a life spent alone, consuming pornography all day is unfulfilling should not be controversial, and porn like any addiction / vice can become a trap.
> the general idea that a life spent alone, consuming pornography all day is unfulfilling should not be controversial
1). I don't see why that's not controversial. It doesn't seem self-evident. If a person can fulfill the need by watching porn, then by definition, it is fulfilled. If a person finds it unfulfilling, well, such a person will be motivated to branch out and start doing something else.
2). OP claimed that porn hinders you from pursuing females, regardless of whether you watch it all day or a few times a week. I'd argue that the latter doesn't.
3). Not forming a family doesn't mean a life spent alone. The only people I know who are determined to never get married have very active sex lives and party all the time. The idea that one can't be happy without a family is fairly outdated.
4). Addiction is a real thing, but one has to recognize that there is a wide gap between pathologically addicted to porn, and just enjoying watching porn.
> If a person can fulfill the need by watching porn, then by definition, it is fulfilled. If a person finds it unfulfilling, well, such a person will be motivated to branch out and start doing something else.
Depends on what we are "fulfilling". Just think about an extremely overweight person who cannot stop overeating. Yes, they are fulfilled in terms of food, but (sometimes) they are self-loathing and consider the fact that they cannot stop eating a personal failure, so in that sense they are the farthest from "fulfilled".
Same is true for porn, you might be fulfilled in a sense that you get to orgasm in 10 minutes, but there are other facets of life where regular porn consumption or masturbation might have a negative effect.
Sure, nobody would expect porn to fill desires other than sexual. But, one can do other things to fulfill other desires.
> there are other facets of life where regular porn consumption or masturbation might have a negative effect
If there is a pathological addiction, yes, it will have a negative effect on one's life, just like most other addictions. (Addiction to sex, for instance)
But, otherwise, I can't really think of masturbation interfering with life that much. People frequently waste away 10 minutes in all kinds of ways.
Perhaps we could let a man decide for himself what he might wish to do with his life.
For me, it's neither pornography nor “going after women” (a practice that thankfully isn't common in my culture); I find having debates on the internet and consuming quality fiction to be more rewarding than either.
You wouldn't even be here if your dad didn't go after your mom. What's dystopian about the drive to perpetuate the species? That's the most basic function of any lifeform.
I am extremely driven, but due to various methods of birth control I am completely unsuccessful.
Many species, including higher mammals, frequently engage in infanticide as part of their quest to perpetuate their genes, so maybe let's all try to do a bit better than the default of the natural world.
> Many species, including higher mammals, frequently engage in infanticide as part of their quest to perpetuate their genes, so maybe let's all try to do a bit better than the default of the natural world
Infanticide is conducted routinely in human societies in the form of abortion. Even so, abortion IS the right choice.
The result is not the same either. The described infanticide serves the goal of undoing another's genetic reproduction. Abortion is a smarter form of not spending resources on a weak/dead-end offspring (or just throwing it out of the nest), if you want these analogies.
I’m not talking morals here and not arguing for or against abortion. Our morals have nothing to do with the difference between competitive infanticide in mammals and abortion.
Is it wise to stretch other species’ defaults to this issue? It is our behavior that is discussed, not that of some “higher mammals”. If that was natural to us, it would be a known widespread phenomenon in “feral” areas. Nothing adds more tension to the society than depriving most fundamental drives. Thank god these do not include opposing each other by forming large groups or not giving respect or food to someone because they don’t look better than us.
We certainly are not. The thing about Malthus is that he was wrong.
Birth rates are below or barely above replacement level in most of the world anyway, the only place with high birth rates is Africa and they’re not listening to you.
Porn is like the junk-food of sex. McDonald's french fries are engineered to disintegrate on the tongue and deliver that hit of salt and fat. It hijacks our evolved taste buds in a way that's pleasurable - but not healthy.
Porn is like that but with sex. A little bit won't hurt you, but too much can ruin your sexual health and distort your ideas of what sex should be.
And who is the arbiter of "what sex should be"? Perhaps porn is providing the much needed service of expanding our definitions of sex, which had been brutally restrained by prudish mores for most of Western civilization.
That sex should look more like porn is quite a dystopian future for women. I think expanding our definitions of sex is a good thing, but that doesn't mean there should be no restraints either.
> And obviously no one is calling for "no restraints".
I've been with a couple of women I couldn't find limits of, and it's not for the lack of attempts at otherwise considered disgusting practices... and I know for a fact that I'm sexually pretty mild compared to what's out there.
What I'm saying is reevaluating bedroom norms and mores is fine, but let's also not make the mistake of thinking our prudish ancestors had it all wrong.
I think porn has both good and bad influences - but it is definitely not benign.
I couldn't disagree more. Our prudish ancestors had it all wrong. Ask the uncountable millions and millions of gay men and women who suffered their whole lives at the hands of those prudes. And what did the prudes gain in return? Bad sex and shame. There is nothing redeeming in an anti-sex perspective.
Whenever you find yourself saying someone is totally wrong, it's a good indication you don't understand them.
I'm not going to argue with someone who can't see the other side. Presumably you can find some value in some of the values and traditions from the past? You have to acknowledge that our current values are merely an incremental change on what came before. You can't say they were all wrong without saying you're also wrong then.
None of that is a problem. Porn can be quite degrading to women, and young women should not feel pressured to accept that just because it's now more normalized by porn.
> Should people feel pressured to accept normalized behaviour which used to be seen, or are still seen, negatively - such as homosexuality or trans?
I find that a bit of an odd question. I think people do currently feel societal pressure to be accepting of LGBTQ people. I don't think that's necessarily good or bad by itself - it just reflects the changes in society.
Should parents feel pressure to be accepting of their children's lifestyle if they come out as LGBTQ? Yes.
Should parents feel pressure to be accepting of their children's desire to transition if they identify as a different gender? No, because that's permanent, there's a good chance they'll later come to view that as a mistake. If they still want to do that as an adult - then that's when they can do it.
To try and get back to what I think you're digging at - should girls be pressured into sex acts in porn which used to be seen, or are still seen, negatively because they're more normalized now thanks to porn? - No. I don't think people should be pressured into sex acts.
Almost all male friends of mine who had admitted watching porn, are either in a relationship or married. I don’t know if they “went after women”, though; it doesn’t sound like the right way to pursue love.
Some admitted to watching porn even after marriage.
I don't know what porn you are watching, but there is not much in it about relationships (in fact, a lot of porn is specifically about fantasies of being unfaithful).
And that's the main problem with OP's post. If you have unsatisfied sexual needs, it's awful to suggest that you should attempt relationships (or even start a family) for that reason. No, people should do these things because they genuinely want them, not pretend to want them just for a bit of nudge nudge.
I am grateful for porn existing, because it lets me not have a (sexual) relationship if I don't need the emotional component. I think it's more honest that way, actually.
TEDx is not exactly the same as TED. Just about anyone can organize a TEDx, and lots of cranks give "TEDx talks". There are dozens and dozens of "TEDxes".
Why are you so sure that the psychology of sex and violence are identical, especially in such a specific context? They're totally different things. It seems ridiculous to point to the studies on one and say, "surely the studies on the other shall have the same result". Personally, I have no idea, but your premise doesn't make sense.
It's much more likely that senseless violence is far worse for the psyche of an individual than senseless sex is. I'd rather 10-year-old Timmy see sexual content instead of Terminator.
I'm most likely in the minority - but I'm also convinced that I'm living in a clown world where our society tolerates and encourages senseless pain and violence from the very beginning (through routine male circumcision of hundreds of millions as just a single example), so being in the minority makes me believe that my chances of acting ethically are higher, not lower.
Basically, porn is far less bad than violence from a very early age.
It's the psychology of human moral panics that's identical. If there's a new pastime, whether it's novels or Pokemon Go, people will get up in arms saying it's ruining the youth.
I'm saying people do not like it when I interact with women in the real world so in their view it would be a good thing if porn were to prevent me from doing so
There's individual freedom and there's aggregate outcomes. No person should be prevented from almost anything, but what's causing many people to behave in a certain way, and what are the likely outcomes?
We can be concerned about MGTOW and incel culture etc. without specifically wanting to force people to conform.
Conspiracy theory: there’s a lot of porn and an unhealthy amount of competition to monetize it all. If PH were simply trying to drastically reduce competition against standard studios by removing the amateur stuff, I’m not sure I could blame them.
There really isn’t much competition in porn. PH is actually owned by a company called MindGeek that has a monopoly on US-produced porn; they bought all the studios and all the cam sites.
So you disagree with it morally but watch it anyway? And then don't even pay the people who produced it? Isn't that even worse?
I think there's nothing wrong with porn in general and the people who create it - just like other artists, performers, actors, etc - should be paid for it.
Just because it's "morally disagreeable" doesn't make it okay to pirate it. Then you've committed two "sins" instead of one.
I said as much in the first 7 words of my comment.
Porn is wrong. Stealing content is wrong. My feelings about porn aside, what I am doing is stealing by the law of the land. Do I feel bad about watching porn? Yes. Do I feel bad about stealing it? No.
I remembered an interview where the victim said, "my 16-year-old younger brother tried to rape me while I was sleeping; when confronted, he confessed he learned it from Pornhub." Pornhub also needs to be held responsible for such misleading titles.
But they still made money on videos with underage runaways for years, right? Is law enforcement going to do anything about this or do they only arrest people for not wearing masks, now?
Why? Haven't you seen all of the videos of people being arrested for it? Would you have liked it better if I said "speeding tickets". Fine, speeding tickets. Yet KNOWN peddlers of child porn rake in cash like most people can't fathom. Law enforcement is a cruel joke at this point. The system is just a perpetual motion system of wickedness at this point.
Porn is harmless? To make that claim shows either bad faith or profound ignorance. Porn is incredibly harmful and degrading, both to those who perform on it as well as those who watch it. Nevermind the child porn, sex trafficking, or the blatantly non-consensual stuff, porn itself is gravely immoral and incredibly destructive. It hurts a person’s ability to have a healthy relationship. It creates obsessions and addictions that are also difficult to break. It enslaves you to your passions and blinds you. It locks you within yourself. Porn is a totalitarian regime’s dream come true. Instead of prodding people with police batons, you drown them in isolated prurient occupations, Brave New World style. You want a revolution that increases your power (until it gets out of hand, anyway)? Begin with sexual depravity. Sell it as “liberation”. Reenact the Bacchae. A man has as many masters as he has vices, so multiply those vices in others to multiply your control over them. Give them rationalizations that keep them trapped in their vices and easy to manipulate. He who controls the passions controls the masses.
Thus a man is only free when he has mastery over his passions and lives according to the truth. For a man can either conform the truth to his passions (rationalization) or he can conform his passions to the truth. The first is a slave, the latter a free man.
FOH with your puritanical bullshit. There's no such thing as "sexual depravity". You don't get to define what is a vice and what isn't. Porn does not lead to a totalitarian regime. If anything, looking at history, the more authoritarian the state, the more it limits sexuality in the population.
You can have fun and also "live according to the truth", whatever that means.
I think the decline of moral argument and the rise of this general "anything goes" attitude has been a negative for society. We have to decide to draw a line somewhere. Even porn companies that feature of-age performers are highly exploitative and uncomfortably close to human trafficking - see the case of Girls Do Porn, and they're just the one that got caught doing it.
I don't think it's unreasonable to argue that some sexual behavior should be considered "bad" - voyeurism, selling yourself for money, extreme fringe interests/fetishes, etc. I don't buy the idea that Pornhub is just an innocent business trying to run their website. They have built an empire on human exploitation.
> Even porn companies that feature of-age performers are highly exploitative and uncomfortably close to human trafficking
Why?
> I don't think it's unreasonable to argue that some sexual behavior should be considered "bad" - voyeurism, selling yourself for money, extreme fringe interests/fetishes, etc. I don't buy the idea that Pornhub is just an innocent business trying to run their website. They have built an empire on human exploitation.
Ah, I see why: you simply consider it bad per se that sex be treated as a service that can be sold for whatever price the market might dictate.
That's akin to saying “Construction work is highly exploitative and uncomfortably close to human trafficking — see the case of the Qatar football stadium.”
Obviously providing a single example where it went wrong does not amount to showing a universal criterion, which is what your wording suggested.
He said that there are "porn companies that feature of-age performers" that "are highly exploitative and uncomfortably close to human trafficking", e.g. GDP. Nothing more, nothing less.
> Qatar football stadium
The issue with documented human rights abuses in the construction industry in Qatar is a problem with Qatar, and not with the construction industry. I don't think I have to explain why.
> He said that there are "porn companies that feature of-age performers" that "are highly exploitative and uncomfortably close to human trafficking", e.g. GDP. Nothing more, nothing less.
No, he did not say that; he said “Even porn companies that feature of-age performers are highly exploitative and uncomfortably close to human trafficking”.
> The issue with documented human rights abuses in the construction industry in Qatar is a problem with Qatar, and not with the construction industry. I don't think I have to explain why.
And why do you believe that the specific instance referenced is a problem “with the pornography industry”, rather than one with that specific company?
Because anyone interested enough in this topic to do any cursory research -- as opposed to just being an insufferable fedora on the internet -- will realize it's not limited to just one company.
It's very easy and very free to oppose human trafficking in porn by not visiting the websites. Not to mention the personal gains from squashing the habit of seeing human bodies as objects to use for one's personal pleasure. It's significantly harder to oppose human trafficking in construction, what am I gonna do, not enter buildings? Comparing numbers is irrelevant.