
The Fourth Amendment didn't help here, unfortunately. Or, perhaps fortunately.

Still, 25 years for possessing kiddie porn, damn.



The harshness of the sentence is not for the act of keeping the photos itself, but for the individual suffering and social damage caused by the actions that he incentivizes when he consumes such content.


Consumption per se does not incentivize it, though; procurement does. It's not unreasonable to causally connect one to the other, but I still think that it needs to be done explicitly. Strict liability for possession in particular is nonsense.

There's also an interesting question wrt simulated (drawn, rendered etc) CSAM, especially now that AI image generators can produce it in bulk. There's no individual suffering nor social damage involved in that at any point, yet it's equally illegal in most jurisdictions, and the penalties aren't any lighter. I've yet to see any sensible arguments in favor of this arrangement - it appears to be purely a "crime against nature" kind of moral panic over the extreme ickiness of the act as opposed to any actual harm caused by it.


> Consumption per se does not incentivize it,

It can. In several public cases it seems fairly clear that there is a "community" aspect to these productions and many of these sites highlight the number of downloads or views of an image. It creates an environment where creators are incentivized to go out of their way to produce "popular" material.

> Strict liability for possession in particular is nonsense.

I entirely disagree. Offenders tend to increase their level of offense. This is about preventing the problem from becoming worse and new victims being created. It's effectively the same reason we harshly prosecute people who torture animals.

> nor social damage involved in that at any point,

That's a bold claim. Is it based on any facts or study?

> over the extreme ickiness of the act as opposed to any actual harm caused by it.

It's about the potential class of victims and the outrageous life long damage that can be done to them. The appropriate response to recognizing these feelings isn't to hand them AI generated material to sate their desires. It's to get them into therapy immediately.


> > Strict liability for possession in particular is nonsense.

> I entirely disagree. Offenders tend to increase their level of offense.

For an example of the unintended consequences of strict liability for possession, look at Germany, where the legal advice for what to do if you come across CSAM is to delete it and say nothing: reporting it to the police would incriminate you for possession, and if you deleted it, a prosecutor could charge you with evidence tampering on top of that.

Also, as I understand it, in the US there have been cases of minors deliberately taking "nudes" or sexting with other minors, leading to charges of production and distribution of CSAM for pictures they took of themselves.

The production and distribution of CSAM should 100% be criminalized and going after possession seems reasonable to me. But clearly the laws are lacking if they also criminalize horny teenagers being stupid or people trying to do the right thing and report CSAM they come across.

> The appropriate response to recognizing these feelings is [..] to get them into therapy immediately.

Also 100% agree with this. In Germany there was a widespread media campaign "Kein Täter werden" (roughly "not becoming a predator") targeting adults who find themselves sexually attracted to children. They anonymized the actors for obvious reasons, but I like that they portrayed the pedophiles with a wide range of characters from different walks of life and different age groups. The message was to seek therapy. They provided a hotline as well as ways of getting additional information anonymously.

Loudly yelling "kill all pedophiles" doesn't help prevent child abuse (in fact, there is a tendency for abusers to join in because it provides cover and they often don't see themselves as the problem) but feeding into pedophilia certainly isn't helpful either. The correct answer is therapy but also moving away from a culture (and it is cultural not innate) that fetishizes youth, especially in women (sorry, "girls"). This also means fighting all child abuse, not just sexual.

> It's effectively the same reason we harshly prosecute people who torture animals.

Alas harsh sentencing isn't therapy. Arguably incarceration merely acts as a pause button at best. You don't get rid of nazis by making them hang out on Stormfront or 8kun.


> and it is cultural not innate

> he/they

I sometimes see people make this assertion, and interestingly enough it's usually trans people. What exactly makes you say this?


I'm not trans, so I can't speak for trans people (not that any individual person could speak for an entire demographic group). And to pre-empt the follow-up question "then what's with the pronouns": gender is multi-faceted and complex, pronouns are just one aspect of it. Think of it as gender non-conformance. Like wearing a dress as a bloke.

If I had to hazard a guess for what might be causing the correlation you're seeing, I'd assume that being trans usually comes downstream from reflecting on social phenomena and cultural expectations. Being trans is by definition not the cultural norm (even the word itself implies some form of "misalignment" of identities and cultural expectations) so if you are familiar enough with it to claim it as your own identity, you probably did a lot of research into it, especially if you're from a generation where it was more of a taboo subject and not acknowledged in the broader frame of cultural references (e.g. good luck if your only exposure to the concept is from films like Ace Ventura). This can lead you down a rabbit hole if you try to understand not just that one aspect of your identity and experience.

Your actual question comes off as a bit upset (but again, that may be cultural - I'm not American nor is English my native language although I did pick it up at an early age) so let me rephrase it in a way that makes me more inclined to answer it: "Why do you think that is?"

This still feels somewhat like proving the null hypothesis as "it's in our genes" is not normally the go-to explanation we accept when wondering about any random part of human behavior but let's start by turning it around. Sure, we can make up all kinds of just-so evopsych rationalizations why human males should be sexually attracted to post-pubescent young and healthy human females but the same reasoning would also predict a preference for a prolific pelvis (making it more likely they successfully give birth) and pregnant women (demonstrating actual fertility) or mothers (demonstrating child-rearing abilities as well as fertility) and so on. Ultimately these are all just-so stories to rationalize a pre-existing assumption about human behavior that contradicts actual archeological research (which adherents often explain by claiming archeology has been corrupted by ideology but let's not get into fallacious claims of being "free of ideology" and where all of that ultimately leads).

The answer then is simple: I say fetishization of youth in women is cultural rather than innate because it is not a consistent phenomenon throughout history nor even globally in the modern age.

It's important to distinguish between the two factors at play in child sexual abuse: sexual attraction (i.e. pedophilia) and power dynamics. This isn't unique to child sexual abuse. Regular rape also often is more about power than attraction. Everyone is familiar with the concept of prison rape and historically, sometimes even today, a male rapist of other men in a prison is not by default considered gay or effeminate and the act may be seen as demonstrating dominance, demeaning and emasculating the victim.

The reason I'm talking about "fetishization" is because our culture (and US culture particularly so) first of all very much embraces narratives of dominance as a positive, from competition over cooperation to the ahistorical "great man" narrative of historical events. This shouldn't be surprising as these narratives are useful to those benefitting from the status quo by placating those who don't, much like fear of hellfire and the promise of heaven placated those caught at the wrong end of medieval Europe's "divine right"-based feudal system (up to a point).

Our culture is very much male-centric (patriarchy is often misunderstood - even by some so-called feminists - to mean that all men are given power over all women but that's literally why intersectionality became a thing before being misrepresented in "oppression olympics" memes, so I'll avoid overloaded terms like those here). This goes hand-in-hand with the "traditional" perspective that the man/father is the head of the household and should rule it with determination and "tough love" the same way the state should lead the people (and the president the state), each family representing a scale model of the dynamics of society at large, justifying the authority of the state in the authority of the father and vice versa.

So youth and femininity in this case act as a stand-in for submissiveness. Under the "loving care" of a controlling father figure, a youthful woman is sexually pure/innocent ("uncorrupted") and meek/submissive. By evoking signifiers of childhood (e.g. the quintessential "cheerleader" costume, braces, pigtails, lispy speech, lollipops, pastels/pink) this is shifted further into an implausibly childlike innocence and paired with the sexual allure of "corrupting" that innocence (the fantasy of "defloration" leaving a "permanent mark"), based on the implicit understanding that the sexual act empowers the penetrating man and permanently devalues the penetrated "girl" unless she remains faithful to the man, should he want to "keep" her. Note that we don't even need to adopt the "sex-negative" feminist perspective on penetrative sex as inherently humiliating; the idea of penetrating = empowering and penetrated = disempowered is almost omnipresent in our culture as it is (note that this has nothing to do with passivity - receiving oral sex for example is seen as empowering - and arguably the framing around literal "penetration" alone is imprecise, as e.g. right-wing attitudes towards cunnilingus as being emasculating for a male "giver" show).

If all this cultural analysis is too wishy-washy for you, historical records still don't align with the idea that fetishization of female youth is innate. Young adults, i.e. women in their 20s or very early 30s, yes, sure, but not "sweet 16" or "barely legal". Arguably US culture has even gotten better about this over my lifetime, given that we went from the early Britney Spears school uniform sexualization to Megan Thee Stallion, and with the crackdown on public forums like the `r/jailbait` Subreddit, but there is still a very strong undercurrent, especially among conservative men.


> then what's with the pronouns

It's been the case that when I encounter people with non-normative pronouns they're trans, but you're right that isn't necessarily the case. My mistake!

I know I asked the initial question, but I guess I'm confused what exactly this conversation is about. Is the idea that people are only ever attracted to sixteen year olds because they learned to be? That feels like a challenging thing to demonstrate in the same way it being "in the genes" is, but perhaps I'm being overly reductive.


Nature vs nurture is not an either-or. I'm not saying "it" isn't "in the genes". I'm saying it's not just genes.

There's a wide range of possible age brackets, body types etc across all genders that can manifest traits most people would find attractive. Post-pubescent girls arguably aren't special in that sense. Especially if you don't isolate them out of their real-world context (which is where it stops being Oscar-winning Hollywood cinema and starts being child sexual abuse) that allows objectifying and dehumanizing them as "jailbait".

Where culture comes in is meaning. Taken at face value, a kid is just a kid. But culturally a kid represents something - naivety, hope, innocence, inexperience, whatever. This turns female youth into a fetish - something imbued with additional meaning. It's not actually the literal youthfulness that is culturally attractive in women (or else most people wouldn't react so violently against the idea of people sexually abusing minors), it's what that youthfulness represents. It's a male power fantasy.

Again, power fantasies aren't inherently a problem. What I'm arguing is that this one very much is a problem because it's so normalized it informs real-world social dynamics, i.e. where people start to forget it's a fantasy. Also I would argue the need for this specific fantasy is also not inherent (i.e. maleness does not inherently create a desire for absolute dominance over others). But I've rambled enough as it is.


> It can. In several public cases it seems fairly clear that there is a "community" aspect to these productions and many of these sites highlight the number of downloads or views of an image. It creates an environment where creators are incentivized to go out of their way to produce "popular" material.

So long as it's all drawn or generated, I don't see why we should care.

> I entirely disagree. Offenders tend to increase their level of offense.

This claim reminds me of similar ones about how video games are an "on-ramp" to actual violent crime. It needs very strong evidence to back it, especially when it's used to justify harsh laws. Evidence which we don't really have, because most studies of pedophiles are, by necessity, focused on the ones known to the system, which disproportionately means ones that have been caught doing some really nasty stuff to real kids.

> I entirely disagree. Offenders tend to increase their level of offense. This is about preventing the problem from becoming worse and new victims being created. It's effectively the same reason we harshly prosecute people who torture animals.

Strict liability for possession means that you can imprison people who don't even know that they have offending material. This is patent nonsense in general, regardless of the nature of what exactly is banned.

> That's a bold claim. Is it based on any facts or study?

It is based on the lack of studies showing a clear causal link. Which is not definitive for the reasons I outlined earlier, but I feel like the onus is on those who want to make it a crime with such harsh penalties to prove said causal link, not the other way around.

Note also that, even if such a clear causal link can be established, surely there is still a difference wrt imputed harm - and thus, culpability - for those who seek out recordings of genuine sexual abuse vs simulated? As things stand, in many jurisdictions, this is not reflected in the penalties at all. Justice aside, it creates a perverse incentive for pedophiles to prefer non-simulated CSAM.

> It's about the potential class of victims and the outrageous life long damage that can be done to them. The appropriate response to recognizing these feelings isn't to hand them AI generated material to sate their desires. It's to get them into therapy immediately.

Are you basically saying that simulated CSAM should be illegal because not banning it would be offensive to real victims of actual abuse? Should we extend this principle to fictional representations of other crimes?

As far as getting them into therapy, this is a great idea, but kinda orthogonal to the whole "and also you get 20+ years in the locker" thing. Even if you fully buy into the whole "gateway drug" theory where consumption of simulated CSAM inevitably leads to actual abuse in the long run, that also means that there are pedophiles at any given moment that are still at the "simulated" stage, and such laws are a very potent deterrent for them to self-report and seek therapy.

With respect to "handing them AI-generated material", this is already a fait accompli given local models like SD. In fact, at this point, it doesn't even require any technical expertise, since image generator apps will happily run on consumer hardware like iPhones, with UI that is basically "type what you want and tap Generate". And unless generated CSAM is then distributed, it's pretty much impossible to restrict this without severe limitations on local image generation in general (basically prohibiting any model that knows what naked humans look like).


> Are you basically saying that simulated CSAM should be illegal because not banning it would be offensive to real victims of actual abuse?

No, it's because it will likely lead to those consuming it turning to real life sexual abuse. The behavior says "I'm attracted to children." We have a lot of good data on precisely where this leads when entirely unsupervised or unchecked.

> but kinda orthogonal to the whole "and also you get 20+ years in the locker" thing.

You've constantly created this strawman but it appears nowhere in my actual argument. To be clear it should be like DUIs, with small penalties on first time offenses increasing to much larger ones upon repetition of the crime.

> it's pretty much impossible to restrict this

Right. It's impossible to stop people committing murder as well. It's also impossible to catch every perpetrator. Yet we don't strain ourselves to have the laws on the books, and it's quite possible the laws, and the penalties themselves, have a "chilling effect" when it comes to criminality.

Or, if your sensibility of diminished rights is so offended, then it can be a trade. If you want to consume AI child pornography you have to voluntarily add your name to a public list. Those on this list will obviously be restricted from certain careers, certain public settings, and will be monitored when entering certain areas.

Which sounds more appropriate to you?


> No, it's because it will likely lead to those consuming it turning to real life sexual abuse. The behavior says "I'm attracted to children." We have a lot of good data on where this precisely leads to when entirely unsupervised or unchecked.

For one thing, again, we don't have quality studies clearly showing that.

But let's suppose that we do, and they agree. If so, then shouldn't the attraction itself be penalized, since it's inherently problematic? You're essentially saying that it's okay to nab people for doing something that is in and of itself harmless, because it is sufficient evidence that they will inevitably cause harm in the future.

I do have to note that it is, in fact, fairly straightforward to medically diagnose pedophilia in a controlled setting - should we just routinely run everyone through this procedure and compile the "sick pedo list" preemptively this way? If not, why not?

> You've constantly created this strawman but it appears nowhere in my actual argument.

My "strawman" is the actual situation today that you were, at least initially, trying to defend.

> Right. It's impossible to stop people committing murder as well. It's also impossible to catch every perpetrator. Yet we don't strain ourselves to have the laws on the books, and it's quite possible the laws, and the penalties themselves, have a "chilling effect" when it comes to criminality.

That can be measured, and we did - and yes, they do, but it's specifically the likelihood of getting caught, not so much the severity of the punishment (which is one of the reasons why we don't torture people as form of punishment anymore, at least not officially).

The point, however, was that nobody is "handing" them anything. It's all done with tools that are, at least at present, readily available and legal in our society, and this doesn't change whether you make some ways of using those tools illegal or not, nor is it impossible to detect such private use unless you're willing to go full panopticon or ban the tools.


Laws don't need to be absolutely enforceable to still work. You probably will not go to jail for running that stop sign at the end of your street (but please don't run it).


AI-generated CSAM is real CSAM and should be treated that way legally. The image generators used to generate it are usually trained on pictures of real children.


[flagged]


Icky things were historically made illegal all the time, but most of those historical examples have not fared well in retrospect. Modern justice systems are generally predicated on some quantifiable harm for good reasons.

Given the extremely harsh penalties at play, I am not at all comfortable about punishing someone with a multi-year prison sentence for possession of a drawn or computer generated image. What exactly is the point, other than people getting off from making someone suffer for reasons they consider morally justifiable?


There's no room for sensible discussion like this in these matters. Not demanding draconian sentences for morally outraging crimes is morally outraging.


I think their point was they think the law should be based off of harms, not necessarily "morals" (since no one can seem to decide on those).


GP is saying that people who want this to be a crime are morally outraged that someone else might disagree, and so it's impossible to have a reasonable debate with them about it. They're probably correct, but it never hurts to try.


Oof, I fell victim to Poe's law


Assuming the person is a passive consumer with no messages or money exchanged with anyone, it is very hard to prove social harm or damage. Sentences should be proportional to the crime. Treating possession of CP as the equivalent of literally raping a child just seems absurd to me. IMO, just for the legal protection of the average citizen, simple possession should never warrant jail time.


CP is better described as "images of child abuse", and the argument is that the viewing is revictimising the child.

You appear to be suggesting that you shouldn't go to prison for possessing images of babies being raped?


For the record, I'm against any kind of child abuse, and 25 years for an actual abuser would not be a problem.

But...

Should you go to prison for possessing images of an adult being raped? What if you don't even know it's rape? What if the person is underage, but you don't know (they look adult to you)? What about a murder video instead of rape? What if the child porn is digitally created (AI, Photoshop, whatever)? What if a murder scene is digitally created (fake bullets, holes and blood made in video editing software)? What if you go to a mainstream porno store, buy a mainstream professional porno video, and later find out that the actress was a 15-year-old Traci Lords?


You don't go to prison for possessing images of adults being raped, last I checked. Or adults being murdered. Or children being murdered.

I don't think making the images illegal is a good way to handle things.


You do in some countries. For instance, knowingly possessing video of the Christchurch massacre is illegal in New Zealand, due to a ruling by NZ’s Chief Censor (yes, that’s the actual title), and punishable by up to 14 years in prison.

Personally, I prefer the American way.


It's a reasonable argument, but a concerning one because it hinges on a couple of layers of indirection between the person engaging in consuming the content and the person doing the harm / person who is harmed.

That's not outside the purview of US law (especially in the world post-reinterpretation of the Commerce Clause), but it is perhaps worth observing how close to the cliff of "For the good of Society, you must behave optimally, Citizen" such reasoning treads.

For example: AI-generated CP (or hand-drawn illustrations) are viscerally repugnant, but does the same "individual suffering and social damage" reasoning apply to making them illegal? The FBI says yes to both in spite of the fact that we can name no human that was harmed or was unable to give consent in their fabrication (handwaving the source material for the AI, which if one chooses not to handwave it: drop that question on the floor and focus on under what reasoning we make hand-illustrated cartoons illegal to possess that couldn't be applied to pornography in general).


> The FBI says yes to both in spite of the fact that we can name no

They have two arguments for this (that I am aware of). The first argument is a practical one, that AI-generated images would be indistinguishable from the "real thing", but that the real thing still being out there would complicate their efforts to investigate and prosecute. While everyone might agree that this is pragmatic, it's not necessarily constitutionally valid. We shouldn't prohibit activities based on whether these activities make it more difficult for authorities to investigate crimes. Besides, this one's technically moot... those producing the images could do so in such a way (from a technical standpoint) that they were instantly, automatically, and indisputably provable as being AI-generated.

All images could be mandated to require embedded metadata which describes the model, seed, and so forth necessary to regenerate it. Anyone who needs to do so could push a button, the computer would attempt to regenerate the image from that seed, and the computer could even indicate that the two images matched (the person wouldn't even need to personally view the image for that to be the case). If the application indicated they did not match, then authorities could investigate it more thoroughly.
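The verification scheme described above (embed the generation parameters, regenerate on demand, compare the results) can be sketched in miniature. This is a hypothetical illustration only: `fake_generate` is a deterministic stand-in for a real image generator, and the metadata format is invented for the example.

```python
import hashlib

def fake_generate(model: str, seed: int) -> bytes:
    # Stand-in for a deterministic generator: same (model, seed) -> same bytes.
    # A real verifier would re-run the named model with the stored seed.
    return hashlib.sha256(f"{model}:{seed}".encode()).digest()

def make_metadata(model: str, seed: int) -> dict:
    # Embed the parameters needed to regenerate, plus a hash of the output.
    image = fake_generate(model, seed)
    return {"model": model, "seed": seed,
            "sha256": hashlib.sha256(image).hexdigest()}

def verify(meta: dict) -> bool:
    # "Push a button": regenerate from embedded parameters, compare hashes.
    # The verifier never needs to view the image itself.
    regenerated = fake_generate(meta["model"], meta["seed"])
    return hashlib.sha256(regenerated).hexdigest() == meta["sha256"]

meta = make_metadata("example-model-v1", 42)
print(verify(meta))                       # True: provenance checks out
print(verify(dict(meta, sha256="0" * 64)))  # False: flag for investigation
```

In practice this only works if generation is bit-for-bit reproducible across hardware and software versions, which is a strong assumption for real diffusion models; a mismatch would therefore trigger further investigation rather than prove anything by itself.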

The second argument is an economic one. That is, if a person "consumes" such material, they increase economic demand for it to be created. Even in a post-AI world, some "creation" would be criminal. Thus, the consumer of such imagery does cause (indirectly) more child abuse, and the government is justified in prohibiting AI-generated material. This is a weak argument on the best of days... one of the things that law enforcement efforts excel at is just this: when there are two varieties of a behavior, one objectionable and the other not, but both similar enough that they might at a glance be mistaken for one another, they can greatly disincentivize one without infringing on the other. Being an economic argument, one of the things that might be said is that economic actors seek to reduce their risk of doing business, and so would gravitate toward creating the legal variety of material.

While their arguments are dumb, this filth is as reprehensible as anything. The only question worth asking or answering is: were AI-generated material legal, would it result in fewer children being harmed or not? It's commonly claimed that the easy availability of mainstream pornography has reduced the rate of rape since the mid-20th century.


> the individual suffering and social damage caused by the actions that he incentivizes

That's some convoluted way to say he deserves 25 years because he may (or may not) at some point in his life molest a kid.

Personally I think that the idea of convicting a man for his thoughts is borderline crazy.

Users of child pornography need to be arrested, treated, flagged, and receive psychological follow-up all along their lives, but sending them away for 25 years is lazy and dangerous, because when he gets out he will be even worse than before and won't have much to lose.


Respectfully, it's not pornography, it's child sexual abuse material.

Porn of/between consenting adults is fine. CSAM and sexual abuse of minors is not pornography.

EDIT: I intended to reply to the grandparent comment


Pornography is any multimedia content intended for (someone's) sexual arousal. CSAM is obviously a subset of that.


That is out of date.

The language has changed as we (in civilised countries) stop punishing sex work; "porn" is different from CSAM.

In the bad old days pornographers were treated the same as sadists


The language is defined by how people actually use it, not by how a handful of activists try to prescribe its use. Ask any random person on the street, and most of them have no idea what CSAM is, but they know full well what "child porn" is. Dictionaries, encyclopedias etc also reflect this common sense usage.

The justification for this attempt to change the definition doesn't make any sense, either. Just because some porn is child porn, which is bad, doesn't in any way imply that all porn is bad. In fact, I would posit that making this argument in the first place is detrimental to sex-positive outlook on porn.


> Just because some porn is child porn, which is bad, doesn't in any way imply that all porn is bad.

I think people who want others to stop using the term "child porn" are actually arguing the opposite of this. Porn is good, so calling it "child porn" is making a euphemism or otherwise diminishing the severity of "CSAM" by using the positive term "porn" to describe it.


I don't think the established consensus on the meaning of the word "porn" itself includes some kind of inherent implied positivity, either; not even among people who have a generally positive attitude towards porn.


"Legitimate" is probably a better word. I think you can get the point though. Those I have seen preferring the term CSAM are more concerned about CSAM being perceived less negatively when it is called child porn than they are about consensual porn being perceived more negatively.


Is it still okay to say "pirated movies" or is that not negative enough since movies are okay? Should we call it "intellectual property theft material"?


> The language is defined by how people actually use it,

Precisely

Which is how it is used today

A few die-hard conservatives cannot change that


Stop doing this. You are confusing the perfectly noble aspect of calling it abuse material to make it victim centric with denying the basic purpose of the material. The people who worked hard to get it called CSAM do not deny that it’s pornography for its users.

The distinction you went on to make was necessary specifically for this reason.


In that case, we should all get 25 years for buying products made with slave labour.


Who do such harsh punishments benefit?



