Many of those cancers are not harmful or would be killed off on their own. And the cost of diagnosis to the patient is not free either: it causes a lot of anxiety and stress, which have large negative health effects of their own. Overdiagnosis is real and also bad. Medical stuff is just really hard.
No, I check account history when the text is obviously LLM-generated. I mostly point it out because if I don't make it abundantly clear how obvious the spambot is by its behaviour, I will get people telling me that it could totally be a human and that I'm making a false accusation.
The Android app scrollguard helped me. It stops YouTube Shorts, Reels, and TikTok from being opened. It needs sweeping permissions to monitor my phone, which could be scary. But as an addict you have to admit when you need to check yourself into rehab. And the phone is the drug.
Short form video has been a total break from previous media and social media consumption patterns. Personally I would support a ban on algorithmic endless short form video. It's purely toxic and bad for humanity
People are way too comfortable banning things these days. This is where the term 'nanny state' comes from. A subset of the population doesn't have self control? Ban it for everyone. Even if it's a wildly popular form of entertainment with millions of creators sharing their lives, who cares, we know better.
Even most liberal societies tend to ban addictive things. Alcohol, smoking, gambling, drugs, they are regulated almost everywhere, in one form or another.
I think that algorithmic social media should likewise be regulated, with, at the very minimum, a ban for minors.
Note that my focus here is on the "algorithmic" part. I'm fine with little or no regulation for social media where your feed is just events in chronological order from contacts you are subscribed to, like an old bulletin board or the original Facebook.
Also, I think we should consider companies that provide algorithmic social media responsible for what they publish in your feed. The content may be user generated, but what is pushed to the masses is decided by them.
It's way more complex than "no self control". Social media is addictive by design and is peddled at such scale that it is literally impossible to ignore. It's also backed by billions upon billions of dollars.
Pitting the average person up against that, then blaming them for having "no self control" once they inevitably get sucked in is not a remotely fair conclusion.
People keep saying this and yet, I have never used any of these short form video services or really any social media outside of desktop websites like hackernews and reddit. Even on reddit I just subscribe to a few niche and mostly technical subreddits. It seems extremely easy to ignore it all.
Considering the median amount of time people spend on social media daily, it sure does not seem to be so easy for the average person (as was implied in the comment you replied to). I've got pretty good self-control when it comes to the common vices, but I can't see why that would generalise to everyone else.
It's easy for you and me. At the same time, it doesn't seem right to make a business of intentionally going after the people who get addicted to this, like flavored cigs meant to appeal to teenagers. And these social media companies have a paper trail of internal research on user engagement.
But I'm still wary of the motives behind these bans because they seem to be about controlling information, not addiction.
You've learned they're bad for you. What if you grew up with it from the age of 3? You'd grow up with a dependence on it and no chance to see that it's bad.
> People are way too comfortable banning things these days. This is where the term 'nanny state' comes from. A subset of the population doesn't have self control? Ban it for everyone. Even if it's a wildly popular form of entertainment with millions of creators sharing their lives, who cares, we know better.
Europe wants to ban algorithmic recommendation. You're attacking a straw man: banning all content from creators. If you have any valid arguments, you should bring them to the discussion instead of creating imaginary enemies.
Banning harmful design patterns is a must to protect citizens even if it ruffles the feathers of those profiting from their addiction.
> A subset of the population doesn't have self control?
please fix this to
A subset of the population who has not yet reached the age of consent
I think society broadly accepts that there are different expectations for children and adults; the line is currently officially drawn somewhere around 18-21 years old.
> But in Europe you can drink at 14. Age of consent is also 14.
That is hilariously general. You're conflating a lot of different nations there. In practice it's different depending on the nation; consent is usually 16 and alcohol is ~18.
It's more complicated than that: the age of consent you're listing doesn't universally apply and is mostly that low when both people are within that age bracket. In Germany, for example, the age of consent is effectively 16; it's more that there's wiggle room down to 14 if both parties are under 21.
The videos are the entertainment, not the endless recommendation algorithm.
Additionally, this is not about self control. The claim is that the algorithm is designed to exploit users. Insiders (including a designer of infinite scroll!) have admitted as much going back years: https://www.bbc.com/news/technology-44640959
We should be uncomfortable with companies spending huge amounts of money to research and implement exploitative algorithms. We did something about cigarette companies advertising to kids. This action is along those lines.
I would much rather people not break things down into false dichotomies. Also, we should strive to give our children at least "good" options, and not settle for "less bad".
When most of the market using it is abusive, and a source of abuse, it makes sense to stop that abuse from continuing while it's being investigated, or better understood by the population/generations at large.
Hard not to think of the "hard times create strong people, strong people create good times, good times create weak people, weak people create hard times" meme here.
The thing is, people who live in Europe actually like that companies aren't allowed to take advantage of people in every way conceivable.
I have an idea: if you don't like regulation that protects people, why don't you fuck off to your own country and advocate for that in whatever dystopian hellhole you came from?
1. The reactions to banning drunk driving: "It's kind of getting communist when a fella can't put in a hard day's work, put in 11 to 12 hours a day, and then get in your truck and at least drink one or two beers."
2. Mandatory seatbelts: "This is Fascism"
You're going to balk at just about anything that comes down the line - I guarantee it.
The "subset of the population" is not small, and there is no easy way to protect the most vulnerable.
> it's a wildly popular form of entertainment with millions of creators sharing their lives
I don't think we should be rewarding those who make a living by creating "content" that serves nothing but a dopamine rush, and you can bet that those who put in the effort to create valuable content would prefer to have one less channel where they are forced to churn out material just to satisfy the algorithm overlords.
It's not about the content, but the format and the economic pressure that corporations exert over everyone.
If you want to distribute short videos on a website that lets you choose what you want after searching and deliberately clicking a button to play it, by all means feel free. But the current TikTok mechanism removes all agency and is an extreme form of mind pollution.
One way is criminalizing the victims, another is going after the platforms. I'm willing to wager on who will be on the receiving end of enforcement here :)
He meant that as an indication of how they're enforcing things. If the whole "arrest the victim" thing were as the grandparent "joked about", they wouldn't go after X France, but instead after whoever was viewing the content.
Magnitude is also important. If you overpay by 100% on a stick of gum, that's a bad deal but unlikely to have any large consequences for you: 100% too much on a $0.50 stick of gum is only an extra $0.50. But if you overpay for your house by 100%, that's hundreds of thousands of dollars, and you're probably in a world of pain.
> Reduced demand for oil reduces the quantity of oil extracted
That is not true. Reduced price leads to higher demand. This is economics 101.
> The price drops and hardware to extract oil stops being produced
Oil extraction costs differ vastly among countries, and there is a lot of potential for increased productivity and efficiency when margins shrink; price pressure is a driver of innovation. And countries like Saudi Arabia and Russia have a very high incentive to keep extracting and selling oil, because their economies rely on it.
Because there, you might have learned that the basic economic principles you describe as "economics 101" are the equivalent of the "spherical cow in a frictionless vacuum"-type examples you get in introductory physics classes.
In the real world, demand is affected by all kinds of things, and sometimes, a product or service is just no longer desired by the population. Do you think that if you were selling buggy whips for $0.05 each, you'd be able to make a profit on them today? Of course not, because people don't need them. You'd barely sell any, and those purely as a novelty.
While there's still a lot of work to do to make it fully possible, and certain political groups are actively working against it, the world at large recognizes that getting off of fossil fuels is an important goal. Demand for oil is going to continue to drop—maybe not monotonically, but overall—regardless of what the price of oil does.
Your entire point hinges on EVs drying up the oil market to the point that there is no more demand. But ICEs don't run on oil, they run on gas or diesel, and oil is used in far more industries (aviation, shipping). Plus, you cannot account for business models that become viable that weren't viable before. People come up with new ideas all the time.
My point is not that if there are no more ICE cars, there will be no market for oil. It's that the portion of the market for oil that has, heretofore, serviced ICE cars is starting to disappear, and it won't be coming back, nor will it be repurposed to service anything else, to any significant degree.
The broader point is that, because of the environmental consequences, people all over the world are diligently working on ways to eliminate the other markets for oil. And they will also not be coming back.
Oil is on its way out, period. Not this year, not this decade, possibly not this century...but it's going. And the world will be much, much better off for it.
(It's just possible that there will be some tiny fraction of the current uses for oil that we can't find any meaningful replacement for, but it's not going to be much.)
People are way too worried about security, imo. Statistically, no one is targeting you to be hacked. By the time you are important and valuable enough for your home equipment to be a target, you will have hired someone else to manage this for you.
I think this is a very dangerous perspective. A lot of attacks on infrastructure are automated; just expose a Windows XP machine to the internet for a day and see how much malware you end up with. If you leave your security unchecked, you will end up attacked, not by someone targeting you specifically, but having all your data encrypted for ransom will still be a problem for you (even if the attacker doesn't care about YOUR data specifically).
Once, when I was young and inexperienced, I left a server exposed to the Internet by accident (a user with username postgres, password postgres). Within hours the machine had been hacked to run a botnet. Was I stupid? Yes. But I absolutely wasn't a high-profile enough person to "be a target" - clearly someone was just scanning IP addresses.
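For anyone curious what that kind of drive-by compromise looks like from the other side, here's a minimal, hypothetical sketch (assuming Python and the psycopg2 driver; the host is a placeholder) of an automated probe for exactly that postgres/postgres default login. Point it at your own machine to check you're not repeating my mistake.

    # Hypothetical illustration only: the kind of automated probe that likely
    # found my misconfigured server. Host below is a placeholder, not a target.
    import psycopg2

    def default_login_works(host, port=5432):
        """Return True if the common postgres/postgres default login succeeds."""
        try:
            conn = psycopg2.connect(host=host, port=port, user="postgres",
                                    password="postgres", dbname="postgres",
                                    connect_timeout=3)
            conn.close()
            return True
        except psycopg2.OperationalError:
            return False

    if __name__ == "__main__":
        # Check your own box; an exposed default login is an open door.
        print(default_login_works("127.0.0.1"))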
I would bet that OpenAI and Google will find a way to boost the embedded ads in the LLM results shown to you, based on an auction over how valuable you and your query are.
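Purely speculative, but here's a toy sketch of what such an auction could look like (Python; the advertiser names, numbers, and scoring formula are all invented for illustration and are not how OpenAI or Google actually price ads):

    from dataclasses import dataclass

    @dataclass
    class Bid:
        advertiser: str
        amount: float     # dollars offered per impression
        relevance: float  # 0..1, how well the ad matches the query

    def run_auction(bids, query_value):
        """Rank bids by bid * relevance * estimated query value; the winner
        pays just enough to outrank the runner-up (second-price style)."""
        ranked = sorted(bids, key=lambda b: b.amount * b.relevance * query_value,
                        reverse=True)
        winner, runner_up = ranked[0], ranked[1]
        price = (runner_up.amount * runner_up.relevance) / winner.relevance
        return winner.advertiser, round(price, 2)

    # Made-up example: a high-value query attracts two bidders.
    print(run_auction([Bid("travel-site", 2.0, 0.9),
                       Bid("credit-card", 3.0, 0.4)],
                      query_value=1.5))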