So they basically put a wrapper around Claude in a container, which lets you send messages from WhatsApp to Claude and acts somewhat like a Siri on steroids.
No, how this works is people sync their Google Calendar and Gmail so it can act as their personal assistant, and then the agent gets prompt-injected by a malicious “moltbook” post and leaks their data.
Only if you let it. And for those who do, a place where thousands of these agents congregate sounds like a great target. It wouldn’t matter if it were just a throwaway VPS, but people are connecting their real data to these things.
I am. Modern computers and network connections are so fast that this amounts to literally nothing. It's standard internet background noise and it's really not a problem.
> He probably thinks the internet works on static 5k html pages, while the norm is 100kb, dynamically generated pages.
I just work on web stuff that people actually use. It's 2026, thousands of requests per second is nothing. You'll probably be fine even with stock apache2 and some janky php scripts.
A single gigabit line will serve a 100 kB page a thousand times a second without issues.
Dynamically generated pages you can't easily serve at rates in excess of tens of thousands of requests per second from commodity hardware are extremely rare.
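Rough back-of-envelope math, as a sketch only (assumes the pipe is the bottleneck, roughly 5% protocol overhead, and ignores TLS and application CPU cost):

    # Back-of-envelope: 100 kB pages over a 1 Gbit/s link
    link_bits_per_s = 1_000_000_000   # 1 Gbit/s
    overhead = 1.05                   # assumed TCP/IP + TLS framing overhead
    page_bytes = 100 * 1000           # 100 kB page

    usable_bytes_per_s = link_bits_per_s / 8 / overhead
    print(round(usable_bytes_per_s / page_bytes))  # ~1190 pages per second

So bandwidth-wise, "a thousand pages a second" checks out; whether the application code keeps up is a separate question.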
i don’t think you realize how fast modern CPUs are. If this stresses your server out, you probably have no business hosting things publicly on that server. This person is hosting stuff on Vercel using serverless which is the root of their problem.
4 requests per second is just noise. it’s like complaining about car noise when deciding to buy a house next to the freeway. Exposing things publicly on the internet means _anyone_ can try talking to your server. Real users, bots, hackers, whatever. You can’t guarantee bots are bug-free!
Dynamic content is _typically_ served to logged in users. Content that is public facing is typically cached, for obvious reasons. Of course Meta should fix this…but using Vercel and serverless in this manner is a very poor choice.
Meta isn’t going to fix this because they have your mindset.
Meanwhile, my website with 48M pages over 8 domains is getting hammered with over 200 req/s 24/7 from AI bots, in addition to the regular search engine bots. It seems like every day new bots appear that all want to download every single one of my URLs.
To me it’s not background noise. It’s a problem. It simply burns a lot of CPU and bandwidth. I could get by with 95% fewer resources and have faster response times for my actual users if these bots would just bugger off.
even 100 kB dynamically generated pages should be a piece of cake. if it's CRUD like (original op's site is), it should be downright trivial to transfer that much on like... shared hosting (although even a VPS would be much better).
(in original op's case, i clocked 197 requests using 20.60 MB while browsing their site for a little bit. most of it is static assets and i had caching disabled so each new pageload loaded stuff like the apple touch icons.)
honestly you could probably put it behind nginx for the statics and just use bog standard postgres or even prolly sqlite. nice bonus in that you don't have to worry about cold start times either!
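for illustration only, a minimal sketch of that boring stack: nginx (or any static file server) in front, dynamic pages read straight out of SQLite. flask here is just an example choice, and the "posts" table and route are made up:

    # app.py -- hypothetical CRUD page served from SQLite (schema is assumed)
    import sqlite3
    from flask import Flask, abort

    app = Flask(__name__)

    @app.route("/post/<int:post_id>")
    def post(post_id):
        # one connection per request is plenty at a few requests per second
        conn = sqlite3.connect("site.db")
        conn.row_factory = sqlite3.Row
        row = conn.execute(
            "SELECT title, body FROM posts WHERE id = ?", (post_id,)
        ).fetchone()
        conn.close()
        if row is None:
            abort(404)
        return f"<h1>{row['title']}</h1><p>{row['body']}</p>"

    if __name__ == "__main__":
        app.run()  # put gunicorn + nginx in front of this for real traffic

even a handler this naive comfortably covers the handful of requests per second being complained about, and nginx serves the static assets basically for free.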
The "30 days or it dies" constraint came from noticing that my abandoned blogs stressed me out more than not having a blog at all. They just sat there, judging me.
Lapse is an attempt to make abandonment the default. If you stop, it disappears cleanly. No guilt, no digital graveyard.
Curious if this resonates with anyone or if I've just built a very niche anxiety management tool.
The main value proposition of these full-access agents is that they have access to your files, emails, calendar etc. in order to manage your life like a personal assistant. No amount of containerization is going to prevent emails being siphoned off from prompt injection.
You probably haven't given it access to any of your files or emails (others definitely have), but then I wonder where the value actually is.
> because they’re trying to normalise the AI’s writing style,
AIs use em dashes because competent writers have been using em dashes for a long time. I really hate the fact that we assume em dash == AI written. I've had to stop using em dashes because of it.
Likewise, I’m reluctant to use em dashes these days because unenlightened people immediately assume it’s AI. I used em dashes way before AI decided they were cool.
LaTeX made writing em dashes very easy, to the point that I would use them all the time in my academic writing. It's a shame that perfectly good typography is now a sign of slop/fraud.
Right. So I guess that's my quibble with the term sacrifice (shared by Rudolf Spielmann)
But what's interesting to me is the counterfactual: outside of these 3 queen moves he would have lost the entire advantage. So it was a tactical shot, like capturing the golden snitch in Harry Potter.
Sure, I get what you're saying. It's still a sacrifice, but the compensation is just mate in 2, so there's no real "sacrifice" here.
That being said, any sacrifice that doesn't guarantee a better (or at least equal) position isn't a sacrifice either, it's just "hope chess", aka a bad move. In Blitz or Bullet you can make the case for a "bad" sacrifice for positional complexity and putting time pressure on your opponent to make accurate defensive moves.
In the Opera game, Black just played a poor game start to finish: giving up the bishop for the knight, pushing the b-pawn while the king wasn't castled.
I had two “brilliant” moves in one chess.com game today. One was a bishop sacrifice that would have led to mate in three. The other was a queenside castle that the engine wanted me to do sooner. I suck at chess, although I did see the bishop sacrifice as the right move. The engine rated me at 1500 for the game.
I’m confused as to what these claw agents actually offer.