You have to be joking. I tried Codex for several hours and it has to be one of the worst models I’ve seen. It was extremely fast at spitting out the worst broken code possible. Claude is fine, but what they said is completely correct. At a certain point, no matter what model you use, LLMs cannot write good working code. This usually occurs after they’ve written thousands of lines of relatively decent code. Then the project gets large enough that if they touch one thing they break ten others.
I beg to differ, and so do a lot of other people. But if you're locked into this mindset, I can't help you.
Also, Codex isn't a model, so you don't even understand the basics.
And you spent "several hours" on it? I wish I could pick up useful skills by flailing around for a few hours. You'll need to put more effort into learning how to use CLI agents effectively.
Start with understanding what Codex is, what models it has available, and which one is the most recent and most capable for your usage.
I don't think such an idea is consistent with the existence of trashbin features, or the non-insignificant use of data recovery tools on normally operating devices.
I can definitely see the perspective in clarifying that ChatGPT didn't lose anything, the person did, but that's about it.
You’re making up your position here though. They have a stable API. It’s the built-in WordPress API. It’s on almost every WordPress site on the planet. It’s about the most stable API you could use. You just want to use HTML and not be bothered with an API, so don’t pretend like you would do something else if it was “stable”.
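For what it's worth, the built-in API being referenced here is the WordPress REST API, which stock WordPress sites expose under `/wp-json/`. A minimal sketch (the site URL is hypothetical, and the sample response body is trimmed down to just the fields used):

```python
import json

def posts_endpoint(site_url: str, per_page: int = 10) -> str:
    """Build the standard WordPress REST API URL for recent posts."""
    return site_url.rstrip("/") + f"/wp-json/wp/v2/posts?per_page={per_page}"

def extract_titles(posts_json: str) -> list[str]:
    """Pull rendered titles out of a /wp/v2/posts response body."""
    return [p["title"]["rendered"] for p in json.loads(posts_json)]

# Hypothetical site, plus a trimmed-down sample of what the endpoint returns.
url = posts_endpoint("https://example.com", per_page=5)
sample = '[{"title": {"rendered": "Hello world!"}}]'
print(url)
print(extract_titles(sample))
```

Fetching `url` with any HTTP client returns JSON like `sample`; no plugin or HTML scraping involved, which is the point about stability.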
And that website is hosted somewhere, you’re using several layers of network providers, the registrar has control over your domain, the copper in the ground most likely has an easement controlling access to it so your internet provider literally can just cut off access to you whenever they want, if you publish your apps to a registry the registry controls your apps as well.
There are so many companies that control access to every part of your life. Your argument is meaningless because it applies to _everything_.
A trustless society is not one that anyone should want to be a part of. Regulations exist for a reason.
Not wanting centralization under one company does not equal advocating for "trustless society".
All the things you mentioned (registrars, ISPs, registries, etc) have multiple alternative providers you can choose from. Get cut off from GCP, move to AWS. Get banned in Germany, VPS in Sweden. Domain registration revoked, get another domain.
Lose your Apple ID, and you're locked out of the entire Apple ecosystem, permanently, period.
Even if a US federal court ordered that you could never again legally access the internet, that would only be valid within the US, and you could legally and freely access it by going to any other country.
So in fact, rather than everything being equivalent to Apple's singular control, almost nothing is equivalent (really, only another company with a similarly closed ecosystem).
If aws decided to block your access to their ecosystem you would lose so so so much more than Apple blocking your access to theirs. If the US decided what you said, t1 networks would restrict your access across much of the planet.
Your logic makes no sense since you can easily switch to Google or whatever other smartphone providers there are (China has a bunch).
But of course those providers can also cut you off, so what I said still applies.
First off, AWS cutting off your AWS account does not block you from visiting other websites that use AWS, it just means you can't use AWS itself as a customer. Apple's ecosystem OTOH means that OP's issue with iCloud disabled their account globally across all Apple services, not just within iCloud itself (and in fact, to further illustrate the difference, losing access to your AWS console account doesn't cut off your account for Amazon.com shopping).
> Your logic makes no sense since you can easily switch to Google or whatever other smartphone providers there are (China has a bunch).
The person above was asking why they *as a developer* would want to risk their time and effort developing for iOS. Any work developing for iOS in e.g. Swift or Objective-C is not portable to other platforms like Android. If they lose their Apple account, any time they spent developing against iOS-specific frameworks is totally wasted; that's their point.
> If the US decided what you said, t1 networks would restrict your access across much of the planet.
No offense, but you have no clue what you're talking about. There are in fact court orders where internet access is restricted as part of criminal sentencing. Here's a quick example guide [1]. No part of that involves network providers cutting you off.
How on earth do you imagine a "t1 network" provider would determine that a person using their network from the UK is actually a person from the US with a court order against using their network? And to be clear, the court orders don't compel ISPs to restrict access, or attempt to enforce blocks like you are suggesting.
That applies to any device you buy, yes, even if you plan on putting Graphene on it. So you’re getting downvoted because your comments are pointless and disingenuous. Also, you can jailbreak an iPhone, so no, it’s not bricked.
Yeah literally the exact same thing can happen on android and windows. The solution is regulation, not ridiculous solutions like telling billions of people to back up their own stuff.
I would support legislation that enforces a right to data export for 6 months, in human-readable file formats, or a physical equivalent like sending a USB stick in the mail.
> I effectively have over $30,000 worth of previously-active “bricked" hardware. My iPhone, iPad, Watch, and Macs cannot sync, update, or function properly.
(I assume these can be re-sold? They do mention that they can't sign out)
> I have lost access to thousands of dollars in purchased software and media.
Should the "purchased" software and media be within the data export scope?
> I don’t have a 6TB device to sync them to, even if I could.
...yeah.
But let's say we limit ourselves to stored bits.
How should the service identify the person asking for data export? Does your regulation imply government id registration for all internet services? Is that what you actually want?
What if the service is e2ee? How do they deliver "human readable file formats"? Are we also banning e2ee?
What do compliance requirements imply for people's ability to start competing services?
You are proposing to replace a very tiny bit of personal responsibility (having backups) with a very intrusive, and highly consequential, legal mechanism.
EDIT: Though I would, of course, support a requirement for these services to properly warn users (on the registration page, not buried in TOS small print), and provide thorough instructions for making backups to external storage connected to any of the devices they support.
> Should the "purchased" software and media be within the data export scope?
I presume most of this is licensed, so no
> Does your regulation imply government id registration for all internet services?
No
> How should the service identify the person asking for data export?
Username, password, pin, MFA, security questions. Anything already in use for identification
> What if the service is e2ee?
Then the encrypted data is provided
> How do they deliver "human readable file formats"?
It can still become human readable if the user took proper care of their private key
> Are we also banning e2ee?
No
> What do compliance requirements imply for people's ability to start competing services?
If your service can't provide reliable access to backups then presumably you will already not do well competing in any market where user data is valuable. That should be at the forefront of the service model. Unless you don't care about interoperability like Apple
> You are proposing to replace a very tiny bit of personal responsibility (having backups) with a very intrusive, and highly consequential, legal mechanism.
Not really. If export functionality isn't already built out then it should be
Why would you need it to be end to end encrypted anyway? You’re running it. Set it to only upload photos when you’re on your home network and you’re fine. Or fork it and make a PR and make it e2e encrypted.
You can’t just “fork it and make a PR and make it e2e encrypted”. All the features run serverside, e2ee is fundamentally impossible because of its design, of which you seem to know fuck all.
I’m being dismissed, but I run a rather large homelab and I still want my photos handled iCloud-like, where end devices decrypt and run ML. Immich is a Google Photos clone where you give it everything and some server does all the magic.
Backblaze doesn’t erase after 30 days… I’ve had a computer be offline from it for several months and it still retained all data. And you can use the Backblaze Docker container to run on a NAS, much much much cheaper than B2.
Wasabi is much cheaper than AWS as well.
Finally, the best solution for backing up your iCloud Photos is definitely Immich. Set it up on your own NAS or a VPS, back up to that, and then back up that server to S3 storage using rsync or restic. I’ll note that I still back up to Backblaze because it’s so dang cheap.
I spent months trying to find the best setup a few months ago and this is by far the cheapest.
But still, this shouldn’t be required for normal people. They should get what they pay for.
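To sketch the restic half of that pipeline (the repo URL, bucket name, and paths below are all hypothetical placeholders; restic reads credentials from the usual RESTIC_PASSWORD / AWS_* environment variables):

```python
import subprocess  # only needed if you uncomment the run() call below

# Hypothetical values: substitute your own S3-compatible endpoint, bucket, and path.
REPO = "s3:s3.us-west-002.backblazeb2.com/my-photo-backups"
SOURCE = "/mnt/immich/library"

def restic_cmd(*args: str) -> list[str]:
    """Build a restic invocation against REPO. restic takes the repo password
    from RESTIC_PASSWORD and S3 credentials from AWS_ACCESS_KEY_ID /
    AWS_SECRET_ACCESS_KEY."""
    return ["restic", "-r", REPO, *args]

backup = restic_cmd("backup", SOURCE)
prune = restic_cmd("forget", "--keep-daily", "7", "--keep-monthly", "12", "--prune")

print(" ".join(backup))
# subprocess.run(backup, check=True)  # uncomment on a box with restic installed
```

Drop the two commands in a cron job or systemd timer on the NAS/VPS and the Immich library gets snapshotted off-site with retention handled for you.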
> It has to phone home every 30 days or it will erase anything that is stored on an external drive
It’s actually more nuanced. It will back up files on a USB attached drive. If it doesn’t see the drive attached for 30 days, it will erase the backup.
If you have your computer off for more than 30 days and you bring your computer back on and the USB drive isn’t attached when it connects to BackBlaze, it will erase it.
Only if you’re backing up nothing, using non-encrypted files, and making sure you don’t delete anything (rsync with delete turned off). I tested this not even three months ago. I hit $30 with only 3 TB of data in Deep Archive, while Wasabi AND Backblaze cost less than that. No need to even trust a single provider. If you’re never changing your files AND you don’t care about encrypting them, then yes, GDA is fine and pretty cheap. Otherwise Wasabi and Backblaze get more done for less cost.
I understood what you meant about GDA. It just doesn’t come out to that unless you put stuff in and never touch it, which is a valid use case! Don’t get me wrong, I planned on doing the same, but with restic it would cost so, so much more than Wasabi and Backblaze that it was a massive waste of money, and it really revealed Amazon’s strategy, which is to lock your data away and charge you to access it.
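The arithmetic behind that: Deep Archive's storage rate alone is tiny, and the bill comes from touching the data. A rough calculator (rates are ballpark figures assumed for illustration, not current AWS pricing, and it ignores per-request fees and the 180-day minimum storage charge, both of which restic-style repos, which constantly rewrite pack files, run into):

```python
# Illustrative per-GB rates, NOT quotes; check current pricing before relying on this.
DEEP_ARCHIVE_STORAGE = 0.00099  # $/GB-month for storage at rest
BULK_RETRIEVAL = 0.0025         # $/GB for the cheapest (bulk) retrieval tier

def monthly_storage_cost(tb: float, rate: float = DEEP_ARCHIVE_STORAGE) -> float:
    """Monthly cost of data sitting untouched."""
    return tb * 1024 * rate

def retrieval_cost(tb: float, rate: float = BULK_RETRIEVAL) -> float:
    """One-off cost of pulling data back out."""
    return tb * 1024 * rate

print(round(monthly_storage_cost(3), 2))  # 3 TB at rest: ~$3/month
print(round(retrieval_cost(3), 2))        # pulling all 3 TB back costs more than
                                          # two months of storage, before request fees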
I wasn’t talking about B2 though, I was talking about Backblaze personal, which you can run on a NAS with a docker container.
The iMessage db is literally just a SQLite db. If you have a Mac you can read the entire thing with an AppleScript. It’s really easy, from what I remember from years ago.
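If anyone wants to try it, here's a rough sketch in Python rather than AppleScript (schema and column names are from memory and may differ across macOS versions; the date math assumes the modern nanoseconds-since-2001 encoding, and reading chat.db requires granting your terminal Full Disk Access):

```python
import sqlite3
from datetime import datetime, timedelta
from pathlib import Path

APPLE_EPOCH = datetime(2001, 1, 1)

def apple_time(raw: int) -> datetime:
    """chat.db stores dates as nanoseconds since 2001-01-01 on recent macOS
    (older versions used plain seconds); this handles the nanosecond form."""
    return APPLE_EPOCH + timedelta(seconds=raw / 1_000_000_000)

QUERY = """
SELECT handle.id, message.text, message.date
FROM message
JOIN handle ON message.handle_id = handle.ROWID
ORDER BY message.date DESC
LIMIT 10
"""

db = Path.home() / "Library" / "Messages" / "chat.db"
if db.exists():  # only on a Mac, with Full Disk Access granted
    with sqlite3.connect(f"file:{db}?mode=ro", uri=True) as conn:
        for sender, text, date in conn.execute(QUERY):
            print(apple_time(date), sender, (text or "")[:60])
else:
    print(apple_time(0))  # demo of the timestamp math off-Mac
```

Opening the file read-only (`mode=ro`) avoids touching the live database Messages is using.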
That is in no way a reasonable suggestion. You’re suggesting a Raspberry Pi (first red flag) along with a command-line program. This is not reasonable in any sense of the word. Imagine me suggesting that everyone should set up their own Unraid server to make sure they can still stream movies and videos if Netflix goes down. Imagine me telling you to set up a foundry to build your own engines because you can’t trust big car manufacturers. This is the case with everything in your life.
Regulations exist because it’s impossible for any one person to handle everything that needs to be handled.
>That is in no way a reasonable suggestion. You’re suggesting a raspberry pi (first red flag) along with a command line program. This is not reasonable in any sense of the word.
Uh, the guy writes programming books for a living.
But since he's all-in Apple he could just use Time Machine to some sort of NAS and get a more streamlined version of the above.
It’s not reasonable because you’re assuming that 1) they have the time to set up that network infrastructure 2) their skills align with that 3) they have the knowledge to do so 4) they live in a country without strong regulations that would make such a thing unnecessary.
Just because you know Objective-C doesn’t mean you know a damn thing about Raspberry Pis, backup programs, NASes, or anything else. It doesn’t mean you know or want to manage your own network infrastructure. They’re a Mac app programmer, not a Linux professional, not a micro-computer professional, not a network engineer, not a sysadmin.
Time Machine wouldn’t work here, because it needs the files locally and he’s already stated he doesn’t have a 6tb drive.
1. I am pretty sure OP could manage to plug in an ethernet cable.
2. Again, he should be able to manage to set up a computer and plug in a USB drive, even if not familiar with the particular OS. People are not that narrow
3. I am pretty sure he could manage to install and run some backup software on his devices
4. I assume you are missing a "not" there, but regulations have clearly not solved the problem; in fact, it's likely AML regulations caused it.
> he’s already stated he doesn’t have a 6tb drive
Someone who uses a $500 gift card to renew subscriptions could afford one
So that solves it for OP, but not for every single other person out there who's not as tech proficient and relies on iCloud for backups.
Which is not an unreasonable thing at all, considering it's literally marketed as a storage solution for your photos, and on top of that it even encourages users to store originals only in the cloud.
It won't help people who only own an iOS device, but setting up a Time Machine backup is aggressively recommended by OS-level notifications for every macOS user.
A simple USB hard drive will actually do; no need for a NAS. The only action required to implement the proposed solution is to check "Keep all data on this Mac" in both the Photos and iCloud Drive settings. And to be extra cautious, add a second backup drive from another vendor (to be extra extra cautious, don't use Time Machine for the second drive).
For the specific case of those who don't have a big enough internal drive, they might need to store the data on an external drive. But if you do have 6TB of pictures, you should ask yourself whether RAID1 or RAID6 isn't warranted at that stage.
In conclusion, it's not a binary decision; there is a lot of room between "I solely rely on the cloud" and "never trust the cloud".