>A decade ago, Python was pretty rough around the edges in production.
Not really.
Environment/dependency management was never an actual problem. People act as if you update a version and everything breaks. Even back then, venv existed for exactly this reason, and you could pin versions in setup.py or requirements.txt.
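A minimal sketch of that workflow, driven from Python itself (`venv` ships in the standard library; the project layout and the `requests` pin are hypothetical examples):

```python
# Create an isolated environment and pin exact versions so an
# upstream release can never break the build.
import venv
import pathlib

venv.create(".venv", with_pip=False)  # with_pip=False keeps creation fast/offline

# Exact pins mean upgrades never surprise you:
pathlib.Path("requirements.txt").write_text("requests==2.31.0\n")

# then, from a shell: .venv/bin/pip install -r requirements.txt
```

Everything the environment installs lives under `.venv`, so deleting that directory throws the whole thing away cleanly.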
Type safety should, in theory, be covered by unit tests. If you accidentally assign the wrong kind of value in a dynamic language, say a string where a dict is expected, your functionality breaks and the tests catch it. In practice, it's really not that hard to use distinct variable names. In my experience, things only start getting confusing when you bring Java concepts into Python: factories, heavy OOP, deep inheritance. None of that is really necessary; plain dicts cover something like 90% of data transfer. And in very type-safe languages, you spend a lot of time designing the actual type system instead of just writing code that does stuff.
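As a sketch of the dicts-plus-unit-tests style (the `order` shape here is a made-up example, not from any real codebase):

```python
def total_price(order):
    """Sum line-item prices from a plain dict -- no class hierarchy needed."""
    return sum(item["price"] * item["qty"] for item in order["items"])

order = {"items": [{"price": 2.5, "qty": 4}, {"price": 1.0, "qty": 2}]}

# A one-line unit test catches the kind of mistake a type checker guards
# against: pass a string where the dict is expected and this fails loudly.
assert total_price(order) == 12.0
```

The data shape is documented by the test itself, which is the argument being made: the mistake surfaces immediately without any declared types.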
Performance is still slow, but language performance doesn't really matter: you can call natively compiled code easily from Python. NumPy was built around this concept and is also over a decade old. You will never beat hand-tuned C (with explicit processor instructions for things like vector math) or CUDA, but most of your code doesn't need that level of performance.
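NumPy does this at scale; as a dependency-free illustration of the same idea, the standard-library `ctypes` module can call into compiled C directly (this assumes a Unix-like system where the C math library can be located):

```python
# Minimal sketch: Python dispatching work to natively compiled code via ctypes.
import ctypes
import ctypes.util

# Load the system C math library (libm on Linux/macOS).
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

result = libm.sqrt(2.0)  # the actual computation runs in compiled C
```

The interpreter only pays the cost of the call boundary; the arithmetic itself runs at native speed, which is the same trick NumPy applies to whole arrays at once.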
To recap, the article's three complaints were:
- Environment / dependency management
- Type safety
- Performance
As the author points out, these have largely been addressed now by uv, type hints, pydantic, FastAPI, etc.
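For the type-safety point specifically, a sketch of what type hints buy you (the `greet` function is a hypothetical example; the flagged call is shown commented out because a checker like mypy rejects it before the code ever runs):

```python
# Type hints catch the string-vs-dict mixup statically, with no runtime cost.
def greet(name: str) -> str:
    return "hello " + name

config: dict[str, str] = {"user": "alice"}

greet(config["user"])  # fine: a str is passed where a str is expected
# greet(config)        # mypy: Argument 1 to "greet" has incompatible type "dict[str, str]"
```

The hints are optional and ignored by the interpreter itself, which is why they could be retrofitted onto existing codebases gradually.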