I don't think Django could have talked to sqlite over the network, and it wouldn't have really solved any problems. When I say physically separate, I mean it in the pre-cloud sense: one front-end server (Django) dynamically connecting, based on the user account, to a database on a remote server that continuously logged streams of sensor data from local IoT devices.
What I did was change the architecture to support centralized aggregation in a large co-located Postgres server, implementing security to manage permissions for writers (sensor gateways) and readers (user accounts).
Eventually we had to write a pg backend similar in concept to TimescaleDB to handle the massive scale of sensor data we were ingesting... but that's another story entirely.
Django has supported database routing for ages. I've abused it for setups like yours (at least as I vaguely understand them from the post details) to avoid using a data lake.
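For anyone unfamiliar: Django's routing hook is just a class with `db_for_read`/`db_for_write` methods that you register in `DATABASE_ROUTERS`. A minimal sketch of per-user routing along the lines described above (the `sensors` app label, the `user_<id>` alias scheme, and the `user` hint are all hypothetical, not from the post):

```python
# settings.py would register this router, e.g.:
#   DATABASE_ROUTERS = ["myapp.routers.PerUserRouter"]
# A router is a plain class that Django duck-types; no base class is needed.

class PerUserRouter:
    """Route sensor-data queries to a per-user database alias.

    Assumes each user's database is configured in settings.DATABASES
    under an alias like "user_<id>" (hypothetical naming scheme).
    """

    def _alias_for(self, hints):
        # Django passes router hints as keyword arguments; here we assume
        # the caller supplies a "user" hint (hypothetical convention).
        user = hints.get("user")
        return f"user_{user.id}" if user is not None else None

    def db_for_read(self, model, **hints):
        if model._meta.app_label == "sensors":
            return self._alias_for(hints)
        return None  # None means: fall through to the default database

    def db_for_write(self, model, **hints):
        if model._meta.app_label == "sensors":
            return self._alias_for(hints)
        return None
```

Returning `None` lets other routers (or the default database) handle anything outside the sensor app, which keeps the router composable.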
It was added over a year after we needed it (back in 2008/2009). I'm pretty sure our use case was taken as an input into the design of Django's database routing. We were certainly loud enough about it.
didn't they hear about sqlite? great for this setup