I definitely don't want my traffic light controller to be written using manual memory management if that is at all possible to avoid. Waiting another millisecond for the light to turn green feels like an acceptable cost. But this seems silly: how on earth did you end up writing such a simple program with so many allocations that the GC significantly impacts performance? Why would a traffic light controller need dynamic allocation in the first place? Surely it's just a mix of a state machine (statically allocatable) and I/O.
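For illustration, here's roughly the shape I have in mind: the whole controller lives in static storage and never touches the heap. (This is a made-up sketch; the states, timings, and the set_lamps() I/O function are invented for the example.)

    #include <stdint.h>

    /* Hypothetical sketch: a traffic light as a statically allocated
     * state machine. Nothing here ever allocates. */
    typedef enum { RED, GREEN, YELLOW } light_state;

    static const struct {
        light_state next;        /* state to transition into          */
        uint32_t    duration_ms; /* how long to stay in current state */
    } transitions[] = {
        [RED]    = { GREEN,  30000 },
        [GREEN]  = { YELLOW, 25000 },
        [YELLOW] = { RED,     5000 },
    };

    static light_state current    = RED;
    static uint32_t    elapsed_ms = 0;

    /* Stand-in for whatever GPIO/relay I/O the real controller does. */
    extern void set_lamps(light_state s);

    /* Called from a fixed-rate timer tick. */
    void tick(uint32_t dt_ms) {
        elapsed_ms += dt_ms;
        if (elapsed_ms >= transitions[current].duration_ms) {
            current    = transitions[current].next;
            elapsed_ms = 0;
            set_lamps(current);
        }
    }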
"Determinism" feels like a very odd pitch for manual memory management when the latter in no way implies the former, and lack of manual memory management in no way implies non-determinism. Generally, any dynamic allocation is non-deterministic. Furthermore, in the HFT context the non-determinism of the network is going to absolutely dwarf any impacts GC has, especially if you have ever heard of arena allocation. Even your OS's scheduler will have larger impacts if you make any efforts to avoid memory churn.
Now, an interrupt handler should never allocate memory and should generally run in a constant number of cycles. But that's an extremely niche requirement, and you'd probably want to hand-code those instructions regardless.
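By way of illustration, a handler like this (a made-up UART receive ISR; the register address and names are invented) does a fixed amount of work into statically allocated storage and leaves everything else to the main loop:

    #include <stdint.h>

    /* Hypothetical UART receive ISR: fixed work, static storage, no
     * allocation. UART_DATA stands in for the device's data register. */
    #define UART_DATA (*(volatile uint8_t *)0x40001000u)

    #define RB_SIZE 256u              /* power of two for cheap wrap-around */
    static volatile uint8_t  ring[RB_SIZE];
    static volatile uint32_t head = 0;

    void uart_rx_isr(void) {
        /* One load, one store, one increment: effectively constant cycles. */
        ring[head & (RB_SIZE - 1u)] = UART_DATA;
        head++;
    }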
(FYI, I work in a support role for an HFT product, among many others, and it runs on the JVM.)
But suppose the very top of the stack is a high-frequency trading system, or a traffic light controller, or car brakes...
Depending on your stack, determinism may or may not be a key requirement. And when it is, it is only achievable if determinism is guaranteed all the way down.