The author doesn't have to answer that question to dispute Deutsch's alleged answer.
Secondly, Deutsch's challenge to explain Shor's algorithm presupposes that quantum computation requires an explanation in terms of classical computation. While I'm sympathetic to that view, the assumption is easily rejected by anyone who doesn't take reality to be fundamentally classical or local; for them, there is no challenge to meet.
Thirdly, while you can speculate that Shor's algorithm will scale to factoring numbers so large that tracking the computation classically would require more resources than there are atoms in the universe, no one has demonstrated that it does. Just because our current models describe this happening doesn't mean the models correspond to what will actually happen in reality. It could easily be that the models fail to account for noise or other factors that will prevent entanglement from scaling to the levels you describe. This is the position of some determinists, in which case Deutsch's challenge is again neutered.
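To make the scale of that claim concrete, here's a rough back-of-the-envelope sketch. The 10^80 atom count is a common order-of-magnitude estimate, and the qubit threshold it yields is illustrative, not a resource estimate for any real device:

```python
# Rough scale comparison: the number of complex amplitudes describing
# an n-qubit register vs. the estimated atom count of the observable universe.

ATOMS_IN_UNIVERSE = 10**80  # common order-of-magnitude estimate

def amplitudes(n_qubits: int) -> int:
    """An n-qubit pure state is described by 2**n complex amplitudes."""
    return 2**n_qubits

# Find the smallest register whose state space exceeds the atom count.
n = 1
while amplitudes(n) <= ATOMS_IN_UNIVERSE:
    n += 1

print(n)  # 266: beyond ~266 qubits, 2**n exceeds ~10**80
```

So the dispute kicks in at a few hundred logical qubits: well beyond that point, there is no classical bookkeeping of the state that fits inside the universe, which is exactly the regime where "will it actually scale?" remains an empirical question.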
Finally, other interpretations of QM can also explain the speedups. For instance, any interpretation that accepts QM's non-locality has an escape hatch via relativity: non-locality is effectively a form of time travel in GR, but one that cannot be exploited for superluminal signalling. Other interpretations offer still other possible answers.
I personally think it's an interesting question, but it's not a compelling argument for many worlds, not least because the "many worlds as parallel computations" picture doesn't actually work beyond trivial examples.
> Computation is real - it requires matter and energy.
Yes, classical bits require a certain amount of matter and energy, but qubits do not have the same matter and energy requirements, and an answer in those terms seems to be what you're expecting. If you insist on an answer of that sort, then I think you must give up believing that quantum computation will scale: you are effectively expecting reality to be classical, with some deterministic classical computation happening behind the scenes (hidden variables), and those hidden variables would more than likely disrupt scaling quantum computations.
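A quick sketch of why a classical computation "behind the scenes" carries such different resource costs: brute-force statevector bookkeeping needs memory for every amplitude. The 16 bytes per amplitude is an assumption (a complex number as two 64-bit floats); the exact constant doesn't matter, only the exponential growth does:

```python
BYTES_PER_AMPLITUDE = 16  # assumption: complex128, i.e. two 64-bit floats

def statevector_bytes(n_qubits: int) -> int:
    """Memory needed to store all 2**n amplitudes of an n-qubit state."""
    return (2**n_qubits) * BYTES_PER_AMPLITUDE

for n in (30, 50, 100):
    print(n, statevector_bytes(n))
# 30 qubits -> 2**34 bytes (16 GiB); 50 qubits -> 2**54 bytes (16 PiB);
# 100 qubits -> 2**104 bytes, far beyond any physical storage.
```

The point is that n physical qubits would have to "hide" a classical ledger that grows as 2^n, which is why expecting a classical accounting of the computation and expecting quantum computation to scale pull in opposite directions.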