Not here, not as phrased. They're asking a question about physical reality; the answer there is that there fundamentally is no difference. Information and computation are independent of the medium, by the very definition of the concept.
There is a valid practical difference, which you present pretty much perfectly here. It's a conflict of interest. If we can construct a consciousness in silico (or arguably in any other medium, including meat - the important part is that it's brought into existence with more intent behind it than being a side effect of sex), we will have moral obligations towards it (which can be roughly summarized as recognizing the AI as a person, with all the moral consequences that follow).
Which is going to be very uncomfortable for us, because the AI is by definition not a human being made by the natural process by which human beings are made, so we're bound to end up in conflict over needs, desires, resources, morality, etc.
My favorite way I've seen this put into words: imagine we construct a sentient AGI in silico, and one day decide to grant it personhood, and with it, voting rights. Because of the nature of the digital medium, that AGI can reproduce near-instantly and effortlessly. And so it does, and suddenly we wake up realizing there's a trillion copies of that AGI in the cloud, each one morally and legally an individual person - meaning the AGIs as a group now outvote humans 100:1. So when those AGIs collectively decide that, say, education and healthcare for humans are using up resources that could be better spent on making paperclips, they're gonna get their paperclips.
Yes, it's almost a perfect conflict of interest. Luckily that's fine, because we're us!