> That copy makes any reasonable user more aggrieved by the situation.
It's interesting, I find myself feeling this way about automated messages expressing (simulating?) emotion, too. I get that we want to "humanize" some of these interactions, but when Amazon shows me an automated message that says "We're sorry! Your package is running late" it makes me vaguely annoyed because deep down I know that there are zero human beings anywhere that are feeling the slightest bit sorry that my package is running late. As a result, it feels dishonest and disingenuous.
Sure, but again, I think this is confusing the catalyst with the cause. What bothers you is that a giant automated corporate machine is telling you that someone cares about this problem when that's not true. The problem isn't the friendliness of their design; it's that they're lying to you. If they said "your late package is a giant problem for our scheduling audit algorithms which automatically deployed significant resources to remedy the situation so there's no need to activate the grievance process in our customer service labyrinth" they'd still be lying to you, and you probably wouldn't feel any better about it.
Honestly though, most users are not software developers, and their interactions with software, both logically and emotionally, are quite different. Most of them would rather have an automated "I'm sorry" because it fits their expectations for service protocol. They'd also probably prefer a barista impersonally saying "Sorry, it's just going to be another minute" while making eye contact and noting your acknowledgement to one dryly saying "your coffee is late because you and other people in front of you had complicated coffee drinks" and walking away. Everybody knows that the barista has no emotional stake in your coffee being two minutes late beyond it putting them in the weeds. That's not the point. It communicates "I respect you enough to acknowledge your existence and that your inconvenience is consequential," which is actually true in most cases. Outside of the tech business, few people realize the extent to which that is false at Amazon.