But OO does suck, not because it is fundamentally different from the lambda calculus, but because it puts a kludgy set of handles in front of a good, sound computational core, handles that make bad decisions easy. Of course, just like any advanced tool, an advanced practitioner can use traditional OO languages well, but the paradigm makes it just as easy for newbies to fall into pitfalls as classical imperative programming does, if not more so.
At the end of the day, all Turing-complete languages are the same language, and the only differences lie in the kind of front end we provide. Unfortunately, we basically still have a single front end, called C, and every subsequent programming language has essentially just taken the C front end and restricted or expanded it in small ways. We still ultimately work in terms of records, contiguous arrays, and pointers. You can think of basically any more "advanced" construct in terms of pointers, and it will make perfect sense nearly every time.
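To make that concrete, here is a minimal sketch of how an OO-style virtual method call decomposes into exactly those C primitives. The `Shape`/`Circle` names are invented for illustration; the point is that "dynamic dispatch" is just a record holding a function pointer, and an "upcast" is just a pointer cast:

```c
#include <stdio.h>

typedef struct Shape Shape;

/* The "class": a record whose fields are function pointers (a vtable). */
struct Shape {
    double (*area)(const Shape *self);
};

/* A "subclass": a record whose first member is the base record, so a
 * Circle* and a Shape* point at the same address. */
typedef struct {
    Shape base;
    double radius;
} Circle;

static double circle_area(const Shape *self) {
    const Circle *c = (const Circle *)self; /* "downcast": just a pointer reinterpretation */
    return 3.14159265358979 * c->radius * c->radius;
}

int main(void) {
    Circle c = { { circle_area }, 2.0 };
    Shape *s = (Shape *)&c;            /* "upcast": still just a pointer */
    printf("area = %f\n", s->area(s)); /* "virtual call": load a pointer, call through it */
    return 0;
}
```

Read through the pointers and the whole OO vocabulary collapses into records and indirection, which is the sense in which every later language is the C front end with different handles bolted on.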