
Sigh. I can't say anything about Angular because I've never used it. But this constant ridicule of "the Enterprise" is getting on my nerves. Obviously, software design patterns change a lot over the years. Old practices are abandoned -- sometimes more slowly than we'd wish -- and others replace them. But much of this criticism comes from the fact that the critics 1) use technologies that are not mature enough to have growing pains, and 2) don't really know how software is made in the real world.

Re 1, the factory/provider/whatever pattern comes in real handy when, 2 years from now, a crucial ODE library that your air-defense system uses absolutely has to be replaced by something else. And it's not like a kid could write a quick re-implementation in Go over the weekend.
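To make that concrete, here's a rough sketch of the kind of seam I mean (the names and the solver API are made up; it's the shape that matters):

  // The rest of the system codes against this interface only.
  public interface OdeSolver {
      double[] integrate(double[] initialState, double t0, double t1);
  }

  // Wraps whatever library is in use today.
  class VendorAOdeSolver implements OdeSolver {
      public double[] integrate(double[] initialState, double t0, double t1) {
          // delegate to the current vendor's API here
          return initialState.clone();
      }
  }

  class OdeSolverFactory {
      // When the library has to go, this is the only method that changes;
      // every call site coded against OdeSolver stays put.
      static OdeSolver create() {
          return new VendorAOdeSolver();
      }
  }

Swap what the factory returns and the call sites never know the difference. That's the whole point.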

As for 2, many of these patterns were designed for software that does not resemble web applications at all. Just to give a sense of how out of touch SV developers can be with the software world at large: IBM, Oracle and SAP together employ ten times as many people as Google, Facebook, Twitter and LinkedIn combined; there are more Java developers in the world than there are people in the entire Bay Area.

I'm not saying that the maligned "enterprise" patterns have a place in a client-side web framework, or that none of them are dated. All I'm saying is: stop treating software patterns as inherently stupid just because they're unnecessary for the CRUD web-apps you happen to build. They are not. One of the reasons people working on CRUD web-apps can "move fast and break things", "rewrite the whole thing in a couple of weeks", and ask "what's wrong with a simple 'new'?" is precisely that the software they develop is, frankly, not that complicated.

I would also like to remind the author that his beloved Erlang was developed in the very same environment he thinks so little of.



Engineer and architect of large enterprise products here (2 million line Java/C# behemoths that do all sorts of weird and complicated financial shit).

Reality is actually as follows. Not joking: I've done this job for 15 years and worked with several large companies, including one you mention. Perhaps your experience is in the 1% of "enterprise" companies who have a clue, but this is by far the majority:

Bold statement here: 99% of the use cases of all these patterns are totally pointless and a waste of money and time. They add nothing to the product; they increase complexity and decrease performance. It's cheaper and more reliable to chop your product up and write each chunk separately in whatever fits, with no official system-wide architectural pattern.

Typically in the real world, you're going to end up with:

1. Literally tonnes of DI/IoC code and configuration for an application with an entirely static configuration, either in XML or in DSL form using builder patterns. Consumes 30 seconds or more to start the application up every time. Aggregated over 50 developers, that's 16 hours a day pissed out of the window.

2. Proxies for anaemic domain models that are fully exposed. Consumes 30 seconds to generate proxies that do sod all. Aggregated over 50 developers, that's another 16 hours a day pissed out of the window.

3. Leaky, broken abstractions everywhere, actually destroying the entire point of the abstractions. Makes refactoring impossible and maintenance hell. In some cases it's better that they aren't there at all and that basic savvy is used instead of COTS patterns.

4. Acres of copy pasta. Why bother to write a generic implementation when you can copy the interface 50 times and change the types by hand?

5. Patternitis. So we need to use the BUILDER pattern to write the query that connects to the DOMAIN REPOSITORY for the AGGREGATE to call the CODE GENERATOR that fires up the ADAPTER to generate the SQL using the VISITOR pattern, which farts out some SQL to the database PROVIDER (and a 90-layer stack dump when it shits itself). This is inevitably used in one small corner, while everyone else in the system hits a wall in the abstraction, says "fuck it", writes SQL straight up and skips the layers of pain, because there are no code reviews (everyone is sitting there waiting for their containers to start up whilst posting cats on Facebook).

LINE 10 (remember this)

These things are never rewritten or refactored. They slowly evolve into a behemoth ball of mud which collapses under its own weight, to the detriment of customers. At this point a team of enterprise architects (usually from ThoughtWorks etc.) appears and attempts to sell outsourcing services that will "fix all the shit" for a tiny fee, leaving behind a 200-page PowerPoint (with black pages just to fuck your printer up). The company struggles on for a few years and is saved at the last minute by a buyout by a company at an earlier stage of the same cycle, which slowly ports all the customers and buggy shit to its own platform. Then the team either dissolves and the knowledge is lost, or they take a cash sum from the sale and start up some ball-of-shit company that does the same thing again.

GOTO 10.

That's enterprise 101, because the people who have been hit by the clue stick know better than to subject themselves to this and go work in SF and SV and London instead. Me: I'm a masochistic crazy person who wonders every day why I don't just go and write web sites for half the cash, or flip burgers, because it's less painful.


I agree with you for the most part - what you're describing is rampant abuse by cargo-cult programmers who don't know better.

This doesn't mean GoF design patterns are wrong per se. It's just that they are misused and applied in places they don't belong. When things are done properly, the patterns should emerge organically as you code, without you even realising it (that is to say, good design patterns are emergent phenomena of good code, not the other way around).

It also doesn't help that the entire software industry is fixated on single-inheritance OO languages, which tend to encourage ridiculously bad designs.

> It's cheaper and more reliable to chop your product up and write each chunk separately in whatever fits, with no official system-wide architectural pattern.

This hits the nail on the head for good software design. Don't write large software; write small, independent, reusable components and combine them together.

This is what dependency injection solves well - when done properly. (If it involves XML or config files, it's not done properly.)
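A minimal sketch of what "done properly" can look like - plain constructor injection wired up in code, no container, no XML (all names are made up):

  interface PaymentGateway {
      void charge(String account, long cents);
  }

  // A stand-in implementation; production and tests would supply their own.
  class ConsoleGateway implements PaymentGateway {
      public void charge(String account, long cents) {
          System.out.println("charging " + cents + " cents to " + account);
      }
  }

  class InvoiceService {
      private final PaymentGateway gateway;

      // Constructor injection: the dependency is handed in, never looked up.
      InvoiceService(PaymentGateway gateway) {
          this.gateway = gateway;
      }

      void settle(String account, long cents) {
          gateway.charge(account, cents);
      }
  }

  public class Wiring {
      public static void main(String[] args) {
          // The composition root is the one place that picks implementations.
          new InvoiceService(new ConsoleGateway()).settle("acct-42", 1999);
      }
  }

One composition root, small components, and the whole "framework" is three lines of plain code.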


I think, as Peter Norvig once pointed out, that design patterns are a symptom of an anemic language design. They cost us time and complexity for their benefits. Java wouldn't need half the patterns typically employed to work around the limitation that there are no first-class functions.
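For example, the classic Strategy-style boilerplate for "filter a collection by some rule" collapses into a single expression once functions are values (Java 8+ syntax, deliberately trivial example):

  import java.util.Arrays;
  import java.util.List;

  public class FirstClassFunctions {
      public static void main(String[] args) {
          List<String> names = Arrays.asList("ant", "bee", "crow");

          // The rule is just a lambda passed to filter();
          // no hand-rolled strategy interface, no one-off named classes.
          names.stream()
               .filter(n -> n.length() > 3)
               .forEach(System.out::println);
      }
  }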

Eich himself admitted to avoiding adding classes to Javascript in his interview with Peter Siebel for Coders at Work:

  SIEBEL: So you wanted to be like Java, but not too much.

  EICH: Not too much. If I put classes in, I'd be in big trouble. Not that I really had time to, but that would've been a no-no.

It's a really good interview and I recommend the book. It seems like Javascript was supposed to be an Algol syntax over a non-pure, Scheme-inspired core... but due to constraints it was thrown together like most code is when there's a looming deadline.


My point was mostly a sarcastic insight into what it's like for these companies, but you're right: there are valid uses, and the fundamentals really are spot on. In fact I'm probably Martin Fowler's #1 fan; he has spent many years researching, collating and understanding these patterns so they can be communicated to others effectively.

The problem is that it takes literally a decade to actually understand how to use these properly in one specific language (and perhaps only somewhat effectively in another). Until you reach that point your toolbox isn't filled with the sharpest tools, because your team isn't going to be filled with 10-year-plus senior staff across the board.

At that point, it goes to shit, every time. These patterns and techniques aren't just tools - they require experience right from the get-go, and a lot of people just aren't capable of nailing it even in a decade.


When done extremely poorly, enterprise projects can indeed end up as slow behemoths. I agree that this is often the case, but it doesn't mean that the design patterns backing powerful enterprise solutions are to be taken lightly.

The adapter pattern does have huge value, proxies do have legitimate uses, and the visitor pattern can be made extremely powerful. Of course, if you have a development sweatshop throwing these out left and right, then yes, you will end up with an abstract mud ball that is impossible to maintain - which in the end defeats the original purpose of these patterns.
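For instance, a minimal adapter sketch (all names made up): the application codes against its own small interface, and one thin class maps it onto a vendor SDK whose shape you don't control.

  // The app's own notion of sending a notification.
  interface Notifier {
      void notify(String recipient, String message);
  }

  // Pretend this ships in a third-party jar and cannot be modified.
  class VendorMessagingClient {
      void send(String payloadJson) {
          System.out.println("sending " + payloadJson);
      }
  }

  // The adapter is the only code that knows about the vendor's shape.
  class VendorNotifierAdapter implements Notifier {
      private final VendorMessagingClient client = new VendorMessagingClient();

      public void notify(String recipient, String message) {
          client.send("{\"to\":\"" + recipient + "\",\"body\":\"" + message + "\"}");
      }
  }

Used like that it's a narrow seam around code you don't own; the trouble starts when one of these gets stamped out for every class in the system.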


> ... totally pointless and a waste of money and time

Sure. Most, but far from all. But let's look at the bigger picture. In the move-fast/break-things corner of the world they write simple software with basic functionality, but they do it at tremendous speed and with sublime efficiency.

And then they go out of business.

So I guess most lines of code written anywhere are a waste of time and money. But that's just how it is. A more interesting question to ask is how much money is wasted and how much is made by each kind of software development. I don't know the answer, but it's certainly not clear-cut.


This is golden


Thank you! I came here to say this, but you said it a lot more politely than I would have.

There's a lot of hate for certain design patterns because they are seen to introduce too much complexity or boilerplate in small projects - but they are absolutely essential for large projects.


I hear this used a lot to defend Java and other enterprise nonsense but I have never seen an example of it in real life. 99% of the problems I see in large projects are creations of their own poor coding and design. There is a magical belief that using certain abstractions will fix this, but it seems more likely they are part of the problem.


You've never found the need to decouple components?


> the factory/provider/whatever pattern comes in real handy when, 2 years from now, a crucial ODE library that your air-defense system uses absolutely has to be replaced by something else.

Hmm; that's the point where your enterprise experience and mine diverge. Here are what I consider the key observations from the organisations I've seen:

The majority of uses of dependency injection seem to be an attempt by a developer to not commit to a technology choice, or to build and preserve some feeling of control over the selection of platforms and technologies. Often I think this is a result of technology platform choices being taken away from the developer, typically by some kind of architecture function. So we abstract away from databases, ORMs, XML binding libraries, network transports, messaging systems, basically anything that's not part of the core application. See Java Enterprise Edition deployment descriptors for an industrialised example.

At the same time, in any sufficiently complex enterprise environment, the chances of any of the platform technology choices changing rapidly are tiny because of the basic risk-avoidance culture. And even if you do decide to try migrating to a new technology platform (e.g. database, app server, messaging platform, etc.), none of the existing deployed applications will migrate, because the cost of retesting the application on the new platform is too high relative to the business benefit you can demonstrate.

It doesn't matter if you say "it's just a 20 line configuration change": the result of your configuration change is to switch from one implementation containing thousands of lines of code to another implementation containing thousands of lines of different code. Your ops team simply won't trust you to put that configuration change live without as much testing as when you put the original application into production. And if it's several months since the last production change, good luck getting hold of the people who can carry out the testing - they'll all be off doing other things.

So the developer is building abstractions to insulate from changes in technology that never happen, but at the same time the one kind of change that's guaranteed to come along over the lifetime of a deployed application is a change in business requirements. And this is typically the exact type of change that all the abstraction and configuration frameworks don't tackle, because it usually results in changes in the way that components interact with each other, not just substituting one interchangeable part for another.

So, who can gain from abstracting away from the technology platform? Well the main benefit comes when you want the same code to run in many different environments. So the main uses in descending order of likelihood:

* You might sell the same piece of software into multiple customers and want to support the variation in environments you'll find. Hello, IBM, Oracle and SAP, as you point out.

* You might have different configurations of your software for different environments (development, test, staging, production and so on), but you need to be careful to make sure your test results from one configuration are actually valid for another.

* You might actually need to run the same code in production in more than one environment. There are cases where this happens (e.g. in a desktop app vs. a web app, or a mobile app), but often it's just as easy to build an API around the code and invoke it over the network.

So, in my experience the effort of implementing all this configurable decoupling is very rarely rewarded by a significant reduction in downstream effort.


> that's the point where your enterprise experience and mine diverge.

Yes, that's precisely the point: there are vastly different experiences in such a big world. The fact that most books are crap and most movies are crap doesn't mean that good books aren't written and good movies aren't made; likewise, most software -- in any language and any environment -- is crap. But that doesn't mean you can discount out of hand solid principles that are sometimes absolutely necessary. SV web startups write software that is orders of magnitude simpler than a lot of enterprise software. Most of these patterns are downright wrong for move-fast-break-things software; but much of the software world is nothing like that. The arrogance expressed in the blog post is completely unjustified, even if most of what you see is crap.

Erlang and Clojure were born in the minds of enterprise programmers; as were Java and probably Go (oh, and Watson!). OTOH, the web-software world has given us Rails and Node.js. So if you compare the best products of both worlds, I don't think enterprise programmers come out behind. This condescension from web developers towards enterprise developers is not only misinformed; it is blatantly misplaced.


You shouldn't take offense. If you work on large, complicated Java/C# projects, by all means bake in as many GoF patterns as you can muster. Part of what the OP is hinting at is that in a lot of other languages there is simply no place for these complicated patterns, and yet we see 'Enterprise' developers bringing their bad habits and bloat with them.


> IBM, Oracle and SAP together employ ten times as many people as Google, Facebook, Twitter and LinkedIn combined

There's a reason for that.


Yes, and it's a simple one: web software is but a minuscule portion of software in general.


I would just like to say: thank God it was Joe Armstrong building Erlang and not James Gosling.

At the same time, I would like to say that I am sorry Joe Armstrong didn't design Java instead of James Gosling.

I am also extremely sorry that Rob Pike, Ken Thompson and Robert Griesemer didn't get around to building Golang before there was Java...


Well, I like Go, and it certainly has its use cases. But I personally worked on a hard-realtime, safety- and mission-critical Java project (if the program gave the wrong answer, people could die; if it gave the right answer late, tens of millions of dollars would be lost). We couldn't have done that without Java's concurrent data structures (which Go doesn't have) or without Real-Time Java's hard realtime guarantees, and it would have been very cumbersome without Java's brand of dynamic linking.
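(Not our actual code, obviously, but for a flavour of the kind of java.util.concurrent building block I mean - shared state updated from many threads with no explicit locking in the application code:)

  import java.util.concurrent.ConcurrentHashMap;
  import java.util.concurrent.atomic.LongAdder;

  // Per-key event counters that any number of threads can bump concurrently.
  public class ConcurrentCounts {
      private final ConcurrentHashMap<String, LongAdder> counts = new ConcurrentHashMap<>();

      public void record(String key) {
          counts.computeIfAbsent(key, k -> new LongAdder()).increment();
      }

      public long total(String key) {
          LongAdder adder = counts.get(key);
          return adder == null ? 0 : adder.sum();
      }
  }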

So sometimes Go is good enough, but sometimes you need the full power of the JVM.


They built Limbo right around the time Java came out. Limbo is a predecessor of Go and is by far the language Go is closest to. Unfortunately Limbo (and Inferno) didn't have the success they hoped for. It's really a shame.


And yet, Go (and other modern languages) wouldn't look the way they did if it wasn't for Java.


That is part of the problem, as per the original article, isn't it?

In all seriousness... I would like to agree with your statement. But I can't shake the nasty feeling of a "to save this village we have to burn it" undertone in your statement.

Edit: JavaScript was created without major influence from Java (Java had more impact on the marketing of JS than on the language itself). Also, I firmly believe there was a lot of criticism of the then newly-baked Java language that turned out to be prescient and to the point, e.g. [1]. Thus I would argue that as much as the existence of Java boosted the development of virtual machines and compiler technology, it impeded the development of language syntax and semantics.

Erlang is waaay older than Java, so are Python and Ruby. Smalltalk was the real game changer, Java not so much.

[1]: http://www.jwz.org/doc/java.html


With regards to the JWZ article: it is interesting how perspectives change over time. JWZ wrote:

" - Java-the-language is, overall, a very good thing, and works well.

- Java-the-class-library is mostly passable.

- Java-the-virtual-machine is an interesting research project, a nice proof of concept, and is basically usable for a certain class of problems (those problems where speed isn't all that important: basically, those tasks where you could get away with using Perl instead of C.)

- Java-the-security-model is another interesting research project, but it only barely works right now. In a few years, maybe they'll have it figured out or replaced."

Years later it seems that both the Java security model and the JVM were actually good ideas, while the language itself is considered too rigid and too verbose.



