
I worked with a guy (I won't name the company) who wrote Java code in one huge static class as much as possible. In fact, everything was largely in one function too.

He decided to name his fields alphabetically.

    static int a
    static int b
    static String c
    static float d
    static int e...

What, I wondered, would happen when he ran out of letters? Scrolling down further I saw this:

    static int aa
    static float ab
    static String ac
    static int ad...



There was one networks class I took where the assignment was to implement a simple network protocol to do file transfers over a serial port. Computers in the lab were paired up and had their serial ports connected to one another. People were assigned to computers and given either the receiver or transmitter to implement.

I was about done implementing the first draft of my side and asked the other side how it was going so we could test some actual communication. The response I got was "it's about done, we just need to split it up into functions". I was initially shocked and then naively impressed that someone could actually reason about the problem without breaking it down.

The end result, of course, was that I had to give up and implement both sides of the communication myself. This turned out to be a much better learning experience. I ended up abstracting out the serial port and letting the two sides communicate through a Unix pipe, with random bit errors introduced into packets to test the recovery. I could then run much longer tests without depending on the lab or on someone else. I think I eventually tested it enough that I ran up against the fundamental problem: the cheap checksum we were using let errors pass way too easily.
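
The comment doesn't say which checksum the course used, but a plain byte-wise XOR is a typical "cheap" choice, and it illustrates the failure mode: two flips in the same bit position cancel each other out. A minimal sketch (all names mine):

    import java.util.Arrays;

    public class WeakChecksum {
        // XOR of all bytes: a one-byte checksum that is cheap and weak.
        static byte xorChecksum(byte[] data) {
            byte sum = 0;
            for (byte b : data) sum ^= b;
            return sum;
        }

        public static void main(String[] args) {
            byte[] packet = "hello, serial port".getBytes();

            byte[] corrupted = Arrays.copyOf(packet, packet.length);
            corrupted[2] ^= 0x10;  // flip bit 4 of byte 2
            corrupted[7] ^= 0x10;  // flip bit 4 of byte 7: cancels the first flip

            // The checksums match even though the payload differs.
            System.out.println(xorChecksum(packet) == xorChecksum(corrupted));  // true
            System.out.println(Arrays.equals(packet, corrupted));               // false
        }
    }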


That exact same thing happened to me, except that I was using a smaller error rate and a Hamming(7,4) code for error correction.
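
For anyone who hasn't met it: Hamming(7,4) packs four data bits plus three parity bits into a seven-bit codeword, placed so that the parity-check "syndrome" directly names the position of a single flipped bit. A sketch, assuming even parity and the textbook p1 p2 d1 p3 d2 d3 d4 layout:

    public class Hamming74 {
        // Encode 4 data bits (in the low nibble) into a 7-bit codeword.
        static int encode(int nibble) {
            int d1 = (nibble >> 3) & 1, d2 = (nibble >> 2) & 1;
            int d3 = (nibble >> 1) & 1, d4 = nibble & 1;
            int p1 = d1 ^ d2 ^ d4;   // covers positions 1, 3, 5, 7
            int p2 = d1 ^ d3 ^ d4;   // covers positions 2, 3, 6, 7
            int p3 = d2 ^ d3 ^ d4;   // covers positions 4, 5, 6, 7
            // position 1 in the high bit: p1 p2 d1 p3 d2 d3 d4
            return p1 << 6 | p2 << 5 | d1 << 4 | p3 << 3 | d2 << 2 | d3 << 1 | d4;
        }

        // Decode a 7-bit codeword, correcting at most one flipped bit.
        static int decode(int code) {
            int[] bit = new int[8];  // bit[i] = bit at position i (1..7)
            for (int i = 1; i <= 7; i++) bit[i] = (code >> (7 - i)) & 1;
            int s1 = bit[1] ^ bit[3] ^ bit[5] ^ bit[7];
            int s2 = bit[2] ^ bit[3] ^ bit[6] ^ bit[7];
            int s3 = bit[4] ^ bit[5] ^ bit[6] ^ bit[7];
            int syndrome = s3 << 2 | s2 << 1 | s1;  // position of the bad bit, 0 if none
            if (syndrome != 0) bit[syndrome] ^= 1;  // correct it
            return bit[3] << 3 | bit[5] << 2 | bit[6] << 1 | bit[7];
        }

        public static void main(String[] args) {
            int word = encode(0b1011);
            int damaged = word ^ (1 << 4);                  // flip one bit in transit
            System.out.println(decode(damaged) == 0b1011);  // true
        }
    }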

After the ordeal was over, I looked at the code that the other guys had been writing. They hadn't started on the error detection and correction -- I'd heard much wailing and gnashing of teeth earlier about how mathematical it was -- and their code (all in one big main function, with no indentation) wouldn't compile. I watched as they spent about an hour randomly permuting it, to no avail.

I'm not sneering at these guys. I'm baffled by them.


To really bork up the Java, you need a pattern fanatic. Once you find yourself working with Handler Adapter Handlers, you know you should have taken the other colour pill.


When you have factories making factories, it's time to hoist the flag, get out the knives and start slitting throats.

-- after H.L. Mencken


Yes, I remember working for a company (Java devs) where at some point we ended up having wrappers around wrappers around wrappers delegating stuff around, factories of factories ... It made your head spin.


I mean this dead seriously: people complain about abstractions like "Monad" in Haskell, but I've yet to see anything as abstract and difficult to reason about as a decorator around a facade delegating to an implementation of a factory factory of something that probably produces a concrete instance of some other pattern monstrosity.
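
Spelled out, the chain looks something like the sketch below. Every name is invented and it's deliberately contrived, but each hop is a faithful instance of its pattern:

    interface Widget { void render(); }

    interface WidgetFactory { Widget create(); }

    interface WidgetFactoryFactory { WidgetFactory newFactory(); }

    // The facade hides the factory-of-factories behind one method.
    class WidgetFacade {
        private final WidgetFactoryFactory ff;
        WidgetFacade(WidgetFactoryFactory ff) { this.ff = ff; }
        Widget obtain() { return ff.newFactory().create(); }
    }

    // The decorator wraps whatever the facade eventually coughs up.
    class LoggingWidgetDecorator implements Widget {
        private final Widget inner;
        LoggingWidgetDecorator(Widget inner) { this.inner = inner; }
        public void render() {
            System.out.println("rendering...");  // the decorator's sole contribution
            inner.render();
        }
    }

    public class Monstrosity {
        public static void main(String[] args) {
            // Five types and three hops of indirection to print one line.
            WidgetFacade facade = new WidgetFacade(
                () -> () -> () -> System.out.println("a widget"));
            new LoggingWidgetDecorator(facade.obtain()).render();
        }
    }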


Definitely, but it should be said that design patterns themselves are not a bad thing; quite the contrary, actually. It's the overuse of design patterns that should be a crime. I've come across some pretty difficult-to-understand abstractions, but fortunately nothing like the monstrosity you described.


Yes. APIDelegatorInvocationHandler and APIAbstractFactoryFactoryProvider. (actual Java class names)


When you said 'actual Java class name' I thought for a moment that those were actually classes in one of Java's libraries. I googled the names, though, and fortunately, they're not...
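
InvocationHandler itself is real JDK API (java.lang.reflect), though; a "delegator" built on it is usually just a dynamic proxy that forwards every call to a wrapped target. A sketch, with all the surrounding names invented:

    import java.lang.reflect.InvocationHandler;
    import java.lang.reflect.Method;
    import java.lang.reflect.Proxy;

    interface Api { String fetch(String key); }

    class DelegatingHandler implements InvocationHandler {
        private final Object target;
        DelegatingHandler(Object target) { this.target = target; }

        @Override
        public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
            // Forward the call unchanged; real handlers add logging, auth, etc.
            return method.invoke(target, args);
        }
    }

    public class ProxyDemo {
        public static void main(String[] args) {
            Api real = key -> "value for " + key;
            Api proxied = (Api) Proxy.newProxyInstance(
                Api.class.getClassLoader(),
                new Class<?>[] { Api.class },
                new DelegatingHandler(real));
            System.out.println(proxied.fetch("answer"));  // value for answer
        }
    }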



That stuff has its place. But it's needed less commonly than it's used.


Often smart people coming from mathematics start programming this way. In math it's traditional to give everything what programmers would consider cryptic one-letter names, partially because you tend to write them on paper a lot while thinking, and partially because you have fewer entities so it's less bothersome to remember their meaning.

Of course it's a bad idea in programming; I just find it helpful to remember the reasons that smart people can make seemingly terribly unaesthetic decisions.


Sounds like he got his start programming spreadsheets.


He's not a person, he's a compiler! I know single static assignment when I see it.
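
For anyone who missed the reference: single static assignment (SSA) is an intermediate form compilers rewrite code into, in which every variable is assigned exactly once, so a compiler really does emit long runs of fresh names. A rough Java-flavoured illustration (the real thing happens in the compiler's IR, not in Java source):

    public class SsaSketch {
        public static void main(String[] args) {
            int input = 21;

            // Mutable version: x is reassigned, so "x" means different
            // values at different points in the program.
            int x = input;
            x = x + 1;
            x = x * 2;

            // SSA version: each intermediate value gets a fresh name
            // assigned exactly once -- hence a compiler emitting
            // a, b, c, ... aa, ab looks a lot like those field names.
            final int x1 = input;
            final int x2 = x1 + 1;
            final int x3 = x2 * 2;

            System.out.println(x == x3);  // true
        }
    }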


I worked with a guy, a long time ago, who would take code already split into functions and refactor it into one big function. I am not joking here; it actually happened.


I've seen that too. I don't think it was intentional -- the programmer simply didn't understand what the existing code structure was for, and "defactored" it.


Just think of all the function-call overhead he saved!


There are actually systems where this is a non-trivial gain. I've heard some of the Exstream Software (since acquired by HP) guys tell stories about having to interoperate with ancient mainframes their customers refused to replace. In order to get throughput on some of the logic up to an acceptable level, they had to do just this. Amdahl's Law, a-gogo!
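
Amdahl's Law, for the record: if a fraction p of the runtime is eligible for the optimisation and that part is sped up by a factor s, the overall speedup is 1 / ((1 - p) + p / s). A quick sketch with made-up numbers:

    public class Amdahl {
        static double speedup(double p, double s) {
            return 1.0 / ((1.0 - p) + p / s);
        }

        public static void main(String[] args) {
            // Even if call overhead were 20% of runtime and inlining
            // removed it entirely (s -> infinity), the best overall
            // gain would be 1.25x.
            System.out.printf("%.2fx%n", speedup(0.20, 1e9));
        }
    }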


Happened to me in a college course - fortunately I haven't seen it since. However, the current codebase I'm working on is a whole other set of nightmares.

Bonus points for the fact that they don't let me fix it...


That was actually standard Fortran practice, back in the day when variables were case-insensitive and could only have four characters. You haven't lived until you've ported some of the Fortran code aerospace companies have been running since the '50s.



