Why I am Not a Professor, or The Decline and Fall of the British University (lambdassociates.org)
187 points by jackfoxy on Dec 21, 2011 | 107 comments


I find threads like these funny. One group complains about the lack of software engineering, such as graduates not knowing design patterns or the details of a Java implementation. Another group complains about the lack of 'real' CS, which is mostly math.

Two opposing views, yet each side thinks the argument is in their favour.

There is also a massive backlash against universities in the UK at the moment. I am seeing many 'apprentice' software developers and freelance developers skipping university. They tell me they know more than the average university graduate. So I ask them about complexity? Nope. Predicate logic? Nope. These are things taught in even the lowest polytechnic school, and they don't know them. They don't know the bounds of their own ignorance, because they think CS is just knowing a few programming languages and whipping up a few apps and websites. Do they know more languages than the average graduate? Maybe. Can they do software engineering? The easy business stuff. Do they know CS? No. I think they have never been exposed to CS and don't know how deep it goes.

I've seen the accredited software developer apprenticeship curriculum, and it's mostly Java and design patterns.

I think the focus on apprenticeships at the moment is creating a generation of software developers who don't know what NP-Complete means.

Computer science is not whipping up applications with fancy design patterns; that's why you're better than them. Give them a few years and their CS background plus software engineering experience will begin to shine.


I attribute that confusion to the fact that the term "Computer Science" is used as a facade for both Computing Science (a branch of mathematics) and the skills necessary to create and deploy software systems (which most of the time don't even require "engineering" math).

It's like grouping "Industrial Design" and "Theoretical Physics" under the term "Physics", if "Industrial Design" also included "Product Management".


That pair is a little far apart. How about maybe "Electronic Engineering", meaning both "Electronics" and "Electrical Engineering"?


There's also the fact that zero science actually happens in a Computer Science department. But since our society worships science, everything has to be science or it's no good! Even stuff that isn't science!


That is absolutely wrong. I am a PhD student in Computer Science, and I strongly consider what I do to be more science than engineering. Machine learning is all about making a hypothesis, implementing it, testing it, and observing the results. There is pressure toward only publishing positive results (no one wants to learn about an algorithm that doesn't work), but it's still very much a science.

The same kind of experiments happen in lots of other CS sub-fields: high performance computing, security, etc.


It's a science in the same way that math is a science. A "formal science" seems appropriate. My school considered it a natural science, which is just stupid.

Note that most computer "science" classes are stuff like learning Java, learning UML, learning software engineering. This stuff is barely engineering, let alone science. It has more in common with a fine art.

Algorithms and data structures etc. are arguably a form of math; depending on the class, most likely applied math.

It's not a natural science, and what you're doing is similar to the way artists will use, say, a picture of a DNA molecule to create a cool painting. If they use scientific principles in their art, does that make it a science?

No - what you are describing is a "craft" - a very informal kind of engineering. Architecture is not a science either.


To be fair, you can apply the scientific method to almost anything, but that doesn't make it "science" per se.

For example: product design. You make a hypothesis, create a prototype to test it, and then draw conclusions from that.

Maybe "science" depends on the methods, and not on the fields. Maybe everything can be Science if approached scientifically.


That depends.

In some countries, there are "Exact Sciences" (basically Math and all the derivatives like statistics and computing) and "Natural Sciences" (like physics, chemistry and biology).

But it's all semantics; the important thing (and the thing most people seem to agree upon) is that the status of CS as a science depends on the status of mathematics as a science.


One could say that computer science is a formal science [1].

[1] http://en.wikipedia.org/wiki/Formal_science


Most majors with the word Science in them aren't sciences (political, social, etc.). I am really not sure how that happened.


Social sciences and political science are FAAAR more scientific than computer science.

Computer science makes stuff like psychology, sociology and anthropology look rigorous. They actually do experiments and case studies and use real world evidence you know!

CS papers have more in common with Literature than they do with the widely disparaged soft sciences like psychology.


> I ask them about complexity? Nope. Predicate logic? Nope. These are things taught in even the lowest polytechnic school, and they don't know them.

Depending on how the university works, students can cheat on homework, then barely pass the exam (thanks to easy, formulaic questions that they can cram for the night before), and still get decent marks. The university will compensate with harder homework and an easier exam the next year.


It also depends on the university, full stop. The CompSci part of my course was canned in favour of business subjects; I had the rug pulled out from under me after the first subject.

(So I got Z-specs, automata and big-O notation. Umm. Thanks guys.)


Are those views really opposing? Do people who ask for more software engineering want it to replace hard CS or augment it?


I think you will find that if you start adding design pattern modules or methodology modules, the people on the side of CS will moan about it being watered down.

If you don't add them, the people on the side of software engineering will say it's not relevant to business needs.

If you try to compromise, they both moan about university not being relevant...


Design and methodology can be learned, but are hard to teach.

And they are even harder to teach inside science or engineering schools.


I wonder whether this trend will stay contained to the UK. With MITx et al, in my opinion, this could be the beginning of an interesting trend. Technology has only begun to affect education.


I'm actually hoping some apprentice or self-taught developers take the "Introduction to algorithm analysis" course from MITx if it's on offer. It's probably one of the most relevant CS-style modules for software developers.


Actually, that's a good point. The UK government has set up its universities to be disrupted by higher quality courses that are not under its jurisdiction.


Maybe that's what they want? There are already the exorbitantly expensive "Computeach" programs and similar.


They don't know the bounds of their own ignorance

Other people never do, do they? It explains everything. You need look no further.


> An easier way is to water down the educational system to a lower standard and then peg the university income to the number of students accepted while reducing the funding per head. In that way universities are given the happy choice of losing money and enforcing redundancies or watering down their requirements.

This has also become true at many US universities.

You can't water down the requirements and maintain placement stats at the same level. Many companies will simply pull the plug on recruiting and hiring once they have a bad experience with lame recent grads.

The really sad thing: the universities take the student's money, and then leave the kid unemployed at the other end. All that student debt is not dischargeable in bankruptcy.


I don't know how elite US universities were. In the UK, very, very few people went, and a degree pretty much guaranteed a middle or upper class life. One of the reasons we boosted university numbers in the UK was that the USA was sending a lot of people to university. However, it ended the guarantee of a middle or upper class life.

They were the education path of the elite, but we let a few smart working class people in via the grammar schools (elite state schools) to keep the majority happy. Grammar schools selected at 11. The other way in was private schools, if you had the money. My old school is now an ex-grammar school because selection was banned for state schools. The people who did not pass selection went on to vocational training at 11.

Note: working class in the UK is roughly the same as lower middle class in the US.

The university system in the UK is still one of the best; we're a small country and have many universities in the top 100. However, it's gone downhill.


In the old days, you could study at a library, go to private tutorials, then take a bar exam or accounting exam to become a lawyer or accountant or doctor. I doubt even the public service required a degree, as long as you could pass their entrance exam. There were other requirements for the professions (such as an apprenticeship), but a degree wasn't always needed. (Note: I'm not certain of the details; that's just what some of my relatives have said about how people used to cope without degrees.)

The more people get degrees, the more things they are needed for. The lower class were mostly freed by economic growth. In 1960, the UK GDP/capita was about $3 a day, which is about the point where people stop worrying about what they are going to eat, and worry about health, education, their career, and where they are going to eat. If they needed to study at privatized training centres, then pay for a professional board to assess them, they'd have done that too, but the government responded faster than the professional societies.


You're right, and it was the same in the other medieval institution he doesn't talk about-- the Church. Unlike today, you didn't need a Master of Divinity (M.Div) to be offered a job as a pastor or priest. You presented yourself-- and were usually sent-- to a bishop (or consistory or presbytery) as one prepared, usually with Latin and Greek, to "read for Holy Orders." Often, the student was taken into the home of the bishop (!) and ate all meals with his family while being tutored in Greek, Hebrew, biblical exegesis, and Church history. Before the advent of the theological seminary in the early half of the 19th century in America, all ministers were educated this way. It worked very very well up until recently.

I mention this because it parallels the University exactly. They changed the system, and educationally we are all the worse for it because the most important thing now is the credential rather than knowledge. To get the credential you have to go into extreme debt if you happen to be poor. If you don't believe that there's been inflation and the system has gone soft, just take a look at a McGuffey's Sixth Reader. Shakespeare and Dryden at that stage! Something very important has been lost, but most Anglo-Americans don't realise it.


See also this recently published book for more on the problem of publishing in academic journals that nobody reads: Planned Obsolescence: Publishing, Technology, and the Future of the Academy

A free version is online: http://mediacommons.futureofthebook.org/mcpress/plannedobsol...

Here are some quotes from an interview w/the author from http://www.insidehighered.com/news/2011/09/30/planned_obsole...

"Here are two ideas Fitzpatrick proposes to kill for good: Peer review is necessary to maintaining the credibility of scholarly research and the institutions that support it; and publishing activity in peer-reviewed journals is the best gauge of a junior professor’s contribution to knowledge in her field."

"Little in graduate school or on the tenure track inculcates helpfulness,” she writes, “and in fact much militates against it."

"But to the extent that individual academics continue in their lust for “power and prestige” by vying for exclusive spots in elite journals, they should not be surprised to find themselves as irrelevant and moribund — indeed, zombie-like — as print monographs have already become, warns Fitzpatrick."

“If we enjoy the privileges that obtain from upholding a closed system of discourse sufficiently that we’re unwilling to subject it to critical scrutiny, we may also need to accept the fact that the mainstream of public intellectual life will continue, in the main, to ignore our work,” she says. “Public funds will, in that case, be put to uses that seem more immediately pressing than our support.”


This aligns with my experience as a British student who received a first class degree from a fairly well-respected program.

The reality was that the teaching was uniformly mediocre, I remained pretty clueless about the subject material, and I produced so-so quality work. I should have worked harder for my own curiosity, but there was a complete lack of external motivation because the academic standards were so low.

To paraphrase someone's quote: I wouldn't recommend a club that had me as a member.


I think one of Groucho Marx's other quotes is probably more applicable to the sentiment of the article; "Those are my principles, and if you don't like them... well, I have others."


I'm in a similar situation. I graduated a few years ago from a good British university and got a first, but I frequently look back and wonder how I managed it.

I distinctly remember arriving and working hard for the first semester; I got 80-something percent in one course and 90+ in the other, and had this awful realisation that I could glide through the first 2 years (of a 4 year program), where only a 50% passing grade was needed.

By the third year the courses did become much harder, but only relatively. Whilst money doesn't seem the right way to restrict entry, I do feel they could have enforced higher entry standards, especially when I saw only ~30% of my fellow students make it to graduation.


Woody Allen


I thought it was Groucho Marx? https://en.wikiquote.org/wiki/Groucho_Marx

Although apparently the sentiment even predates his usage of it by about 60 years.


>To paraphrase someone's quote: I wouldn't recommend a club that had me as a member.

Well then, hypothetically, no institution you attend would be good enough.


That doesn't make sense to me. The past tense "had" implied completing the duration of the degree in the manner described without receiving the slightest reprimand.


I was merely addressing the quotation.


I have done two degrees at two different universities in the UK. The first was a standard undergraduate course finishing in 1992, a joint BSc in Economics and CS. The course content was fine (complexity, algorithms, compilers, database design, etc.).

However, the student's attitude was poor. As an undergrad straight from school I did not self-learn and did not study; I was generally not motivated. I left in a recession and, with poor CS skills, could not find work, so I delivered pizzas, worked the call centres, and manned beach car park huts. All fun, however.

The second degree was a one-year MSc in Software Engineering, this time focusing on the softer skills around methodologies, design, large systems modularity, etc.

This time the student was motivated. I read, studied, wrote C++ and Java programs, delivered tasks on time, and used my previous economics knowledge to build a dissertation on neural-network analysis of wholesale electricity prices. I left and went straight into a C/C++ job helping to build a mobile phone network planning system.

On both occasions the tools, environment and time were available. Just the student attitude differed. I saw what I wanted and went for it - just not the first time.


Not just in Britain. The grade "scaling" or inflation is rampant in the US, but not nearly as bad as the insane focus on research. The number one job of a professor should be to teach the students of the college.


Research is teaching; teaching the graduate students how to do research.

I agree with your sentiment that college profs should care about teaching, but to be honest too much is asked of them. They're expected to be excellent researchers, inexhaustible grant writers, engaging teachers, inspiring mentors, and part-time entrepreneurs. Those who can achieve 4 of the 5 are still impressive, and more often than not something's got to give.


In the US, 'research' generates money for a university. You apply for outside grants and then fund your students, who in turn pay the university the vast majority of that money. Depending on the grant, you may even pay for lab space and equipment as well as a large chunk of a professor's salary.


Yes, too much is asked of them. In my mind, the solution is to just ask less; specifically, ask them to primarily be teachers. This does have implications for grad programs, and I don't know what they are, but it would certainly clean up undergraduate education.


Then none of the people in top universities would be there. Talented researchers go to top-tier universities for their PhD because they want to work on research professionally. They do not wish to teach, they are forced to do it because academia is, for the most part, the only place left where someone can be a professional researcher. Although there are still corporate research labs around, the days of Shannon developing Information Theory at Bell Labs are long gone.

Quite frankly, I'm fine with this setup. If you want to have a good undergraduate education provided to you by the professors, go to a school where they have that as a priority. If you want to get involved in pushing the boundaries of knowledge, then go to a place like MIT or Stanford and get involved in an undergraduate research program.


I'd observe a couple of things, which are much elaborated here: http://jseliger.com/2011/10/31/college-graduate-earning-and-...

1) Professors are responding to institutional and student incentives: institutions reward research, so professors prioritize research over teaching.

2) Students are, for the most part, choosing easier degrees; as a result, those degrees are prospering, and many non-easy disciplines are watering down their criteria to attract remaining students.

3) The only real impetus for change that I can see comes from two areas: a) the amorphous group of employers who want better outcomes—but they have very little leverage and b) graduates who are unhappy to discover they have a great deal of debt and few marketable skills. Again, however, they have little leverage.


In the US the tying of funding to results is accomplished by market forces (which also favour, say, campus facilities) whereas in Britain it's administered directly by the government (which also favours "research", that is, publication count).

But since both place a substantial weight on "graduation rate", the same effect is seen vis-à-vis teaching quality.


There are two types of schools I have seen in the U.S.: diploma mills and failure mills. I went to a fail-as-many-as-you-can school. Almost every class I walked into was designed so that 50-70% would fail. I learned a lot, mainly studying on my own. The focus on research vs. teaching was just as strong.


Professors will do or become whatever is required of them to achieve tenure. The number one job of a professor is therefore to meet the tenure requirements of the institution, which at most places means "do lots of research". Change the reward system and you can change how things work.


"I've never worked out whether I was, in American terms, an assistant professor or an associate professor."

Generally speaking, British lecturers are equivalent to associate professors, and assistant professors are untenured.

Most post-docs in proper academic departments are assistant professors, but this is not a hard and fast rule; post-docs in tenure-track programs nearly always are assistant professors.


I graduated from one of the top 5 British universities in the mid 90s with a 2:1 degree in History, and I have to say that it was one of the easiest things I ever did. I had class and lecture time of 4-6 hours a week, and spent another 8 hours reading and writing essays. The essay requirement, which contributed almost half of the degree grade, was only about twelve 3,000-word essays a year. I had so much free time that I turned my part-time job into a full-time job, working 35 hours a week.

Of course, I should have worked harder, and I would have learned more if I had, but I was 19 and 20 years old and it was just so easy to get a good degree without working at all. I know I was far from alone in this.

Not every degree course is the same, and no doubt others may have worked long hours to earn theirs, but my own experience has left me with no respect for UK degrees, to the extent that when I read CVs from candidates I consider a third-level degree irrelevant.


The variability in UK university courses is pretty extreme. I have three degrees: a bachelor's in mathematics ("pass"), a bachelor's in accountancy (2:1), and a master's in statistics ("pass"; though I averaged a "merit", I missed out because I was inconsistent), all at different universities. While I couldn't even manage a lowly third for my mathematics undergrad, I worked far harder for it than for the other two, despite the fact that it was the only one I studied for full-time.

Before starting the master's in statistics I took some undergraduate (first/second year) maths/stats courses at the same university (while still studying for the accountancy degree and working 35 hrs/week), and the exams and assignments were probably easier than anything I saw in the first two weeks of my maths degree. Based on my performance on those courses and what I saw of the 3rd year undergrad courses (the masters students shared one with the undergrads, though we had a different exam), I would have cruised to a first without breaking a sweat: a pretty different experience from when I was struggling to pass my degree at all!

It makes it almost impossible to judge anybody's degree and grade unless you are already familiar with the course; the toughest courses are genuinely difficult, but the easiest courses are a walk in the park for anyone of reasonable ability.


Enjoy the title until the inflated global perception of its value decreases to zero :)


If it was so easy, why didn't you get a First? Sounds like you chose low standards for yourself.


I addressed that above:

"Of course, I should have worked harder, and I would have learned more if I had..."

My point (which I thought I'd made clearly) is that it was far, far too easy to do well, with little effort.


My point is that by your own admission, a 2:1 requires little effort and therefore is not to be considered "doing well".


Or that his standards weren't the kind measured by a "First".


Two and a half years ago I made the decision not to go to University, and so far I think it was one of the best decisions I've made. I have learnt a lot more through experience (I'm a freelance iOS developer). I have also learnt about life quicker. After living with some of my friends who went to University, I quickly realised how easy they have it: a few hours of classes a week (depending on the course) and very little studying outside of class. They also have everything paid for them through student loans and grants. On the other hand, I had to work hard and ensure my business succeeded or I wouldn't make rent.

University in the UK has been something that is just 'what people do'. Most people coming out of A-levels wouldn't even consider not going to University, especially because High Schools push it so hard (as it makes them look good). There needs to be more education in High Schools about the option of not attending University.


Learning how to program for iOS is such a small part of software engineering, and probably does not even constitute computer science. Simply put, you don't know what you don't know.


I don't mean that by learning to develop for iOS I have learnt what I could have learnt studying at University. I think that everything I have learnt in the last few years, which I wouldn't have learnt had I attended University, is and will be vastly more important and useful.

I could be wrong (there are still many years ahead of me), but I think what I have gained by not attending University is much more than what I would have gained by attending it.


Don't mind the mean guy or the down votes.

If you want to learn fancy stuff, you can read books on your own time -- and watch video lectures from universities and other sources -- and learn it faster, more efficiently and more pleasantly than you would have by attending school.

He's right that there are various important things you don't know right now. But:

1) you might never need to know them, it really depends on your career trajectory, life priorities, etc...

2) if you had gone to school, there would still be plenty of gaps in your knowledge anyway

3) you can address gaps in your knowledge whenever you want without going back to school. Non-school learning is possible and effective.

4) Learning advanced CS topics -- and really understanding what they are for, how to use them, and other useful stuff -- is a lot easier with the kind of programming experience you're gaining.


I didn't mean to be cruel.

Self-doubt is most likely going to be the driving force for him to get through the fancy stuff on his own. However, I want it to be known that learning to program != CS.

Being confident in your own knowledge can often stop you from learning more. I often doubt my own knowledge; I often feel worried when I meet someone else in case they expose me for the fraud I am. This keeps up the hunger for knowledge.

I was a young hackery type when I was a kid (I wasn't particularly academic either): trying various programming languages (C, Scheme, Haskell), building games (even 3D), exploiting software with buffer overflows, maybe some malicious hacking (I was curious), and generally exploring computing. This made me overconfident in my ability. Then the academic community completely showed me up; they showed me how little I knew in terms of theoretical CS. It destroyed my confidence. They didn't even respect the skills I had; they're not academic skills. This made me doubt myself and catch up on the academic side of CS. It taught me a lesson about overconfidence, and it actually changed my attitude when approaching other computer guys from "I'm the best" to "this guy might know more than me".

This made me impulse-buy copies of Don Knuth's books =P. My math skills also received a serious boost when I realised they were needed too.


What have you learned? Learning how to create software and learning computer science are vastly different.

In software engineering you learn design patterns, good practice, basic data structures (so you know when to use a linked list or an array in different situations), etc. It's very relevant to business programming.

In CS it's algorithm analysis, advanced data structures, AI, machine learning, mathematical logic, etc.

However, I eventually found that CS becomes useful in designing software. It helps in choosing which data structure or which algorithm. It gives an edge which you don't even know about unless you've been exposed to it. If I have to (which isn't often), I can rigorously calculate the big-O growth rate of a function as opposed to just guessing by looking at the loops. I suppose you could just benchmark...
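
To show what I mean by "rigorously" rather than guessing (a minimal sketch of my own, with a made-up function):

    class PairCount {
        // Count equal pairs with a nested loop.
        static int countPairs(int[] a) {
            int count = 0;
            for (int i = 0; i < a.length; i++)
                for (int j = i + 1; j < a.length; j++)
                    if (a[i] == a[j]) count++;  // body runs once per pair (i, j)
            return count;
        }
    }

The comparison executes (n-1) + (n-2) + ... + 1 = n(n-1)/2 times, which is Theta(n^2): derived from the summation, not guessed from "two nested loops, so probably n squared".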


Yes, learning some programming to be an iOS developer isn't enough, but that doesn't mean that guys like k-mcgrady can't learn on their own. Keep it up, man!

I know a programmer who knows so much language and compiler theory that he is always a source of knowledge and inspiration to me. He has a university degree in... accounting.


You're right, and I think you made the right choice. I'm beginning to regret my decision to go to university. The intellectual calibre of my colleagues is shocking: a significant proportion can't even write a paragraph properly, and I'm constantly picking up all the slack on group projects.

I'm a final year student on an Information Systems degree, and one of the modules I'm currently taking, "Web Application Development", is nothing more than an introduction to HTML and CSS. In fact we don't even have to produce anything aside from a few snippets. I feel like I've wasted my time and been ripped off. If I'm ever in the position to hire, I would never employ someone based on a degree alone.


That sounds pretty bad. Only time will tell which decision works out, though. At the minute most employers won't even consider someone without a degree, so it would be difficult for me to find a decent 9-5 job. University still works well for some courses too (e.g. law, medicine), but it needs to stop being presented as the only option. In schools now you are told to either pick a trade (plumber, electrician, etc.) or go to University, but there are other options.


I've often thought that a lot of software development as it is practiced in business is more like a traditional "trade" than not. It's about knowing how to use tools to build things.


That describes almost every engineering job, yet engineering is not a trade. Trades build; engineers and architects design, and the trades then build.

Software can't be separated like that, because the software is both the design and the implementation.


> At the minute most employers won't even consider someone without a degree

This is actually a good thing: it's an excellent flag for someone you wouldn't want to work for! I don't have a degree in anything - I dropped out of an engineering degree in the second year and joined the marines :-). I taught myself to program and I now work in a small team in a hedge fund. In my opinion experience trumps education anywhere that's worth working - if they won't even consider someone without a degree, run the other way!


Good programmers are in demand in lots of places, University degree not required at plenty of places if you have something else to show them (like a history of writing working code).


> At the minute most employers won't even consider someone without a degree so it would be difficult for me to find a decent 9-5 job.

If that's really the case, even when applying directly, then it's messed up. In interviews I've been in or conducted, someone walking in with a computer and demoing some of their creations is vastly more persuasive than I-went-to-X-university-and-my-CV-claims-I'm-expert-proficiency-in-every-subject-I-completed-a-module-on.


That "someone walking in" is pre-screened by HR in many places, and won't even get invited if they don't meet the basic criteria (educational background and experience).


Anyone who possesses anything resembling "hustle" will not rely on HR to establish relationships inside a company where they want to work.


This was my experience doing Computer Science at a British University, too.

Most of the modules which I took were so watered down that they were absolutely useless to me. I knew this and I was pretty depressed at the time. I'm not very good at doing mindless work: some of the lectures I just stopped going to and other times I completely ignored the vacuous assignments I was being asked to do.

Looking back I wish I had dropped out and gone straight into a job with the programming skills I was teaching myself. (But I guess if I had done this I might not have learnt about fundamental CS concepts.)

I actually love learning but like to do it in my own way, on my own accord. I'm thinking of taking those online Stanford classes which are starting soon -- I guess the only thing that is missing from these is human conversation. I wonder if one day people might informally meet for coffee to discuss the online courses they're taking together. ;)


If you want a longer, more elaborate piece, The New York Review of Books has a great article on the current problems plaguing higher education: http://www.nybooks.com/articles/archives/2011/nov/24/our-uni....


This gives me a better insight into why my CS degree is a piece of shit and why I learned little beyond what I already knew or use on a regular basis.

I'm Canadian, not British, but I do relate to everything that was said in this article. I completed my degree in 2004.

Great read.


Most colleges do a poor job of providing CS education. Worse yet, CS is a poor substitute for the education needed to do software engineering.


We're finding candidates with a math, physics, or engineering background are better prepared to write software than CS graduates. I don't really understand why that should be the case.


When I did a CS degree in the 80s we did roughly the same amount of maths as engineering courses and all of the more difficult CS classes (generally the more mathematical/formal ones) were mandatory - there was a relatively small amount of choice and certainly no way to graduate without being a fairly decent developer and quite happy with formal abstractions.

Unfortunately, as the article describes, many CS courses have become "customer focused" so are now, as far as I can see, attempting to become vocational training courses, which universities are generally pretty awful at. When I finished my undergraduate course (with a 1st) the only thing I felt qualified to do was go into postgraduate research - which is pretty much what the course was oriented towards, although this was only apparent in retrospect.

"Real" CS is irrelevant to 98% of development jobs. In my opinion anyone believing that a CS degree will train them to be a good developer is going to get a nasty shock.

Having said that, some of the very best developers I have worked with did have the combination of awesome raw talent and CS degrees (often PhDs). Of course, I've also worked with some equally good developers (in their own way) who didn't have a degree.


In my opinion those majors are generally more challenging than CS, and thus the graduates will be of a higher intellectual caliber. Additionally, folks pursuing technical careers with those degrees will be more likely to have a personal passion for technology. They also won't have the bad habits that CS programs often instill in graduates; they will look to the workplace and industry standards as a guideline for their habits and behaviors, rather than to their college professors and fellow classmates.


This is, in my opinion, entirely the government's fault.

The government decides how much to fund universities based on publication quality, which they rate based on the journals the papers are accepted into.

There is almost no benefit to teaching students better, and there are huge advantages to passing students who would otherwise fail. This is because universities have a strict limit on the number of students they can accept, and these are not replaced if students fail their first year.

So, to maximise income universities have to keep hold of students, while getting as many papers as possible into high quality journals.


I graduated in Computer Science in 2004, with a first, from a very well respected British university. There is usually a very clearly signposted path to getting a "good degree" without necessarily having to know all that much core computer science.

My course was a four-year course. The total weighting for each year was 10-20-30-40 (years 1-4). The first two years had non-optional, core CS modules (algorithms, logic, discrete maths, etc.) and the final two years had a lot of electives. If you could muddle through the first two years, you could take a series of electives in the final two years (foreign languages, accounting and finance, etc.) that were arguably much easier.

I got mediocre grades in the first two years, but good grades in the final two years, resulting in a first class degree overall.

I regret my choices, but as a lazy undergrad I took the path of least resistance to achieve my target (a first class degree). I was not the only one who did this. The problem is that people like me made the university look good, so I think they made it very easy to game the system. The only things I worked really hard on were the programming assignments and projects. The exams were easy to pass since they had a very clearly laid out pattern, and questions tended to be repeated year after year. If you could solve the exam questions from the 3-4 years before your final exam, you would probably ace it.


You know, I'm getting pretty sick of all of these "doom and gloom" stories about the modern higher education system.

Yes, the modern higher education system is not ideal. But what, in life, really is? That's not to say that we shouldn't pursue a better system, but we shouldn't give up on a system just because it's not ideal.

And with all of these doom and gloom stories, I have yet to see anyone offer an alternative. Yeah, there's a lot of paperwork involved in being a professor, and you get evaluated on criteria that don't quite line up with the ideal for being a great professor. But what would be better? How can we create a better system? And if there's such an obvious better answer, why doesn't someone do it?

If there is some obviously better system, I'd love to see it. If such a thing exists, it should be quite competitive with the current higher education system. No one wants to hire incompetent new graduates. No one wants to be one. So we should see something better, something that indicates there is some better way of doing this.

Instead, we see a steady stream of technological progress. I can do things that no one could do before, like carrying a device around that allows me to pinpoint my precise location, stream maps down to me, find me directions to wherever I want to go, and read those directions aloud in a synthesized voice, all for the price of 2% of median yearly income (including hardware, software, and the service). And that, of course, is not to mention all of the other things that are available to me.

Now, maybe I'm living in a bubble, built by people who got a proper education before all of this grade inflation and other nonsense. But really, this article is complaining about the last 20 years. A large portion of the people who are doing work in technical fields finished school within the last 20 years. And yet, we're still seeing significant progress; we are still living in a world that is tumbling into the future at a high rate.

So I want to know two things. For one, why are we still progressing so quickly, despite these apparent problems? How are we managing to innovate, if our educational and academic foundation is so unsound?

And for another, what is the solution? What do you propose we do better? If it's so much obviously better, why don't we do it? Or why doesn't someone, somewhere do it, and show significantly better results?

I think one reason is that when you are doing work in the top few percent of human ability, you look around and realize how ordinary it is. Even at the top, everyone has their flaws. No one is perfect. Systems designed to prevent people from cheating also prevent some people from doing amazing work. But overall, it isn't a few geniuses at the level of "Mozart" that we need; it's a lot of people doing work at a high level, but not what some might consider "genius". If you are immersed in it, it seems somewhat boring, but when it all adds up, it winds up opening new possibilities that were simply not available 10, 20, or 30 years ago.


What's worse is that the article is stuffed full of hyperbolic crap about Maoism and misses some of the more valid criticisms of the UK education system (the transformation of technical colleges offering effective vocational training for non-academically inclined people into degree-awarding bodies, which felt the need to adjust their course content to match; the objective of the previous administration to get 50% of school-leavers into university, which inevitably resulted in the reintroduction of fees to pay for it, skewing the entrance pool) in favour of arguments that are dubious at best. In effect, he's blaming the government for flaws that sit squarely on the shoulders of the academics themselves.

If you believe the author, the worst thing that has happened to the education system is the requirement that lecturers actually produce academic output (due to the "envy" of the taxpayer subsidising them). As the author points out, some of this is less than seminal, but frankly I don't find convincing the implicit argument that the same academics would miraculously produce more valuable contributions to the world if not shackled by the requirement to actually justify the money being thrown in their direction. I've read some decidedly mediocre papers written before academics were obliged to get things published on a regular basis.

He also complains about modularisation, because apparently higher education students aren't smart enough to choose their own areas of specialization. Sorry, but if the University of Leeds' course on web design in the 1990s was too easy, it's because the academics running the course were slow to embrace and understand the potential complexities of the web, not the fault of the bleddy gubment. Just be glad they didn't make it a "core" subject that everyone gets high marks on.


If you're talking about polytechnics being converted, they were never 'technical colleges'; it sounds like you've been reading too many tabloids.

Polytechnics could always award academic degrees (up to PhD) as well; the main difference was that they had their courses set by a central body. You could also study sub-degrees (HNDs), which are a bit like associate degrees in the US.


I've been hanging around in higher ed for around a decade. I recently got interviewed for an assistant prof position. I didn't get it, and the feedback I got as to why was frankly bizarre. The job itself would have been a really nasty one too: kill you with teaching and expect a proper research output as well. Glad I didn't get it. Don't get me wrong: if the university system decides either that they want me to do a nice job without all the performance management crap they usually force people to do (actually slightly likely in the near term in my case), or that they need me more than I need them (a long term possibility), then I'll be happy to have them. Otherwise, I might do a little bit of teaching to keep library access, and go and do proper paid contract work instead for a while.

The higher ed system looks to be in quite serious trouble to me right now, and I do think it is heading towards a transformative crisis in the medium term.


> "But what would be better? How can we create a better system? And if there's such an obvious better answer, why doesn't someone do it?"

Here's a simple answer: The better system exists, but people overlook it.

As a student, first get a Bachelor's degree.

Second, pick a field and be sure you have learned it well, at at least the Bachelor's level. Do this learning independently if necessary. A Bachelor's degree is supposed to teach you how to do at least this much learning independently.

Third, from that learning about the field you selected, learn some more, to 'the next level'. Likely do this independently.

Fourth, show up at any one of the better research universities and take the Ph.D. qualifying exams based on what you learned.

Fifth, stick around that university and attend some seminars and courses that are introductions to research given by experts in research. Here your work is largely independent.

Sixth, pick a research problem and get some good results, independently or nearly so. If there is any doubt about the significance of your research, then publish it.

Seventh, submit your research as your Ph.D. dissertation.

Congratulations: You are now out of school; you went all the way; you are educated. Done.


Hmm. I would love to follow this course of action. Can you point to evidence that schools will allow you to take their qualifying exams and let you get a PhD based on your independent research without being formally enrolled?

Also, as far as I can tell, most of the article is complaining about the bachelor's level, which you assume as a given. For those who haven't done that, do you have an alternative for that level?


As a math professor: why would you not formally enroll?

We have formal admissions procedures and the like, but this is not to screen out by arbitrary criteria. Grad applications are screened by math professors, not some stuffed suits somewhere who can't do trigonometry, and if you are well prepared for grad study then it will show and you will be admitted. And, typically, funded with a stipend (usually there are 10-20 hours of teaching a week you have to do, depending on institution).

I think most professors would be happy to let you sit in on an advanced class without enrolling. But doing a whole Ph.D. that way? Perhaps it is possible, but I can't imagine why anyone would, and I don't know of anyone that has.


The question was, could a student without a Bachelor's just show up at a grad school, offering to take the Ph.D. qualifying exams and believing that they are well prepared, AND, without a Bachelor's degree, be permitted to do so? In particular, to take the qualifying exams, would they have to be 'enrolled' and would that be possible without a Bachelor's degree? And, although not said, maybe the student needs financial aid and hopes to get it based on their good qualifying exam performance.

So, they are ready to take the exams. But they have no Bachelor's, are not enrolled, without a Bachelor's would likely not be accepted or enrolled, and need financial aid. So, can they take the quals? If so, then how? That was the question. I suggested maybe an Associates degree, some really good GRE scores, and offering to take the quals BEFORE applying for admission.


I didn't cover getting "enrolled", but usually that won't be much of a problem: Even the top graduate programs are hungry for good students. If you are ready for their qualifying exams when you show up, then SAY SO. And/or submit a stack of nicely done, hard exercises from a well known, challenging text. Then being enrolled and able to take the exams should be routine.

For an alternative to the Bachelor's: mostly, graduate schools will expect a student to have that degree. But a graduate school doesn't have to care very much about where a student got a Bachelor's. That is, nearly everyone in serious academics knows that a student who shows up ready for the qualifying exams deserves the credit themselves; the school doesn't. The US is awash in relatively inexpensive Bachelor's programs, coast to coast, border to border.

There is a hidden point going for a graduate student who can do research: at the top schools, especially Harvard, Princeton, and Johns Hopkins, the emphasis in graduate school, for both the profs and the students, is just RESEARCH. In particular, commonly there is no coursework requirement. Princeton has been known to state on their Web site that grad students are expected to prepare for the qualifying exams on their own, that the courses given are introductions to research, and that there are no courses for preparation for the quals. So, this stuff about independent work to get ready for the quals is mostly what you have to do anyway. Some schools publish copies of their old qualifying exams and have lists of references for preparation for the quals. So if you are good with the references and the old exams, then you have a good shot.

All those things said, ugrad school is usually more strict about courses, credits, and grades. Still, there may be some full or partial ways around the time and expense of a full ugrad program. First, it may be possible to impress a grad school just with GRE scores. So, if you are well prepared, then see if you can take the GRE without a Bachelor's. Second, the US is also just awash in junior or community colleges. Since their student quality is usually low, they would be thrilled to have a good student with a shot at getting a Ph.D. These colleges are usually just for two years and give only an Associate's degree, and they are usually quite inexpensive. But, then, the last two years of a four year program are mostly just in the major subject anyway. So, a grad school MIGHT be willing to accept just an Associate's degree. Third, there is a fairly strong but rarely written principle in US higher education: really good students are wanted and go for free or nearly so! If you are ready for Ph.D. quals, then the money for a Bachelor's should be no big problem, even at Harvard. And once you are at a really good school, say, Harvard, you mostly won't be held back. E.g., I knew a bright student who, as a sophomore at Harvard, took a reading course with A. Gleason, right, who solved one of Hilbert's problems. And Gleason never got his Ph.D.: his work was so good Harvard made him a Harvard Fellow and put him on the faculty. James Simons was an ugrad at MIT and in his senior year took a reading course from Singer, right, as in the Atiyah-Singer index theorem -- watch the video of Simons' talk on YouTube or some such. Simons? Right, he is as in the Chern-Simons result (try Wikipedia), was Chair at Stony Brook, started a hedge fund, for some years paid himself $2 billion a year, and, net, is likely the most successful hedge fund guy.

Yes, the field where this independent study works best is math. In physics, chemistry, or biology, a ugrad program will typically have a lot of lab work you can't do independently and some lecture courses with material that is difficult to get independently. But math remains the most powerful major, maybe even for physics and biology. Even biology? Look up Eric Lander! But, again, for a really good student, ugrad can be free or nearly so.


You mentioned the GRE, and I assume you mean the famous GRE General Test. But another option to look at to prove mastery of material is the GRE Subject Test. While the GRE General is mostly an IQ test, GRE Subject Tests have a knowledge component to them. And they are quite challenging: for the GRE CS test, less than 1% of the test takers in the past 3 years achieved a score above 900 (the maximum score possible is 990).

EDIT: anyone interested can download the test booklet here which has an example test:

http://www.ets.org/gre/subject/about/content/computer_scienc...


I meant both tests but definitely the subject tests.

Usually the CEEB, SAT, and GRE tests are designed to have a Gaussian distribution with mean 500 and standard deviation 100. So, one can get a table of the cumulative Gaussian distribution and see what fraction lies 4 standard deviations above the mean; I would guess less than 1%.
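
To put a number on that guess (my own back-of-the-envelope arithmetic, assuming the scaled scores really are Gaussian that far out): 900 on a mean-500, SD-100 scale sits 4 standard deviations above the mean, where the Gaussian tail is

    \Pr(Z > 4) = 1 - \Phi(4) \approx 3.17 \times 10^{-5} \approx 0.003\%

so "less than 1%" is, if anything, a big understatement.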

I got 800 on the GRE subject math test, the only 800 I got on any of those tests, and that 800 always intimidated my wife, MUCH smarter than I am, PBK, etc.

Why especially the Subject Tests? Because the question was how to skip a Bachelor's degree, do not pass GO, do not collect $100, and go directly to grad school. The grad school may still want a Bachelor's, but for anything less, really good scores on the relevant GRE Subject Tests may be the difference. Show up with 750 or better on GRE subject math, physics, and computer science, and offer to take the Ph.D. quals right away, and you may, just MAY, be permitted to 'enroll'. Blow away the quals, publish a paper or two, even in a conference, maybe knock out some code just to prove you are not all theory, and you may be regarded as a good student. Then you will be closer to the front of the line for various kinds of financial aid.

How to skip a Bachelor's is chancy. As for the importance and potential of good independent work at the best research universities, that's rock solid. Read the story of the guy who gave us the name 'good algorithm', Jack Edmonds. Read what Feynman did at Princeton. Read what Gleason did at Harvard. Independent work was just crucial in how I got my Ph.D. It helped that I did the research for my dissertation independently in my first summer and wrote a 50 page manuscript. Then it helped that I took a 'reading course', selected a long outstanding problem, and found a solution which also solved a problem in a famous paper by Arrow, Hurwicz, and Uzawa (poor Uzawa was left out of the prize). It helped that the department chair taught a flunk-out course, an advanced, second course in linear algebra; I took it as my first course in linear algebra and blew everyone else away. How? I'd done a LOT in linear algebra independently and in my career in 'scientific programming' before grad school. It even helped that I was the only student that year who showed that there are no countably infinite sigma algebras. That's the kind of stuff that can help one get through grad school.


Skipping the Bachelor's entirely might be one use case for the GRE Subject Test, but I think a more common use case is where the applicant to grad school has a Bachelor's in a different field (a math grad going to CS school). I imagine one wouldn't need blow-out scores in that case.


Also, I don't think what you're saying about the Gaussian distribution is true for GRE Subject tests as they have significantly varying distributions. Check out this table:

http://www.ets.org/s/gre/pdf/gre_guide_table2.pdf

It would appear that GRE Physics is the easiest test, while Biochemistry is the hardest.


Yup, we both might be correct!

I had a 'qualification': "Usually the CEEB, SAT, and GRE", and with that we both can be right!

For those tests, at the level of detail of the distribution of the scores, it can be tough to get solid data.

But it is easy and common in educational testing to scale 'raw' scores so that the 'scaled scores' are Gaussian.

Also in educational testing, it is easy and common to have enough data on individual questions so that the distribution will be known fairly accurately for a test made of such questions -- my father did that for years as the main 'educational architect' at one of the world's largest and most important technical schools, with 40,000 students there at any one time.

Having those tests be accurately Gaussian more than 2.5 standard deviations away from the mean is likely challenging. I've often suspected that on some of the SAT tests they didn't give any 800 scores.

There have been suggestions that those testing companies have not always been very open about just what they were doing in their details!


Also, to save much future grief, the undergraduate degree should be in maths.


I have experienced how poorly CS classes prepare students for jobs in software engineering (I do realize they're not the same thing, but CS is obviously the main degree we look for). I interviewed people who had a master's degree with an emphasis in Java, yet they were unaware of even the simplest details of how the JVM works (implementation details of the String class, JIT compilation).

I felt bad for this person; I wonder if it's too late for them to get a refund on that degree, because it sure as heck didn't increase their earning potential.
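
To be concrete about the kind of detail I mean (my own illustrative snippet, not the actual interview question): Java Strings are immutable, and the JVM interns compile-time string constants, which is why == and equals() can disagree:

    public class InternDemo {
        public static void main(String[] args) {
            String a = "hello";                  // compile-time constant, interned
            String b = "hello";                  // refers to the same interned instance
            String c = new String("hello");      // forces a fresh heap object
            System.out.println(a == b);          // true  (same reference)
            System.out.println(a == c);          // false (different objects)
            System.out.println(a.equals(c));     // true  (same characters)
            System.out.println(a == c.intern()); // true  (back to the pool)
        }
    }

A candidate who can explain those four prints understands something real about the JVM, whatever their degree says.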


I too have a degree in CS, which required me to 'play' with Java. Most students on the course did not grasp the basic concepts of Strings and ints, let alone JIT. Most left with the ability to "avoid programming and the CLI at all costs" and are still working in jobs they could have done without a degree, paying £16-21K with zero career progression and/or training.



> As the Chinese say, I have lived in interesting times.

Nitpick: this is not a Chinese saying.

http://en.wikipedia.org/wiki/May_you_live_in_interesting_tim...


Something struck a chord with me when, in the online Machine Learning course, a voice said "you don't need to know what a derivative is..."


Final year CS student in a British University here.

I can certainly see where the author's coming from, although I don't find the situation this dire. I'm not British so I don't know that much about how Universities worked and were perceived in society in the past and I guess I might have a slightly different mindset. Anyway, let me explain myself.

We have a modular course structure at my school. Yes, you can pick courses varying from "Developing web applications with Java" to hard CS stuff like compiler design and advanced algorithms. As far as I can tell, no course has been dropped because it was too hard, because students like challenges and take them. The same goes for final year projects. I've seen a student saying that she won't be doing any programming for her project (yeah, WTF), but I also know of more serious engineering projects (like a guy refreshing the electronics and, more importantly, the software of a popular home-built 3D printer, and using the new capabilities to do some stuff I'd really like to have on my own 3D printer), and more experimental projects like mining Twitter for medical drug information (perceived effectiveness, side-effects, usage patterns, etc.).

What I'm trying to say is that there might be some easy paths you could take, but the students who always pick the easiest option are usually the ones who end up failing or dropping out. Sure, some of them graduate, and I have mixed feelings about having the same degree as some of my fellow students. The author says that 'By pre-1990 standards about 20% of the students should have been failed.' Well, in my school about 20% of the students are failed - each year.

Another topic is grade scaling. Yes, most lab and coursework grades are scaled in the first year, and some in the second year. Exams are never scaled! But here's the thing: scaling is always down. It can be argued that labs are too easy if you need to scale the grades down, and it is frustrating to do a perfect job and end up with a 70-something percent mark. But grades are never scaled up to 'turn a fail into a II'.

Finally, some people argue that a formal CS education is useless and out of touch with reality. I definitely don't agree. Knowing algorithms and data structures can give you an edge even for simple programs; knowing that certain research areas and approaches even exist helps you avoid a lot of easy mistakes; labs help you design better and faster because you develop your own process and get to know common pitfalls; reports and presentations train you to communicate better using the proper domain language; and having a clear image of how computers work from the ground up is great when you're debugging.
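
As a small illustration of that edge (a toy example of my own, not from my coursework): the same membership test is O(n) against a list but O(1) expected against a hash set, which matters even in a "simple" program:

    import java.util.*;

    public class Membership {
        public static void main(String[] args) {
            List<Integer> list = new ArrayList<>();
            Set<Integer> set = new HashSet<>();
            for (int i = 0; i < 1_000_000; i++) { list.add(i); set.add(i); }

            long t0 = System.nanoTime();
            boolean inList = list.contains(999_999); // O(n): scans the whole list
            long listNs = System.nanoTime() - t0;

            t0 = System.nanoTime();
            boolean inSet = set.contains(999_999);   // O(1) expected: one hash lookup
            long setNs = System.nanoTime() - t0;

            System.out.println("list: " + listNs + " ns, set: " + setNs + " ns");
        }
    }

Nothing deep, but you only reach for the right structure by reflex if someone has shown you the analysis.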

TL;DR: Formal CS education still useful, just more chances to shoot yourself in the foot. (Yes, I consider getting a first without learning too much to be shooting yourself in the foot.) Study the fundamentals and the hard stuff and you should be better off than a self-taught person.


It's more than 20 years since I graduated with a CS degree and I think my advice these days to anyone considering the subject would be to avoid CS unless you want to go on and do research level work in academia or industry. Of course, that's what I wanted to do before going to University and I did end up working as a researcher in academia for six years before co-founding a startup.

People keep thinking that CS degrees are vocational training programs for developers - they didn't use to be, and if that's what they have turned into, then it's no wonder that they are doing a terrible job.


Then what do you suggest wantrepreneurs should study, if anything? I think this argument about whether CS school is useless has been debated far too much. I do see value in developing the way of thinking and in surrounding yourself with top people, assuming you do manage to get into a top college. And this whole debate about the very high prices of the colleges is ridiculous, because even for a person coming from a really poor family like me, there are enough possibilities to get a free or very cheap education at most of the world's top 20 CS schools, if they do believe you're a worthy enough candidate.


Personally, if you can get into one of the very top universities that have a track record of being the places where successful startups come from, then I'd strongly recommend it.


Would you mind sharing which those are in Europe? Because I've read quite a lot of founder stories, and the only college I'd put on that list is Stanford. UIUC also seems to have a lot of tech entrepreneur and VC alumni.


I'm not sure if there are any in Europe! [NB I'm in the UK]

Note that I'd love to be corrected on this point - I graduated in '88 and co-founded a startup after working in academia for 6 years and the help we got from the University we worked for was laughable (they asked if we wanted to lease a building!). However, we did meet our first angel investor through the university - he had done the same CS degree as me about 20 years earlier.


The conclusions are unfortunately correct; it is the student who is being made to pay for this policy. Bad course modules and bad degrees devalue the whole system and do NOT help those who scrape through but who are then unable to carry the level of skill required through to the workplace.

As an employer who has interviewed and employed countless computing degree candidates, I can say this has simply devalued the word 'degree' to the point where it may no longer get you even to the interview ahead of, say, a non-degree candidate with a tangible track record of real project work or experience behind them.

This is not just a British phenomenon.


What are you looking for in a candidate? Are you looking for computer science? Or Software Engineering?

I hear interviewers expecting CS graduates to know the MVP pattern or similar things, and they're shocked to hear that they don't. That's software engineering, not computer science.


If a candidate has applied for such a position then I don't think it's unreasonable to expect that they've tainted their pristine porcelain skin with -gasp- a few run-of-the-mill software projects.


The SHOCKING truth about a poorly organised educational system run by a hugely unsuccessful government. Living standards in the UK are good partly due to the NHS and private sector jobs, but the corrupt government attitude of take, take, take yet give back little is turning this country into a 'toilet'.




