This is what I was afraid of. I always thought interviews were something where you present your abilities, experience and knowledge. It should not be something you "prepare" for because then you're not showing off your skills, instead you're just regurgitating what you are trained to do.
I do somewhat blame the whiteboard culture, but I guess that's the one that has given the best results so far. I've no doubt that it will change soon.
The idea of having a whole class on how to solve interview whiteboard problems (a skill that is, at least where I work, not very relevant to doing your job) is terrifying.
In design school, we had one class which involved things like developing your portfolio, learning how to present yourself to companies, interviews with Creative Directors, etc.
Given that technical interviews are a common method of partially evaluating engineers/developers, it doesn't strike me as particularly odd for a practical course to exist that helps students maximize their ability to get a job. As you mentioned, it's really only one class out of the dozens of others they'll take over several years. I myself took a number of electives that had far less practical impact on my career trajectory.
I don't think most people have a problem with the class itself. It's more that the technical interview has gotten away from being a practical method of showing what you know, and is now becoming a skill to be maximized in itself, whether or not it actually demonstrates that you can do the job.
To take an absurd example, imagine if for some reason Law firms started requiring an Irish Stepdancing component in their hiring process. I'm sure, broadly speaking, you'd see the most dedicated lawyers practicing away on nights and weekends, and the best Law Schools with the best students would hire the best Irish Stepdancing teachers. But at the end of the day, Irish Stepdancing has very little to do with practicing Law. Similarly, if solving stupid whiteboard problems has so little to do with being a Software Engineer that it can't be learned in the 40+ other courses or internships/PT work that a student will do in their time at University, then the question is why they are even done at all.
This isn't quite so far-fetched as it seems. Perhaps someone tried the dancing thing to see how a candidate handles himself in an awkward position under pressure in front of people. They found some limited value in it and wrote a blog post about it. Then other law firms started using it. It became a way for such firms to assess a candidate's emotional strength under pressure. And now law schools are offering a 1-credit course in "interview dance". Seems to be the perfect analogy to what's going on here ;-)
I’m Irish. WTF is Irish step dancing or Irish stepdancing?
On the correlation of irrelevant things to performance in a demanding job: you do know that law school is at best as related to the practice of law as Bar review courses, right?
At Google, the recruiter who schedules your interview specifically recommends reading books written about passing Google's interviews. It's a little unbelievable.
I would think there is a Venn diagram with an overlapping 'sweet spot': Good employees are probably more likely to get offers, as are candidates that prepare for an interview.
Similar experience: I spent the same time preparing for the interview as for any other company, plus a couple of hours browsing some slides from university classes on data structures (sorry Google, you are not a special snowflake) - rejected after the first call :)
The idea that you now need to take a class to learn how to successfully interview with tech companies in order to prepare for interviews is terrifying.
It's nothing new when it comes to competitive programming, which whiteboard interviews are just watered-down versions of. Russian/EE/Chinese universities have been doing that for ages:
I actually had a job where it was relevant. Solving whiteboard problems is half the job of a nuclear field operator. I had to do boards with the captain of the ship, senior enlisted, and the reactor officer to qualify watch stations. The other half of the job is cleaning bilges till you make first class or weasel your way into an office job. It's not a fun job by any means but it paid for my College so I could do something I enjoy, Computer Science.
This isn't that far removed from an algorithms or data structures class, or what you would learn in one of those programming olympics type prep courses that people do.
The reality is that the majority of tech interviews today are filled with questions/tasks that have little or nothing to do with the actual job you are going to be assigned. In fact, an average programmer doesn't actually need to know a single piece of computer science or math (almost all such stuff is abstracted away by common libraries and frameworks; whatever isn't can be looked up on demand), nor to memorize anything (thanks to auto-completion and all the docs and Q&As being just a couple of clicks away), let alone to have skills in solving puzzles. Yet every other interviewer will ask you to write code on paper and give you puzzles on computer science, physics and everything else.

They will also ask you about your previous salary and the reason you left your previous job (both matters are purely private IMHO), why you have chosen them (ridiculous, I just want to earn a living in a sound company and have sent my CV to a number of companies to land the first one that feels fine and wants me), what you want to achieve in n years (never thought about this, just want a job), etc.

And a funny thing: a friend of mine (an experienced developer with a history in multinationals) once told me he was rejected for being "too unsociable" and "talking too little", which is the perfect opposite of what he actually is (dozens of his friends would undoubtedly confirm).

Puzzles are fun, and intelligence is reasonable to benchmark; nevertheless, the interviewing thing is usually done at least slightly wrong IMHO. I just hope some scientific research emerges in this area in the near future to help rationalize the hiring process. As for myself, I ask for nothing but the candidate's opinion on some relevant subjects (the opinion itself doesn't really matter much, but it can indicate certain degrees of awareness, understanding, intelligence and motivation, and the talk can somewhat demonstrate the personality) and their GitHub profile.
I have nothing in my GitHub profile. Some toy repos of no particular significance, one contribution to Rust that I've made before realizing it takes too much of my spare time... and that's it. I don't think that should make me un-hireable ... people that have kids tend to not have much spare time besides work, learning new stuff to stay relevant, and taking care of said kids. Not to mention, my company has me going through some bureaucratic process for any sort of open-source contribution that I intend to make.
> I have nothing in my GitHub profile. Some toy repos of no particular significance... I don't think that should make me un-hireable...
Same thing with me, but I don't think it's hard to spend some weekends writing something to express your motivation and illustrate your coding skills.
> people that have kids tend to not have much spare time... my company has me going through some bureaucratic process for any sort of open-source contribution that I intend to make.
I understand your point perfectly but what I don't understand is how does it make sense to hire a coder whose code (a real (though not necessarily big) app code, not a 5-minute puzzle code) you have never seen.
> what I don't understand is how does it make sense to hire a coder whose code (a real (though not necessarily big) app code, not a 5-minute puzzle code) you have never seen.
Well, let's break this down a bit:
- if you hire a junior, all you care about is smartness & maybe culture fit. You get both of these better through interview than GitHub repo
- if you hire a senior, you hire for social/organizational skills, problem-solving skills, diverse expertise etc. GitHub is not particularly relevant
GitHub makes sense when you look for someone with deep specific expertise in a narrow, open-source-related technical area. Which is rare. I think it's more often used to check that "this person has coding style preferences similar to mine", which is IMO not really a great idea.
Makes sense, thanks. We are just speaking about different things then. All the times I was looking for coders, I was interested in neither juniors (whom I would have to teach) nor seniors (who would lead the team) but mid-level developers (who would just write good code given a very specific task and very detailed design).
Having less time to "learning new stuff to stay relevant" makes us de-facto less hirable. The "github" thingy is just a proxy (as is the "whiteboard CS exercise").
I actually do learn new stuff. While on the current job, I've been doing:
- Code-hinting (semantic analysis) in an IDE
- Product management
- Low-level JIT codegen & optimisation (for ARM Neon & Intel SSE)
- Some data science (technology/product evaluation for an 8-digit acquisition)
- Early contributions to a web standard
- Distributed systems, big data (mostly graph processing at quite big scale - Facebook-big)
and probably other stuff I can't remember right now. I've worked in half a dozen different programming languages. I'm pretty sure I've "stayed relevant" more than the average Joe, but you wouldn't know it from GitHub.
I suspect that GitHub repos, SO accounts, toy problems etc. are simple tools interviewers use to make some form of rigorous decision when the applicant's CV gives them little concrete evidence of skill. A couple of years of job experience will speak far louder than those at any company you would want to work for - especially if you were employed in a similar position at the time.
StackOverflow and Github repos are a wealth of information about a possible applicant. There's the code... yes. But there's also the comments on Stack Overflow and issues on Github.
Given that the person is presenting StackOverflow as an example of their profession, comments on SO would be very telling of how the person would respond to critiques of their code and how responsive they are to questions. Much can be seen in the attitude and professionalism that would also show up in email. Things like:
* Do they write in complete sentences?
* Do they use spelling and grammar that would be appropriate to send to a client? A director?
* Do they rage against all that is wrong with the world over little things?
Likewise, with GitHub there are the issues they have logged on other projects - do these issues contain sufficient information about the problem so that the person working on the issue can diagnose it? For issues on their own projects or projects they contribute to (and have taken ownership of the issue), are they responsive, and do they provide useful guidance to help someone file a good bug report?
I'm not too concerned with the code. That's easy to fix. But a person's attitude and providing these sites as an example of their professional mindset is something that can be difficult to get at in an interview when they are only trying to put their best on display.
I don't think these kinds of coding interviews are given to average programmers though, and if they are, I'm sure they are taken with a pinch of salt (or with a lot of help from the recruiter, or using simple tasks such as FizzBuzz [1]).
In my experience, having been an average programmer (and maybe still being, I hope that by now I'm at least slightly above average) who worked in The Netherlands at a "system integrator", a web dev "sweat shop" (some projects using 10 year old tech, relatively high staff turnover) and even a startup, such companies don't ask for whiteboard coding, however take home projects are quite common.
I also interviewed at some more "hot" unicorns in Japan, which do have these kind of questions (using a dehumanizing online test in fact).
The FizzBuzz test is actually a pretty great interview question, but most interviewers miss one crucial ingredient: time pressure. The original idea was that any competent programmer should be able to solve it in under 1 minute (the only limiting factor being your own writing/typing speed). It is a good indicator of familiarity with the basic programming constructs (loops, conditionals, print statements). It's like writing down the alphabet - you should be able to do that without thinking. I have seen many developers who fail to do that.
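For reference, here's a minimal sketch of what that one minute is supposed to produce - in Java, assuming the classic "count 1 to 100, Fizz for multiples of 3, Buzz for multiples of 5, FizzBuzz for both" formulation:

```java
// Minimal FizzBuzz sketch (classic 1..100 formulation assumed).
public class FizzBuzz {
    public static void main(String[] args) {
        for (int i = 1; i <= 100; i++) {
            if (i % 15 == 0) {              // divisible by both 3 and 5
                System.out.println("FizzBuzz");
            } else if (i % 3 == 0) {
                System.out.println("Fizz");
            } else if (i % 5 == 0) {
                System.out.println("Buzz");
            } else {
                System.out.println(i);
            }
        }
    }
}
```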
> I have seen many developers who fail to do that.
How do they happen to be developers then? Doesn't being unable to implement a FizzBuzz quickly imply they can't code anything that is more complex, by definition?
By the way, although I do understand programming (OOP, FP, SQL) and can develop a considerably complex app from scratch in an IDE like Visual Studio with ReSharper or IntelliJ IDEA that expands code snippets, offers auto-completion and highlights errors on the fly, I doubt I could code anything more complex than a "hello world" using just Notepad and expect it to compile and run without errors immediately - I tend to make small typos and miss some syntactical elements all the time thanks to my ADHD (but modern IDEs solve this problem). Also, a person standing behind my shoulder watching me code and counting time would decrease my performance by an order of magnitude at best. Does this count as a failure from your point of view?
I will highlight the most glaring problem I find in your take.
> FizzBuzz test is actually pretty great interview question [...] you should be able to do that without thinking
If you are going to code for me, I'm not interested in what you can do in 1 minute, I'm interested in what you can do in 1 year. I want you to spend more than 1 minute just thinking about any problem that is remotely worth solving. And that's the underlying problem: FizzBuzz is not representative of the problems I hire you to solve, and solving it fast is not representative of the approach I want you to take for solving the problems I hire you to solve.
FizzBuzz can be a reasonable point to start a conversation about approaches and style. But that's true of almost any piece of code - I used strcmp() for a number of years.
If you are not able to write FizzBuzz without thinking, that means you've probably written no more than a handful of for loops and if statements in your whole programming "career". That's a very bad indicator for someone who applies for a programming job.
It's a fitness test. Are you going to bring me value, or are you on the verge of burnout with outdated skills, trying to live off your past? "What did you do for me today?" is what matters for company to stay afloat and brutal competition is the way to achieve it. What kind of society it creates is another question.
I'm a proponent of whiteboard tests and am in charge of conducting them for potential hires on my team.
However, I don't see how they guard against an outdated skillset. My company employs five Infor Sys21 RPG programmers who thought the term "web service" was synonymous with SOAP until I ran a workshop. 4 out of 5 worked most of their career for IBM and all have bachelor's or master's degrees in CS, EE, or Math. With a weekend of refreshing they would definitely pass a whiteboard for me (although not necessarily with the most optimized solution). They are all over 50, btw, and work slow and steady (but efficient) with no burnout in sight.
I can come up with a "good" solution for nearly every algo question on leetcode and have very in-demand skills, as well as being a top SO poster and a core open-source developer on a Java platform that is #1 in a Gartner magic quadrant. I'm 27 and on the verge of burnout, which has happened as a result of getting large Adderall/Vyvanse scripts (legally) and being on the computer for nearly 14 hours every day.
I read this and I wonder if you will understand what I'm saying when I tell you that it is specifically BECAUSE proponents like you fuck themselves up beyond comprehension that I consider whiteboard tests to be a massive red flag when I'm interviewing at a company.
Your career should be a source of service, joy, growth and challenge for your entire life.
Don't destroy it for yourself (and others!) right at the beginning.
If you can afford to, take a 6-12 month break, travel the world, meet new people, get some hobbies that are the opposite of programming, like model photography or playing/making music with multiple people. 14h/day behind the screen is pathological and you need to learn how to counteract it. You might get some peak performance out of it, but it's short-lived and won't help you in a career (whether there is any long-term career in programming is another question).
Businesses with this attitude don't need software specialists; they need to hire through (sub)*x-contracting. Just go to a global chicken farm like capgemini/accenture and get the required workforce. This solution has multiple downsides though.
The same kind of companies that, when the time comes to fire people, will just randomly choose X people from each department, regardless of their skills or the sweat they've given to the company.
So much for the fitness test, maybe we should have one for employers as well.
You want to see public projects on my GitHub profile? I'll work on my public projects during office hours, deal? There are companies which become hysterical at the mere mention of the words "public", "open source", and "GPL", and doing it while already employed will undoubtedly initiate multiple meetings with a bunch of security and legal guys staring judgmentally at you. I hope such companies are more forgiving when a candidate has no public GitHub/GitLab/Bitbucket.
> FWIW, preparing for interviews is standard practice irrespective of the industry.
This is simply wrong. I suppose if you've spent your whole life as a developer, it may seem like it's done the same way in other industries, but that's not the case.
Typically, how you might "prepare" for interviews in other industries is... learning about your potential employer. And maybe reviewing your accomplishments. That's it. And most people don't even do those things. There's none of this spending hours and hours reviewing textbooks from your college classes because you've been in the same job for 5 or 7 years and have forgotten algorithms you never use in your job. It's simply perverse.
Software development is the only industry I'm aware of in which technical preparation for job interviews is not only advisable but increasingly necessary for interviews at all levels of the career hierarchy.
Even in other engineering disciplines, this kind of bullshit is unheard of for any interviews except maybe your first interview out of college. And in industries like law, medicine, etc., if a potential employer asked you to perform some random task from a highly technical and complex subject from med school or law school, it would be very unusual.
I agree completely. And this is why an industry-wide respected, non-profit, credentialing mechanism increasingly makes more sense to me. There are a hundred pitfalls to implementing it and some we may never fully overcome, I'm aware, but I think they're worth grappling with to get this very technical and increasingly high-stakes line of work onto a more solid career footing with its big brothers in the engineering disciplines.
I've been coming around to this as well. I don't object to an industry-wide exam based on data structures, algorithms, binary operations, and design. My main objection is that the exam is taken repeatedly at each interview and scored under secretive conditions.
Actuaries aren’t expected to prepare for and pass a vector calculus and linear algebra exam every time they interview, but they are required to pass a proper exam consistently administered and graded.
This might be a big improvement in our field as well.
Yep, actuaries have the right idea: the exams are pretty hard and require a lot of preparation, but once you're done, that's it. Actuarial job interviews are more about what kind of person you are and whether you'll work well in a team.
Quantitative finance is quite similar in this regard. Maybe I don't fly high enough to meet people who wouldn't get a technical interview, but I assume that Jeff Dean wouldn't either.
> And in industries like law, medicine, etc., if a potential employer asked you to perform some random task from a highly technical and complex subject from med school or law school, it would be very unusual.
I'm not a developer, so maybe this sounds absurd, but as someone who has good understanding of how medicine works, maybe this industry could use a few standardized exams/certifications?
I think it's possible that, because a practicing physician will have passed the MCAT, USMLE Step 1/2/3, and have obtained a license, they don't have to deal with this sort of bullshit in interviews. It seems like the idea of a "licensed software developer" may never exist, but maybe that's why the hiring process is the way it is.
> maybe this industry could use a few standardized exams/certifications
I doubt it. For example, having the OCJP doesn't prevent interviewers from asking basic Java questions. It's like they are telemarketers with a prepared script who aren't flexible enough to skip some parts no matter what.
> Typically, how you might "prepare" for interviews in other industries is...learning about your potential employer.
It has always shocked me that people don't instinctively think that doing this is an absolute necessity for every single job interview. I have very often been met with a totally blank look when I've asked candidates what they think the company I'm interviewing them for actually does, even when phrased as nicely as, "Tell me a little about what we do and why you'd like to work for us" - surely not a hard question, you would think!
Not so much hard, more that it sounds like a complete waste of time and could be signaling to people that this is one more non-technical/HR interview they have to get through before getting to talk to people they'd be working with and who can fill them in on details specific to their prospective job or team. Whatever you're trying to figure out with that question, there's gotta be a better way to probe that, that doesn't make it look like you're wasting the available time.
This only works depending on the scale of the company.
If the company is huge and involved in many things, there is less engagement from each employee with the overall mission of the company. This is even more pronounced when employees can't even see the fruit of their work in society, like when the developer workforce of a company is spread across many countries but the clients are concentrated in only one country.
I think expecting candidates to have a basic idea what the company does is fine.
But unless the company is a small startup, expecting them to be engaged with the mission is unreasonable.
There are many reasons for candidates to be excited to work for a company:
- friends
- great team fit
- technological challenges or preferences
- work-life balance
- location
These and others contribute to an employee doing his job with passion.
(note: money is not one of them. there is a study that showed that above a certain level for a person, no amount of extra remuneration will result in an increase in productivity)
And I imagine this is the case even for pinnacles of vision like SpaceX. I imagine employees at SpaceX do not care about the future of SpaceX; they care about putting a man on Mars, making space available cheaply, pushing the boundaries of human exploration and of technological capability, Musk's vision and enthusiasm. But not SpaceX as a company.
People should really stop expecting companies to have emotions. Companies do not care about people so people should not care about companies either. People care about other people and society at large.
I was talking about this for the purposes of the job interview; I thought that was obvious. Also, you want to look a bit at the company's financials to get a feel for any equity-based comp and also whether the company looks like it's in trouble - e.g. Carillion in the UK.
> Whatever you're trying to figure out with that question, there's gotta be a better way to probe that, that doesn't make it look like you're wasting the available time.
I would expect during an onsite that interviewers would want to tell the candidate about what the company does and why it is a great place to work. If you need to ask candidates those questions during an in-person interview then I recommend adjusting the filter prior to that step or consider that you might be getting blank stares for a couple of different reasons.
And yet every hiring manager on here can tell you stories of hiring someone who passed all the screens but on the job could not write a single line of competent code. The industry provides too much cover to fake it with a title that sounds technical but isn't. You might hire a musician based on hours of discussion of their deep knowledge of violin theory, but it would still be a mistake without hearing them play.
On the same line of thought, you wouldn't hire a violinist just because you heard them play the scales.
In music, you often get handed some sheet music or asked to prepare something.
In the cooking industry, you throw them at a kitchen and tell them to impress you.
In the programming industry... We ask people to write toys and logic puzzles. That isn't what they'll spend their days doing, that's just practice, like scales.
That’s probably because you can’t write a production-ready service or application from scratch over the course of an interview.
A violinist might be asked to play a technically difficult but short piece during an audition, even when most of the music they would play as a member of the orchestra would be less challenging and take a much longer time to play, with plenty of time for rehearsing ahead of time. Likewise, a software engineer is ideally asked to solve a small but technically challenging problem during the interview.
> A violinist might be asked to play a technically difficult but short piece during an audition,
No. Compared to what we do as software developers, the correct analogy for a violinist interview would be to ask them to whiteboard all kinds of obscure music theory principles, like the set theory underlying serialism and Arnold Schoenberg's twelve-tone technique.
Depends on what you mean by "preparing". There isn't another engineering industry I can think of where an engineer with more than a couple of years of experience is going to be asked to whiteboard this sort of crap. I mean, my aerospace engineer friends don't have to prepare by reviewing their PDE, vector calculus, and linear algebra classes, which is what the analog to programming interviews of this sort would be.
My sister is a travel nurse. She does not prepare for interviews and typically has several a year. Her entire interview process goes: call the agency and tell them she is looking for a new position, have one to several 30-45 minute phone calls with hiring managers at the hospitals she is interested in (she actually took one once in the middle of a wine tasting - and got the job), then wait to hear back.
That's what it's like for many other in-demand and well paying industries (she's mid-20's and makes low six figures in San Jose). Stop pretending like everyone else has to deal with the same bullshit as you, hiring in tech is BROKEN. Other industries do it better.
People's lives are dependent on how well she does her job. Are the hoops you have to jump through justifiable for how "important" your job is?
As an aside, what is the range for "low six figures"? I hear this term a lot, and it seems ambiguous to me. If I hear "low 90's" I can reasonably assume between 90-93, but what does low six figures mean? Is it 100k-103k as an analogue to the 90's example? Or is it 100k-300k? If the former, then it's probably most useful just to say 100k. If the latter, then that range is too wide to mean anything.
I agree with you that as a term “low six figures” is odd. 300k should mean low six figures.
In the vernacular, with regards to salary, it is used to describe a range between 100 and 200k. So low six figures tends to mean closer to 100k. That’s very consistent in usage.
It is a little odd though, I would agree about that.
Indeed. If it was illegal to work as a programmer without a Master’s degree in Computer Science with a GPA above 3.0 the interview process would not look like the current one.
The industry wouldn't look the same, either: it wouldn't be nearly as profitable or widespread as it has become, and nobody but CS academics would care about it.
Quite, we should also eliminate nurses. The minimum qualification to work in healthcare should be a medical degree, just like we shouldn’t allow any car that’s worse in an accident than a Tesla on the road.
There is not one single shared course between the two degrees. If you can find a course shared between medical and nursing students in all of Europe I'd be surprised.
Could you furnish me an example from Western Europe please then? I am familiar enough with the Romance languages to read Portuguese or Spanish if there’s a university with any/substantial overlap between nursing and medicine.
What usually happens is that nursing faculties sometimes share a few lectures with medicine faculties, for things like biochemistry and biophysics.
These are not examples one finds online; rather, they're found in corridors and on schedule notes.
What I can provide as an example is that nurses and doctors have equal access to many master and PhDs. Check "condições de acesso" in some of the links from the page and you will find "enfermagem" as accepted degree.
I've been through an acquisition or two in my day (on the acquiring side). I'd echo knicholes' comment about that and go further: in my experience code quality and degree credential don't seem to have any relationship.
The industry is different, that is another issue.
Programmers do not deal with just small variations of mostly the same code base (the human body) which broke down (got ill) for some reason, where most bugs have pretty well documented symptoms and prescriptions for how to fix them.
> hiring in tech is BROKEN. Other industries do it better.
Can you explain how other industries do it better and come up with a better, plausible, non-fantastical way to interview programmers? I'd love to know how to interview people "better", but it still has to be something where I can later present evidence that my decision is based on, in some kind of report. It would have to allow multiple people at my company to interact with a candidate in a half day or less. Copying what another industry does would be great, if it worked, because it'd help convince people it was a good method to try out.
> Are the hoops you have to jump through justifiable for how "important" your job is?
I never thought a 1 hour phone screen and a few hours onsite was particularly onerous, and based on how much the job pays, it seems well worth it.
> People's lives are dependent on how well she does her job.
I'm not sure why you seem to think that saving lives correlates with strictness of interviewing, especially with the 2 jobs being so drastically different.
What is unusual is that we essentially ask people to set aside and forget everything they know about how to do their jobs, and instead perform a separate set of skills which, since they aren't used on the job, must be carefully re-acquired whenever it's time to go interview.
Any job that:
* Is attractive enough (compensation, lifestyle, etc.)
* and has a larger supply of qualified applicants than desirable positions
Will result in applicants preparing for however they are interviewed, tested, etc. This isn't unique to software engineering interviews. People prepare for interviews or exams in finance, law, medicine, etc. They all have a system that is gameable. It might be newer in software engineering interviews, but that is mostly because CS is now incredibly popular and top talent is compensated very well.
The funny thing is that software engineering does this while also complaining of labor shortages. Or, if we're being a bit more honest, it's not funny, it's cynically self-serving.
I'd be willing to take either side of a debate for "is there a shortage of people who, when given tight specifications for a piece of software (write a function that takes X as input and provides Y as output) that enumerates all of the corner conditions can produce a program in a language that they are familiar with."
At the same time, the "person who can work with the business user on the software, think about the architecture of it, identify the design necessary, come up with the estimate that actually matches the time frame that it will be done in with a reasonable error... and produce software that takes X as input and provides Y as output while being aware of where the edge conditions may exist and ask for clarification on how it should work"... I believe there is a distinct lack of that portion of labor.
Furthermore, there is also a lack of people who are able to move from the first labor pool to the second, and a lack of mentors who have the time and ability to help that group move to the second.
I don't think it's incredibly difficult to hire an entry-level person as long as one sets the bar low enough and has people within the organization who are capable of providing the design. On the other hand, it is very difficult to find the people who can give the necessary instruction to the entry-level people to allow them to become productive within their ability.
As an aside, I also find that within the entry-level group there is a sizable portion that have the attitude of "I learned language X and that was hard enough, I'm going to stick with it and not learn anything else." That X can be found for all languages, and none has a monopoly on it. However, it is disconcerting for me to see those individuals... I started out as a C programmer, then Perl (full-stack web - some JavaScript in there), then Java (enterprise), and then standalone Java (a Swing application)... and while I'm still a Java programmer, I can see other languages looming on the horizon. Java will become the new COBOL, and while there are still COBOL programmers out there, it's not something one wants to get stuck in for another two or three decades waiting for that last app server to be turned off before they can retire.
That portion of labor is lacking because it is not rewarded. It is not searched for during interviews, it is not valued in our mythology (what we consider cool on blogs and forums), and lastly it is not rewarded by employers at work. Notably, actually and predictably matching a timeframe is devalued, while people who end up pulling all-nighters due to bad planning and lack of communication/negotiation are seen as heroes and rewarded.
Why would the culture develop skills that make you less rewarded?
The ability to teach juniors is not searched for nor rewarded either, and social skills are even sometimes treated as something that makes you less good as a programmer.
> "I learned language X and that was hard enough, I'm going to stick with it and not learn anything else."
There's also a sizeable portion of companies that have the attitude of "we want someone who's really passionate and good at learning. oh and they need to be expert in language x. oh you're not an expert in x? sorry, we can't wait for you to learn it. we don't care if you learnt something else"
It's something I've thought about for a bit and want to turn into a proper blog post. So far, it's just a post over in a Slack channel associated with That Conference - https://slack-files.com/T0CEWBUEP-F7J3SNNNT-00ff1af0fe . I need to come back and revisit it at some point.
That you think interviews and exams are interchangeable concepts illustrates the problem. Nothing wrong with a rigorous filter at the gates of the profession, but lawyers don’t need to spend months studying for each company’s half-assed version of the bar exam each time they change jobs. Our industry is dysfunctional because, unlike law and medicine, no one trusts the credentialing mechanisms.
Credentialing isn't even the beginning: no one trusts other software companies, either, apparently. An engineer at Google will face a battery of asinine CS trivia when he interviews at FAAN, just the same as he faced interviewing at G. Hell, he might even face similar interviews if he applied at non-FAANG companies.
What credentialing mechanisms? As we’re told over and over a CS degree is not a trade qualification. Having a CS degree is no guarantee of being able to program either. The example of Microsoft and Cisco qualifications is not stellar either. They’re often seen as a negative signal. And most working programmers don’t have a relevant credential unless you count a Bachelor’s degree which is more a certificate of middle class membership than necessary to work as a programmer.
Our industry also has quite a lot of self-taught professionals, who can perform as well as their college educated counterparts, which is something you won't really get in professions like Law or Medicine. I think that's a good thing, when it comes to software.
It is quite different to have spent 5 years doing nothing other than building up engineering knowledge, and just learning a few things to get the job done.
It's not "just learning a few things". Some software engineers have years of experience on the job despite no CS education. In the end it's about how much value you bring to the company. If your diploma helps you deliver that value, great. If not, your diploma is irrelevant.
I'm always very wary of companies that tell me I'll earn more money because I have a PhD completely irrelevant to the job I'll be hired to do. It signals that they're valuing the wrong thing, and that doesn't make me want to work there.
It is the knowledge gathered over 5 years of studies, the activities that put that knowledge into practice, and the certification of the quality of the teaching.
So, a credential? In the countries and states where the title is protected (it's not everywhere), you often have to pay dues to a professional organization to claim the title. It's not enough to have the diploma, you have to pay every year too.
You didn't get that I was hinting that the countries that don't have strict rules about who can call themselves engineers invented the core tech behind computers; ditto Ethernet, which was the US and UK.
It's an unfortunate situation honestly, because it pollutes the term "engineer" (at least for Europeans), when it's being conflated with "I just know how to code".
While it might seem slightly controversial to some, it would make more sense to argue against it instead of just downvoting.
Would you disagree if people from the US did not need to pass the bar to call themselves lawyers, or the equivalent for doctors? Is it just because I compared this specific thing from the US in a negative light compared to EU?
I'd make a different point: the CS degree isn't a hoax, but it's not strictly necessary for software engineering work. In a real world application systems very quickly escape the boundaries of undergrad CS facts.
I don't think so. Computers and software are simply more accessible for laymen to dig into and teach themselves something.
You can't learn and practice surgery yourself. On the other hand, you can teach yourself software engineering. You can be practitioner with nothing more than a computer - and so many people are.
That doesn't mean CS degrees are a hoax - it just means the field is more accessible to those without college degrees (though its definitely going to be a more difficult path).
I think the OP is implying that this is creeping into top American university curricula. Having a Stanford degree in Comp. Sci. should correlate with interviewing success at relevant job openings for new graduates. This kind of "teaching to the test" and time spent on "job preparedness" is typical of community colleges and trade schools. They serve an important function, but it's distinct from the mission of a place like Stanford. As for the other aspects of job hunting, the school's career services department can and should handle the soft skills and referrals.
This comment reeks of elitism. In my experience, new grads from top American universities are incredibly intelligent, but their community college and trade school counterparts score much higher on job preparedness, an incredibly marketable and bankable skill. As someone with a top American university degree, I’d have loved a course in job preparedness, and have learned a ton from coworkers who have succeeded as Software Engineers and also attended community colleges and trade schools.
Universities exist to search for truth and to train future generations in the methods and results of the ongoing search for truth. We give these institutions donations and tax money so that they can be independent of the economy, and work on what the market won't. You could look at the National Science Foundation and other federal grant programs as a form of basic income, providing grad students with a minimal standard of living so that they can do the important intellectual work that private capital doesn't want to buy.
Job preparedness training is looked down upon within universities because it's the banal economy poking its nose where it doesn't belong. The industry or a professional association can and should develop job-preparedness-focused training programs to augment (as in law and medicine) or even replace (as in HVAC repair and precision machining) CS undergrad. Nothing wrong with that. They just don't belong in academic departments. Those are for something else, and industry's training needs shouldn't be subsidized.
An employer should value a relevant trade school degree more than it values a Bachelor's, unless it turns out that well-rounded intellectuals with exposure to the underlying discipline perform better than those with direct training in the relevant activities.
Thank you for taking the time to write this - very well thought out and well reasoned.
I don't think one can ignore the disconnect between the stated charter of top universities (basically, to do research) and the motivations of the majority of undergraduates who actually attend them (prestige, which hopefully leads to a high-paying job). I doubt that all donors give these institutions donations to further their research goals. I know for one that if I were given the option to focus the impact of my donation, I'd readily choose to have it applied to a practical, skills-focused program within the research-focused university that I attended. Furthermore, the undergraduate programs of top private universities in the United States aren't funded by taxpayer dollars - they're funded by sky-high tuition. Since the class the OP shared applies to the undergraduate curriculum at Stanford, a tax-money argument isn't relevant here. If the course were geared toward Masters or PhD students - different story.
Finally, universities seem incredibly resistant to re-evaluating their place in society in the 21st-century economy -- at least in the U.S. From childhood, a college degree is pushed as a necessity for financial success. Top universities like Stanford are sought out not just for their academic rigor, but for the network and prestige afforded to their alumni, who can parlay these things into a strong career path and a high-paying job. Universities seem deaf to this reality. I'm sure they understand it, but for whatever reason are reticent to acknowledge that their role in society and in the economy has evolved since their founding.
I think Stanford should be applauded for offering this course - would love to know how many students registered.
The reticence may be frustrating, but it shouldn't be mysterious. Why do we have libraries? Symphony orchestras? Art museums? We should not compromise university to meet workforce training needs, any more than we should pave over Central Park for office space.
That's not to say we ban career training (or office buildings), but some space is and ought to be reserved for other things.
Students seeking higher education for purely economic reasons are wrong to do so, and a gauntlet has been thrown to the business community to launch serious competitors that more directly address its training needs.
I think you have the Central Park analogy backwards here. What Central Park provides (nature and openness) is scarce among what surrounds it (an incredibly urban and dense landscape). On the contrary, research institutions compose the majority of top U.S. universities, and are not in danger of being supplanted by something else should they adapt their charters. Similarly, NYC wouldn't be putting its urban character at risk by zoning more parks.
I have trouble understanding why there can be no middle ground between the status quo and pure vocational training. As I mentioned in a comment below, most arguments against middle ground that I've heard boil down to, at the core, nothing more than circular reasoning like, "this is the way it's always been, and we want it to stay this way because things have always been this way." Why don't universities make an addendum to their charters to maintain the goal of being world-class research institutions AND provide pragmatic, real-world training? What sacrifice would be made? Would the quality of university research truly decline?
Whether students are wrong to pursue higher education for purely economic reasons is a matter of opinion, not a matter of fact. What is a fact is that many students do pursue higher education for purely economic reasons, because society tells them from a young age to do so, and it's reasonable to assume that a degree from a top institution in a technical field will lead to a high paying job. The business community certainly could do more to shoulder any perceived burden for professional training and/or certification; however, traditionally research-focused universities should also work to better support the motivations of the majority of their undergraduate (and paying) student body, as Stanford has done here.
>What Central Park provides (nature) is scarce among what surrounds it (an incredibly urban and dense landscape). On the contrary, research institutions compose the majority of top U.S. universities, and are not in danger of being supplanted by something else should they adapt their charters.
What universities provide (liberal education, basic research, etc.) is scarce among what surrounds them (capitalism). Time is finite; the more of it we spend on workforce training, the less of it there is for an undergraduate education's current content.
Maybe the current content is not worthwhile, so we may as well gut it and reuse the facilities/framework for vocational training, but surely you recognize that doing so makes the current content less available.
>traditionally research-focused universities should also work to better support the motivations of the majority of their undergraduate (and paying) student body
Customer-service mindset is a poor fit for education. If "the customer is always right," why bother with teaching?
This mentality is why an entire generation of college graduates is completely unemployable. If you're going to take this approach, fine, but let's be honest about it and stop telling kids that university exists for any reason other than to let rich people futz around with intellectual navel-gazing, and let's stop sending kids to college when what they really need is marketable skills.
Exactly. I don't understand why universities aren't willing to make an addendum to their charters to be both research and job-skills focused. I've yet to hear a good argument for why this can't be done. In my opinion, most arguments boil down to "because we've never done it that way."
Why should interviews test your ability to solve a problem you never expected to have to solve, rather than test your ability to prepare, research, and teach yourself?
It's not going to help you when something out of left field comes at you.
For example, a couple of years into my first job I got to work to be told we had just bought an A0 digitizer (cost about 2x my then salary): "I'd like you to hook it up to the PDP-11 and get it working." I should point out there were no drivers or software.
I even had to make up the serial cable, then write the drivers using some of the more advanced bits of RT-11 to run two processes, with the driver interrupting the main program when data came in on the RS-232 port.
To this day I haven't asked anyone to write actual code on a whiteboard. Recently I had a chance to interview a Stanford grad who had previously worked on embedded systems at some start-up. I asked him to draw the system-level architecture of his product in an OSI-layer format - where the HAL/firmware, RTOS, insys mem, db, application, and UI sit and how they communicate with each other... and this guy was staring at me like I was asking him to jump from the 10th floor. Then I thought, did I do something wrong? I mean, if you have really worked on/experienced something then you should know that thing thoroughly, and then you don't need any special prep for interviews.
I've never felt comfortable answering questions like those, because the designs are owned by the business. Even if I made it, I don't feel like I have the right to hand a design for my current employer to my next employer.
I don't think drawing just a block diagram of the system means giving away the trade secrets of your product. I didn't ask him to draw flow charts of algorithms and business logic, and I don't even do that.
I have seen many interviews purposefully conducted by rival companies where interviewers are specifically asked by the company's marketing team to drill down with the interviewee on particular functionality and get details of business algorithms, system performance, etc. Of course, this happens when there is cut-throat competition and few players.
I think it's rather silly to "prepare" for an interview, but IMHO what they expect at Google/Facebook/Amazon/Microsoft is not something you should really need to prepare for if you took the time to really understand the algorithms and general techniques you study as a CS undergrad.
Rehearsing at a relaxed pace over a single weekend should be more than enough, particularly to get used to discussing your ideas and using a whiteboard.
You're absolutely right. The problem with these kinds of courses is that you will now be competing with people who have been specifically taught how to prepare for and crack technical interviews.
I am pretty sure that they will perform better than people who just spend a single weekend rehearsing.
I completely agree. Every time I start a job search I dust off my copy of "Cracking the Coding Interview", do a half dozen problems on HackerRank and I'm good to go. Sure, it's not exactly the skillset a professional software engineer needs -- but if you're a professional software engineer it really shouldn't be that hard.
Wholeheartedly agree. These companies are most likely trying to understand the 'why' behind your algorithm choices. Of course, there are exceptions, but largely this is the case. Understanding the why only comes to you if you really understand the algorithms and are able to map those algorithms to use cases.
Either I’m a bad student or you have an inherent ability that I don’t have. I’ve interviewed at all of those companies and you need a lot more than a weekend to succeed - even my friends with offers took more time to study than that!
Note that it doesn't just take passing grades and a single weekend.
Gaining deep understanding is the trick, and it comes from constantly being exposed to CS-like problems.
This doesn't need to become something intensive like studying before exams; in fact, having a deadline might be really bad for it, as you won't think about each thing for enough time, and the pressure makes you less prone to wonder about the problems.
It just needs to get you passively thinking about a problem, mostly using your "background brain-threads" and taking the time to ask yourself deeper questions around the problem and the data structures used: which problems are similar? What are the properties of the problem that enabled some approach? What's key for the data structure to be useful on that problem? Are there lower/upper bounds to the solution? Why doesn't something work?
Some quick research later on the problem might show you a better way, and then the right question to ask is what you missed that would have let you come up with that. Did you ignore some property of the problem that the "right" approach exploited? Was it just something you didn't know about?
Going after those kinds of questions will make you better at analyzing problems and coming up with a strategy to solve them, and once you have that clear, getting it written shouldn't take much effort.
Software engineering does not consist of preparing for and then giving short/intense performances.
Plenty of fields do - performing arts, film, trading, etc. all involve some form of short, intense, expensive activity to which you show up prepared and work in a burst of superhuman activity. But software isn’t like that at all - you get a problem and you dig into it, mull it over, research, etc. at your leisure until it’s done. We’re novelists, not actors.
This is true if you have very junior level responsibility, but as soon as you need to take any kind of leadership role others will be looking to you to contribute in a manner that is not ad hoc.
Also, you mention research. Research and preparation are synonymous.
Nope, not even slightly. Research is something you do after you know the problem at hand, to see if you find anything helpful. Preparation is something you do before you know the problem to minimize reaction time once the problem is revealed. Preparation is wildly inefficient, as it involves studying a bunch of material that will not turn out to be needed, just in case.
If you're writing software under the kind of time pressure where consulting reference material is unacceptable, something is deeply wrong. Most software engineers, most of the time, should be able to research topics as they come up rather than prepare (beyond the standard preparation in college).
Making long-term plans, building consensus around them, etc. is important, but is nothing like practicing for whiteboard interviews.
So wrong in so many different ways. The questions you research / prep for may be needed at some point in your career.
Much like programming. You program for all relevant edge cases, as one might be used at some point.
People who don't prep are horribly lazy and are terrible at enumerating edge cases. They have buggy code that fails at some point. Laziness is by far the worse quality, though not the only sin.
Lack of sincere desire to be helping the team succeed and no innate talent are bad too.
>The questions you research / prep for may be needed at some point in your career.
And most people, most of the time, can and should research them as needed.
>You program for all relevant edge cases, as one might be used at some point.
Yes, because a program doesn't get to pause execution and defer to the human mind to ask "hmm, what do I do in this case?" A programmer does so all day every day.
>are terrible at enumerating edge cases
Any attempt to pre-compute the edge cases of all possible programming situations will be hopelessly inadequate. Enumerating edge cases requires analytical thinking in the moment, essentially the opposite of preparation.
This is wrong. If I want to handle a problem by researching it, I have a goal in mind ("solve problem X"), and I'll look for things that will help with it.
Interview preparation specifically avoids that approach. Instead, you're supposed to become familiar with every possible problem, in case it comes up during the interview. Most of them, obviously, won't, and from a "research" perspective that means that nearly 100% of the time you spent in preparation was wasted.
One more thing: research takes time. Companies want the ability to cut short the time required from inception to delivery of a product. They prefer a candidate who has already prepared - who already knows linear algebra, say, if you are going to work on deep learning at the company.
Like every other field, you need to practice and you need experience to do well. Maybe you can dive into a new framework without trying out any tutorials, but not everyone.
Sure. But at work, I know what I need to prepare for. Interviews don't exactly give you the questions ahead of time allowing you to prepare. So, preparation is not something they are testing for.
The idea of screening for a software engineering gene is patently absurd. In an ideal scenario, we screen for the ability to do the work the job requires. In the normal scenario, we screen for some mix of that and the ability to apply cookie-cutter techniques to (hopefully) novel problems in the space of ~45 minutes. Some companies also ask you to do a sample practical problem either on site or on your own time. None of this indicates anything about someone's "innate genetic talent", if there is any such thing.
And it’s illegal to try and screen on anything except ability to do the job. Be very careful if you find yourself thinking in terms of software engineering interviews being a general intelligence test — you are setting yourself up for a discrimination suit.
As you clearly demonstrate, intelligence has a genetic basis. However, it does not follow that there is variation among humans at the relevant genetic loci. The variation may have become fixed in an ancestral population of modern humans.
What I gave there is an absolutely standard assessment of the situation from the point of view of population genetics. Of course intelligence "is genetic" in some sense: that's why dogs and cucumbers are less intelligent than humans for most definitions of "intelligent". The question is about the phenotypic significance of current intraspecific genetic variation.
One of the things I miss about being an orchestral musician is that you always knew what you were getting into when you went into an audition. You've known the pool of excerpts for your instrument since you were in college; you've been working on them for years; you've probably known the excerpts for this particular audition for months if not a year.
Every one of them is the same: you go in behind a screen, and you have to play those few minutes of music better than anyone else that day. It's hard to do because they choose the hardest stuff, but you know what you're up against.
It's very imperfect. It doesn't reflect most of what you're going to be doing in the job. You might play one of those pieces in a given season. The bulk of your time is spent listening to the players around you and playing in sync with them, understanding what the conductor wants, etc. There's a lot more to being a good orchestral musician than being able to nail your part for Don Juan completely alone without context.
It's not great, but it is consistent. Every tech interview I've had in person has been wildly different. Even though I learned pretty early on to ask about the process, you still have no idea. It can be anything from an entire dev team grilling each other in a pissing match that had almost nothing to do with me, to a 1:1 with someone grilling me about the finer points of a PhD topic they had just spent years studying, to a casual conversation over coffee just to get to know you, to a live coding exercise where you have an hour to write a 3-D car driving game, to a conversation with part of the team where they tell you about their real-world problems and ask you for ideas about solutions, to getting paid to work on a live project. All for web developer or data engineering positions.
I remember being in one interviewer role a while back for a dev ops person for the team. I got thrown into the mix at the last minute because the other interviewers realized the day of the interview that they didn't know anything about dev ops, and I kind of carried that portion of things for us. They had literally nothing to offer to the conversation, so it was unprepared me interviewing a highly competent dev ops pro. After 20 minutes, the other interviewers left and said, "I really don't know why we're here. I don't understand any of the words either one of you is using, so just carry on without us."
I felt really bad for that guy. He was really really good, and the only thing we got out of it was that the team didn't know why we were hiring for that position.
The amount of variability is truly insane. I hope that whiteboarding culture doesn't win the standardization contest because even worse than in the case of a violin audition, there really should be no performing or bravado in an engineering team.
But even standardizing around that would be an improvement over the utter chaos right now. Because then it's consistent. Perhaps misplaced, but at least consistent.
I wish that instead of teaching classes about how to win the CS technical interview, schools would teach classes about how to interview for technical roles. How to design effective interviews that will help find the best person for that role. There's a ton of research around this. It's not just "whatever seems to work for us." And we need to be better.
> I wish that instead of teaching classes about how to win the CS technical interview, schools would teach classes about how to interview for technical roles.
Aside from security, I think this is the biggest challenge we face as an industry. Everyone interviews differently and almost all are done poorly. But the vast majority think they are good interviewers, and they're just not. It also won't improve on its own, because there's almost never any effort at honestly evaluating the hiring practices or decisions.
Good interviews involve problem solving from first principles. In CS it is mostly algorithms. Maybe some systems design and/or os and network fundamentals.
That being said, prepping is better than not prepping, just like with the SAT. The problem is that these things favor those who make the time.
One credit is about 60-80 hours of study at most, right? That seems like a reasonable amount of time to spend preparing for tech interviews if you want to go into my field. Even my friends who weren't smart and were too lazy to put in effort probably spent that much.
It’s mostly a copycat culture. But the best companies today are getting smarter and using more realistic interview techniques that accurately reflect the work.
I’m also working in this space with www.onsites.co, trying to solve how broken the interview experience is.
I think this is terrible. I feel there are many good engineers who can't possibly compete for decent jobs in tech companies because they haven't spent 6-12 months reading Cracking the Coding Interview, doing this Stanford course, practicing mock interviews on crowdsourced websites, getting their leetcode rank up, attending bootcamps where they pad their githubs, and practicing every single interview question they can scrape off Glassdoor. All things that are marginally useful in the job at hand. And this is getting worse all the time.
I work in the public sector. There is a very strong "this must be fair to all applicants" bent. The same criteria are used for all applications - in particular, the exact same questions.
As part of the application, I was instructed to provide 2-3 paragraphs on each of half a dozen questions that covered various aspects of software engineering. The in person interview asked a predefined set of questions about the material that I had provided. That part was more of a "demonstrate that you have the mastery of the material claimed in the written portion and that it wasn't produced by someone else or some other source". This also tested communication skills.
No weight was given to GitHub contributions, hacker rank, leetcode rank, or whatnot. There was no whiteboard.
While this isn't a fabulous job at a tech company, it is one that good engineers can and do end up in without needing the proper shibboleth to get into one of those tech companies.
For another job application with more of a sysadmin/programmer bent, a simple backup script was the assignment (it took about 1h to get all of the edge cases right). A portion of the interview was a demonstration and review of the code (in which I had to answer questions about the code that I had written).
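To give a sense of scale, a minimal sketch of that kind of assignment in Python might look something like the following. This is illustrative only, not the actual script from that application; the exact task (copy a source directory to a timestamped backup directory) and the edge cases shown are assumptions:

    # Illustrative sketch only -- not the actual assignment described above.
    # Assumed task: copy a source directory to a timestamped backup directory,
    # handling obvious edge cases (missing source, destination collision,
    # permission errors).
    import shutil
    import sys
    from datetime import datetime
    from pathlib import Path

    def backup(source: str, dest_root: str) -> Path:
        src = Path(source)
        if not src.is_dir():
            raise SystemExit(f"source directory does not exist: {src}")

        dest_parent = Path(dest_root)
        dest_parent.mkdir(parents=True, exist_ok=True)

        # A timestamped destination avoids clobbering an earlier backup.
        stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
        dest = dest_parent / f"{src.name}-{stamp}"
        if dest.exists():
            raise SystemExit(f"refusing to overwrite existing backup: {dest}")

        try:
            shutil.copytree(src, dest, symlinks=True)
        except PermissionError as exc:
            raise SystemExit(f"permission denied while copying: {exc}")
        return dest

    if __name__ == "__main__":
        if len(sys.argv) != 3:
            raise SystemExit("usage: backup.py SOURCE_DIR DEST_ROOT")
        print(backup(sys.argv[1], sys.argv[2]))

Even something this small gives plenty to discuss in a code review: symlink handling, what counts as a collision, whether a partially written backup should be cleaned up.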
While mock interviews can be helpful in the communication skills department and reducing anxiety, there are many other ways to test the person rather than use a proxy such as contributions to a public repository or foo rank websites... and also without resorting to whiteboard for various algorithmic tricks that you either know or don't know.
It's really not that hard. I suppose there might be a few companies with insane expectations, but most interviews don't require that level of preparation.
I don't know. If all the other candidates are that well prepared and you are not, you are at a disadvantage, even if it didn't use to require that. And at companies people want to work for, like Google/Amazon/FB/Netflix, etc., and startups, many people come prepared.
No matter what system you use to interview people, if the position is desirable enough, and the number of positions is smaller than the applicant pool, isn't gaming the system the expected outcome?
If people at Google were interviewed by juggling racquetballs, there would be books, courses, etc. on how to juggle better. People are willing to pay for this prep because they want to work at Google. Eventually most everyone interviewing at Google would be very well prepared to juggle.
Sure, but if that were the case, wouldn't you think it's silly that you need to train for months to be able to juggle to land a SWE position? Wouldn't it be stupid? The current interview culture in those companies is not as silly as juggling, but it is also not a fair representation of candidates, with too much weight placed on algorithmic puzzles and behavioral checkboxes you can train for.
Obviously, if the process were a 100% fair assessment of knowledge and potential and people did better than you, then good for them.
Yes, it is silly that you have to train for an interview for months to land a SWE position at a top company. However, until someone comes up with a less gameable system or a magic wand that stack ranks applicants, then I don't see a viable alternative :\
As someone who gives a lot of interviews, I'd love a better way to assess a candidate's ability that is less gameable, but we haven't found one yet.
Google is special in that they seem to have a generic SWE interview unrelated to the candidate's actual specialization. So everyone at Google must be algorithms/systems people, even if they focus on say, HCI or PL or whatever. For junior people, I guess that's fine, for senior people they are obviously going to bias the pool (which explains how Google in general can be so great at A and not so great at B).
Other companies are much more specific in their interviews...you actually get a specific JD with expected skills that you hope will form the basis of your interview, and not all JDs require you to be a distributed systems/algorithm wiz.
It is presumptively discriminatory unless you do a formal study documenting that high-IQ employees perform better than low-IQ employees. PG&E and the NFL use IQ tests in hiring.
I just pair program with people on the actual work we have that day. The only way to game that is to become great at what you'd do once you got the job.
As further described at the same link, research quite consistently shows that the effect of preparation on SAT scores is small, and...
> One of the most remarkable aspects of this line of research has been the lack of impact it has had on the public consciousness.
Again, this shouldn't surprise you, because it is a design goal of the SATs not to show an effect of preparation on scores, and they do their own research to ensure that preparation in fact does not have much of an effect.
All you mentioned still might not be sufficient if people don't like working with you or if you are so good that interviewers feel threatened by you regarding their future career prospects if they let you in. Happens all the time.
Could there be a clearer sign that the quality of higher education in our time is in steep decline?
If, one hundred years ago, Oxbridge had floated the idea of sanctioning a student to embark on a program of study designed to anticipate questions that might arise in a job interview, it would have been as preposterous as suggesting that the Queen should take her meals in a pub.
And yet today, we have one of the preeminent universities not just of the United States but of the world offering a course that appears to amount to Kaplan for a tech job.
Oh, come on. I think you are exaggerating quite a bit here. This is a single pass/no pass unit out of 180 units required for graduation, and, as a Stanford CS student, I can assure you the other classes are extremely rigorous. I don't see a problem with a one-unit class designed to help build confidence and demonstrate your skills to an interviewer. If you think that such a class shouldn't be necessary, then blame the current interview system, not all of higher education. Higher education has its problems, but this is hardly grounds to claim it's in "steep decline."
> Nor does any med school offer "How to ace the USMLE".
Some medical schools will use NBME (National Board of Medical Examiners) subject exams as final exams for various third year clinical rotations. Through a combination of experience on the rotation and studying material from textbooks and review books, the exams serve as good preparation for the USMLE step 2 exam since the test style is largely similar.
Yes, but good law schools aren't squandering the value of their most cherished currency on them - the reputation of their course credits in the world of academia.
40 years ago, my mom took a one credit course in undergrad for her cs degree that was on resume writing, interviewing, and negotiation. Is this so different?
I will never subject one of my potential hires to this nonsense. These concepts are valuable and academically interesting, for sure, but for the practical kind of engineering that's done at most companies, they're simply not needed. Either that or they're already implemented in a library.
I'm sympathetic to this viewpoint, but looking at the cost/benefit for a Stanford student, I think the benefit of studying for these interviews is clear. It's a bit of a simplification, but at Facebook, if you can do 3-6 problems over the course of two 60-minute internship interviews, you have the ability to get a $75k-$100k signing bonus if you convert to full time, in addition to more shares over 4 years. Granted, this depends on whether you're a high performer during the internship, but I definitely understand why someone would make or take this class.
I recently got a promotion to management. I have different responsibilities, but I get paid more and have more influence over solutions to interesting engineering and technical problems. This role feels even more like what my engineer friends in real engineering disciplines do than my role as Lead did: leverage their education and experience to solve complex, large-scale problems. These interviews are good at testing for the kind of software job whose analog in a real engineering discipline would be computing CRC integrals by hand all day.
What I'm saying is: unless these students just want to sit around repeating their DS&A course(s) over and over again the rest of their lives, this sort of interview, job, and prep course is counterproductive in the long term.
It's very tempting to say this. However, trying to find an alternative that is not equally or more gameable is very challenging.
Take home assignments are easy to cheat on and take a lot of time.
Asking questions about someone's experience is a great way to find someone who is a great conversationalist that can't code.
Asking someone to program at a computer in a limited amount of time falls victim to the same issues that the competitive programming, whiteboard interviews currently do.
Take home assignments are easy to cheat on? What does that mean (in the context of a take-home problem as an interview method)? Are candidates expected to derive solutions to coding or engineering problems themselves from first principles as their day to day? The answer is "no" almost without exception, even in serious research or hard core low level systems programming jobs.
> Are candidates expected to derive solutions to coding or engineering problems themselves from first principles as their day to day?
Maybe not, but the ones who lack the ability to do so will also tend to be the coworker who constantly asks you and others inane questions, dragging down the team's productivity as a whole.
"Are candidates expected to derive solutions to coding or engineering problems themselves from first principles as their day to day?"
Does performance on these tasks correlate with performance on the job? Is it easy to measure this performance in interviews reliably? Then it's going to be used by companies.
Any studies that support the answer "no" for both questions?
Technical interview performance has ~0 correlation with performance on the job? I find it hard to believe. Is the mean performance on the job of a software engineer that fails to pass Google interviews as good as one that does? Why do most companies want to hire Google software engineers if that is the case?
As for my second question, it should have been "is it easi-er to measure this performance reliably than other skills that may be relevant"? It sounds like this would be hard to answer. But this kind of interview seems to be best suited for repeatable evaluation (although, of course, you do have the problem that people will prepare and bias the results, but I don't see how you can avoid that if you want a repeatable way to measure performance).
Finally, "and companies use them in spite of that". Any guesses why?
> Any studies that support the answer "no" for both questions?
Google had a study that has been linked to death around here which showed that the scores hires received in their interviews didn't correlate with their job performance ratings. There are of course biases and flaws in that study, but it underscores a separate point: the burden should be on studies to show that they do correlate, not the other way around (proving a negative).
> Finally, "and companies use them in spite of that". Any guesses why?
Bandwagon effect, cargo cults, ego boosting/confirmation bias from applying a hazing ritual, laziness all come to mind.
It means that you have a friend or someone you pay solve the challenge and prep you very well on the solution. This isn't about creating the universe before you bake an apple pie; it's about preventing a mediocre software engineer who can understand a great solution, but has a hard time writing one, from being assessed the same way as a great software engineer.
Okay, allowing for that, nothing this class teaches does the job of allowing you to assess someone as a great software engineer either.
In fact, if you want to hire a great software engineer, you can't do that reliably with any of these interview techniques. If your interview can be studied for, you aren't going to find great software engineers. At best, you can find someone who will do the job.
Take home assignments can take little time or a lot of time. Interviews that consist of round after round after round after a bunch of phone calls take a lot of time too. I actually prefer a home assignment of similar length.
I don't believe cheating is such a major concern. The applicant should be able to talk about their solution and all the choices they made if they actually wrote it and understand the solution.
The on-the-spot technical interview is the biggest waste of time in the history of humanity. Everyone underperforms in technical interviews. I've conducted a number of them and I've never learned a single thing other than the fact that people make surprising mistakes when under pressure in contrived situations. Here's something that I think works much better: come up with a novel challenge that requires a couple hours of honest effort and have candidates present their solution to you when they come in.
How do you prevent cheating? Someone can very easily have a friend complete the challenge and walk them through it in a lot of detail. They could prep well enough that you hire someone who is massively under-qualified.
Then if your company is large enough, your novel challenge gets leaked and everyone knows it ahead of time. Solutions start being sold or distributed. Then your interview is meaningless.
I'm all for a viable alternative to the whiteboard interview, but I haven't encountered one yet.
If your challenge is novel, as I said, then it should be difficult for them to cheat. All the same, people are welcome to cheat. It should become clear that they've done so when you're interviewing them if you ask the right questions when they're explaining what they did. And if they somehow make it through anyway, you should be evaluating the quality of their work until they get established in your company. If they're underperforming over the first month, you should be showing them the door.
Any challenge that doesn't change across applicants, at a company with a large enough applicant pool, will eventually be solved and prepared for so well that almost any question you can ask will be prepped for.
Beyond that, if you do use a single question, you're now playing a game of whack-a-mole: I used to ask this question about the challenge, now I can't because it is leaked and people prepped for it. How do you scale that across an engineering organization? Do you hold a weekly meeting with every interviewer informing them of the questions about the challenge that are now blacklisted? It's going to be infeasible for any reasonably sized organization.
The challenge with this idea is also that the risk of false positives is much higher than with the whiteboard interview. The current interview process is designed around minimizing false positives.
False positives are really costly to an engineering organization. It's not good for morale when engineers are being hired and fired in the same month on a regular basis. It takes time to onboard, and it slows the team down. It's also not good for the company's reputation - would you apply to a company that had a reputation for firing a good chunk of its new software engineers within the first month?
And any challenge that is no longer novel no longer requires "top performing engineers" to work on.
A big part of the interview problem is that everyone wants to think that they are solving really important problems and are changing the world, etc.
Companies need to be honest about what they are doing and who they need to do it. Sometimes you have a hard problem and need a specialist or a brilliant problem solver to conquer it. Most companies need middle-of-the-road people who are reliable and consistent but not necessarily geniuses to maintain the things that have already been built. The idea that everyone needs to be a rockstar or a 10x-er needs to die. This overly inflated notion of what even some of the top tech companies are doing is the reason for all of this craziness.
Extremely few genuinely hard problems are solved by profitable companies. Some exceptions happen. But they happen(ed) in the R&D sections of those companies. Bell Labs, Google, Apple, Intel, Facebook-ish. But those are not the money-making day-to-day jobs that power the profit engines of those companies.
Those think tanks do excellent work, but they are not significantly different from research PhD programs. They just pay better. It's always been the .0001% of top people who solve problems theoretically, and then lesser people find ways to implement those solutions.
For any genuinely novel or hard problem there are ways to suss out how much a person has been coached, and how well the problem is truly understood. You don't have to change from week to week. It's not that hard.
In my own field, I can give a take-home challenge: here's a relational dataset. I want to work with this as a data warehouse. Give a rough outline of what changes need to take place, and how you would design the warehouse. Ask questions as needed as you go through the exercise, and be prepared to defend your design choices at the tech interview.
In less architecty roles, perhaps I'll ask a person to do some pretty basic data transformation as a kind of fizzbuzz. In the tech interview itself, I just ask a series of pretty open-ended questions, keep a list of points I think are relevant, and assign points based on how many checkboxes they hit. And yes, that list can and will be leaked at a sufficiently large and desirable company, but it doesn't matter, because the checkboxes are there to remind you from interview to interview what you were looking for. They are not a replacement for your assessment of the person. And you, personally, can change your mind as you go.
The point of an interview shouldn't be to find out if a candidate knows exactly what you know. If that's the design of your process and the only thing you care about, then you are going to have problems. The point of an interview is to let the candidate tell you what they know and for you to have some baseline for comparing that to what they need to know for the role.
In some cases, all the candidate needs is basic programming skills and the ability to memorize stuff. And in that case, you don't need to worry about if the candidate cheated, you need to worry about how honest your team is about what it's doing and what you need as a manager or coworker.
It's like security through obscurity. It never works. You need a better process.
I mean, as long as that person is getting their 'friend' to do the work while on the job, who cares?
Like, things get out on time and shipped, right? I may misunderstand the intent of the comment though.
But if they are sub-contracting out their work AND the work is done correctly, on time, etc. then it's all gravy. Yeah, you should probably hire that 'friend' instead, but if the interim works, it works (speaking very generally here)
For anyone with any kind of reasonable background suggesting they can do the job, the alternative is simple: just give them the job on a probationary basis. Give 5 people the job, tell them one will keep it after a 3 month contract period. Choose the ones you want to keep.
This is a terrible idea. Now if someone wants to interview at your company they have an 80% chance of being unemployed when it is done. Good luck getting good candidates to jump through that hoop.
Such short gigs that are easy to get into could actually be good for a lot of people.
Right now it's often "all or nothing", meaning companies are very careful about who they hire and some people just have a hard time getting to prove themselves in the act. Even if they don't get the job, a 3-month paid gig is better than unemployment.
But it really depends on what the position in question is.
How is this conceptually different from a course that focuses on standardized tests like the SAT? Having worked for only about two years since college graduation, I can already see that there are many more technical and non-technical skills that affect your performance on the job and that of your colleagues. Being nice and helpful is one. Constantly learning from mistakes and rapidly iterating while adding value to the team/company is another. More examples: picking up technical books, writing toy projects to learn new languages/frameworks/libraries, and being able to come up with good engineering decisions. The list goes on.
Not to be too salty about this, but I remember friends from college who were terrible teammates (no significant contributions to group projects, barely understanding the project material, and knowingly doing so because others would finish it for them, etc.) who went on to get job offers from the Big Four. These technical whiteboard interviews will not filter for the soft skills that are needed to succeed in the workplace.
To everyone bemoaning the introduction of a class such as this, it's a 1 credit class, which at Stanford roughly translates to about 3hrs/week of work given that you're expected to take 15 credits per quarter. It is comparable to a class on financial responsibility or golf. Its focus isn't so much on how to crack a technical interview than it is on how to prepare for a job interview in general.
Mock interviews, resume writing and critiques, life at startups, how to answer team/behavioral questions, and panels of employers and employees are all covered with sample whiteboard/coding problems only being given one week of attention. This is simply preparation for how to professionally navigate the job landscape.
I know this has been said a billion times, but the CS interview is bullshit. It's like asking a pianist to describe on a whiteboard how they would play a composition.
I always thought that this is a good question to ask
https://github.com/alex/what-happens-when
but then again probably not for all roles. It's good because even if the interviewee has seen the question, you can really gauge how much they know.
I kinda like the idea of having an intense data structures and algorithms class for hands-on problem solving.
If a candidate has shown an ability to solve several hundred medium+ DS & Algo questions on the spot, that gives him an edge in day-to-day programming. Not everybody can do that.
You can supplement that with other areas (concurrency; distributed systems; execution latency, memory management etc; quirks of and design ideas behind your favorite programming language).
I think this class can be a good prep for day to day programming.
I hate the interviews as much as anybody... but really, I wish that kind of instruction had been a bigger part of my college curriculum. Yeah, we had data structures classes and algorithms classes - but none of them really made me good at any of it. Not like how math classes make you good at math, by forcing you to solve problems over and over until it becomes nearly instinctual.
I've actually become a better programmer by practicing interview problems, because they ultimately are a bit like the arithmetic involved in solving more large scale problems.
Maybe it's just me, but this seems to either miss the point of a CS interview, or to make a point that CS interviews contain pointless parts that make them trivial. If an interview contains a known set of possible questions and you simply remember the answers, then in the end the whole exercise is somewhat pointless, isn't it?
Yes, you are right. This is why I take some standard problems but change things a little bit so they have to think. And I ask additional questions like how would you test this function, how would you improve its performance, etc.
Yes. People who claim that typical interview processes test "coding" or "CS fundamentals" or "algorithmic thinking" or whatever basically have to face the fact that they've really been testing for a set of skills that's different from what's needed by a working programmer. If they were testing for the actual skills of a programmer, there wouldn't be a cottage industry of books and classes and training materials to learn how to do "interview coding".
As crazy as these interviews are they have to be working right? Some people must be passing these things otherwise there would be a lot of unfilled positions.
So much for there being a staffing shortage in our field. These types of interviews imply a much higher bar, one that exists either to weed out all the chaff due to oversupply, or because companies won't settle for anything less than the most academically accomplished people.
> Some people must be passing these things otherwise there would be a lot of unfilled positions.
The point is - there are a lot of unfilled positions. Companies notoriously complain about not being able to find "good" software specialists, because... people with minimal clue about how to judge relevant skills introduce increasingly cut-throat filters and turn everything into process.
Not everyone is (or pays as well as) Google. Many companies don't have a bar as high, there is still a lot of work for B and C players to do (albeit, that is probably true even at Google).
My school does co-ops and has a 1-credit co-op prep course. We do have practice interviewing there. We also do resume workshops, general best practices for the industry you're in, etc.
You can do both the content and the prep, as I am sure is the case with Stanford CS.
While it's unusual to have a course specifically named for this, and I am sad to see it as well, I don't think this is as big a deal as people are making it out to be, nor is it a canary in the coal mine for higher education. To me, this only signifies the growing shift in education, as it becomes more available to all, toward preparing for your career as much as learning itself. The simple fact is that most people attending college today are not doing so just out of the desire to learn alone; they also want to, well, have a nice career in their respective industry. To be able to make a living.
While I would also agree with cautioning about education for career alone, I think it's important that modern education focuses on both. It's the reality of an increasingly college educated world.
Hey it could be worse, people could be getting a degree because they need it for signaling purposes and/or to satisfy an arbitrary immigration bureaucracy and don't actually need any of the things they're being taught. Oh, wait :)
Just wait until you start interviewing at even crazier companies than FB/Google that would like you to tell them 20 ML algorithms, 20 computer vision algorithms, compare expected results of non-linear optimizations applied to some weird Deep Learning problem and analytically justify it (why is it worse/better than SGD on this loss function?), then off to some deep domain knowledge, conjure graph algorithms on the spot on problems you've never seen before, all in 1.5h, expecting Stanford/MIT PhD-level answers.
Sharing a data point: I've interviewed with Google multiple times and at no point during the interview was I asked a question that required an ML/CV/DL algorithm.
I suspect those would be more appropriate for a 'data scientist' / 'researcher' position - for a generalist, the interviews were very standard and not crazy at all.
In hindsight, if I had taken the time to review basic CS algorithms ahead of the interview, I would have done much better.
That said, I'm not sure I'd be a better employee at Google if the only difference between my being hired or not hired is the fact that I reviewed some basic data structures and algorithms.
If a Stanford CS degree doesn't otherwise prepare students properly for a tech interview, I'm going to wager it is the interview process that has a problem, not the Stanford CS degree.
If there is a whole course on "interviewing", at top schools no less, then the interviews really have become divorced from any meaning. That should be seen as a major red flag that the industry is doing something wrong.
Maybe there should not be a cookie-cutter interview at all, but rather a harder look at which skills and personality traits are really needed for this or that position.
Don't programming bootcamp classes do this kind of thing too? I feel like this just increases the probability for a student to do well on the interview. If you read their intro slide, it clearly states: "Level the playing field between applicants who may learn these things through informal mentoring in their friend network, and those who would otherwise not know this stuff."
It's kind of sad that there's a trick to be learned for passing technical interviews that doesn't test the candidate's inherent knowledge, but at the moment, this seems like the most logical way for a Stanford CS student with no knowledge of technical interviews to get a good job.
Shouldn't we focus on creating great engineers instead of training them to make a good impression on an interview?
I think this is completely immoral. Becoming good at something is a matter of consistent practice that brings experience and expertise over time. There is no other way around it.
People might pass the interview, but they will be found out in a matter of weeks. This is PUA for interviews, taught by a university; it's really depressing.
The only thing that gives me hope is that great engineers will always solve problems no matter what. Tech hiring is going to become much harder.
This is a _single_ pass/no pass unit out of 180 required to graduate. Stanford absolutely focuses on creating great engineers (or computer scientists, for those more interested in the math or theory sides), but it still helps to have a small course pointing out tips and things to look for in the interviewing process.
We hear there's a shortage of good people in SV / the USA. What if you plainly ignored all companies that whiteboard? There should be plenty of well-paying jobs left.
Disclaimer: a major tech company whiteboarded me twice (Codility + live coding via Google Docs). They were really selective; none of my friends passed. In the end the job was lousy, so ultimately the test selected more for willingness to jump through irrational hoops than for productivity.
This kind of prep has been common place in other desirable fields for many years. Law, medicine, finance. They all have books, classes, etc. that you can take to prepare for interviewing, taking the bar, mcat, etc. Compared to college admissions, the CS interview process looks downright sane.
They all have prep for everything but interviewing. An interview to get a residency or a junior assistant position will be much more about your credentials and a history of your performance.
Finance at least has prep for interviewing for investment banking, trading, and consulting. When I was in undergrad it was common to have mock interviews and prep for commonly asked interview questions. Most people prepped on being able to walk you through basic financial statements, valuation, brain teasers, case interviews, and stuff like that.
I'll take your word for the other fields as I can't speak directly to interview vs exams there.
The last two places I interviewed at and was turned away from would have loved a "software engineer" who had taken this class. I am a typical developer: I can get shit done, but I have rough edges because I'm human. Interviewing in general is tough for me, and when a set way of showing your abilities comes up, I'm not great at presenting that.
My github is active, I work on things that a potential employer as well as fellow developers would be interested in and I also am trying to grow by showing bad, good, and some code that I'm very proud of on my github. But, the thing I noticed at the two aforementioned interviews was that they hadn't even taken a look at my github.
I'm glad they are teaching students about resume writing, and hopefully the professors are versed in hiring outside academia. I write hundreds of resumes a year for my clients, and a high percentage of young clients are using really poor resumes when they engage my services.
It seems many new grads are being drawn to heavily visual resumes with fancy templates, often featuring a bar graph for skills ranking (some rank themselves as "8 out of 10 in JavaScript", which comes across as laughable and out of touch).
My feelings on memorizing data structures aside, the resume lessons may at least help some grads get interviews that they wouldn't have otherwise, and that's a good thing.
I understand this encourages an imperfect system, but I would have really appreciated this as an undergraduate who didn't understand how interviews worked.
While the system is admittedly problematic, does it really come as a surprise that for extremely competitive jobs at the entry level, such artificial hurdles need to be used to filter the plethora of black-box applicants? I can't speak to more senior roles, but for entry-level, this only seems like a natural (and practical) response. It happens everywhere, not just CS:
For any competitive school admissions, SAT/LSAT/MCAT what-have-you are all poor indicators of student performance, and boil down to trainable tests.
In finance (both quant finance/and positions like sell side banking, private equity), people constantly practice math/financial modeling/accounting, for interviews. Many of the more selective quant shops (for undergrad hiring, I should mention) have much more difficult problems than MS/GOOG/AMZN technical interviews, and I would argue the type of competition level math/probability are just as unrelated to their jobs.
In management consulting, people spend hundreds of hours practicing for case study interviews. Just search consulting interview prep, and you'll find an egregious number of prep companies.
Some people brought up law and medicine. First, I don't think the grueling nature of the medicine certification is even close to what it takes to be an entry level developer. In law, while "technical" interviews are rare, often times employers filter you by much cruder means: pedigree (both the school you attend, and at top echelon law firms, clerkship status, and top-top grades within those schools). Is that any better?
The common denominator here is the high demand for all of these schools/jobs, and the problem always seems to come down to, how to optimally recruit -given- resource constraints. Maybe I'm cynical but almost any reasonable entry-level interview can be "gamed" or trained for, and most firms are either small and lack resources to extensively evaluate their applicants holistically, especially if they're only in undergrad, or they are large and get way too many applicants, so some proxy is necessary.
So, Stanford grads can't pass interviews without training for it. Looks like a degree is really worthless. All that matters is memorizing these problems.
As other commenters alluded, interviewing and coding are indeed two different skills. However, contrary to some of the comments, I don't think there is anything wrong with preparing for an interview or even taking an entire course for that matter. If it helps you shine and showcase your real skills to an employer then why not? I’ve seen many brilliant software engineers get rejected for positions cut out for them only because they were so nervous at interviews that they had trouble even understanding simple questions until right after the interview ended. I know that because I'm a co-founder at Pramp.com, a free p2p mock interviewing platform for software engineers. It's incredible to see the improvement in one's confidence even after a single practice interview.
I've been through the wringer quite a few times now (though less than the average Stanford student), and I think about these interviews more than I probably should, just for my sanity.
This is a decent primer (and people need primers!) and a good resource overall but I'm very skeptical that this is terribly helpful for most interviews. I liked some of the problems but most are about a notch lower in difficulty than required for the offers Stanford students get.
With that said, it's clear that the students that take this class are going to do well in their internship and job search. I'm honestly just becoming more convinced that ability to perform in these interviews is an inherent quality of top students (no matter the original field!) at this point.
This may not have been a serious question but I'll give a serious answer: depending on the interviewer, you could get bonus points for going "above and beyond" or, yes, you could be penalized for "not reading the spec."
I've found that the best way to handle it in a conventional whiteboard interview is to mention it when appropriate and offer to add the functionality later.
"Now, here's the part of the code where I'd deal with odd N. Since we're guaranteed even N in the problem statement I'll skip this for now, but we can come back to harden this function later if you like."
Maybe. All other things being equal, they might need a reason to cut you, and hire the other guy.
If you failed to complete the problem or took longer than expected to complete it because of extra complexity entailed by a more "correct", out-of-scope solution, it may especially cost you - even if in the end, your solution was "smarter".
Maybe the other guy noted that there are a couple of ways to solve the problem, and asked a clarifying question about it.
Not all jobs require this style of preparation. If you don't like it, shop around and find companies with different styles that match your preference. Interviews are just as important to the interviewee as they are to the interviewer.
Do people actually put "objective" on a resume? It seems laughably hackneyed. My experience is that something near 40% of people have it, and it usually is the most trite sentence possible.
What a joke, but I'm completely unsurprised. Prep for the SAT, prep for jobs, prep on the job - minus the whizkids, Stanford grads make great corporate drones.
This is a natural response to the incentive structure that has evolved. But fear not: the fact that this metric is being overemphasized to this large a degree surely must mean that the adversarial species (the industry) will respond by changing its strategy. The more universities focus on vanity metrics (helping students build their peacock tail), the more likely it is that the industry will notice the chink in that armour and shift to some wiser signaling measure.