If not algorithms, then what would you use to test a generic programmer? You would have to set up a custom panel for each candidate. A person who has only worked with PHP would need a different panel from someone who has worked in C++ all their life. Algorithms are language- and framework-agnostic. It doesn't matter if you work in embedded or data science. It doesn't matter if you code in C or Python. The same question can be used to judge everyone.
Companies that don't do the above usually advertise requirements for X technology with Y years of experience. A smart programmer without knowledge of X can learn it easily, but he won't even be called for an interview.
Programming in general is language- and framework-agnostic. Someone who can write an extensible and maintainable C++ application can most likely do the same in C#, Java, Rust, etc. There are decades of best practices and knowledge about what makes good software, and almost none of it is specific to a language.
Most devs use a handful of algorithms in their entire career. And any algorithm they do use was probably adapted from pseudocode on Wikipedia.
It really comes down to what you want the developer to do. Are the things you're testing them on even relevant to the job?
I'd argue what most companies are testing through their interview process is largely irrelevant.
If it's a generic programmer, then it's perhaps a position where there's a range of technologies, maybe a range of languages to pick up, and a range of existing software to support. In that case you want someone who's quick to learn and eager to train themselves, not afraid to ask questions, not afraid to admit they don't understand something, etc. Whether they can write quicksort on a whiteboard is probably the stupidest thing you could interview them on.
I think the best kind of interview isn't overly technical, but rather one that prods the interviewee to think about problems. Describe a fictional system: what would they suggest to make it scalable? What would they suggest to improve performance? What would they suggest to reduce the time between writing the code and shipping it? Do they understand what a CI/CD system is? Are their first thoughts "rewrite everything!"? Do they think about iterating on software development, or are they the kind to dump 200 files on someone for code review?
> Describe a fictional system: what would they suggest to make it scalable? What would they suggest to improve performance? What would they suggest to reduce the time between writing the code and shipping it?
These questions are very senior-dev focused; I'm not sure I'd ask a junior candidate much in this direction. But I always asked a question like this when I did my share of interviewing. The idea is basically to see whether the candidate has been around the block. Have they seen the sausage being made, or do they only have an academic understanding of the software engineering profession?
And being on the other side, some of the best questions I've gotten were along the lines of "describe the weirdest problem you've encountered". Talking about unexpected shit usually leads to a fun discussion, and you can showcase your ability to problem-solve.
Would you agree that interviewing in pseudocode is a good idea, then?
But what's the content they should be interviewed on? You say no to quicksort. How about date parsing? That shows whether you can handle messy cases.
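To make the "messy cases" concrete: a toy date parser quickly runs into real-world ambiguity, since the same user-entered string can match several conventions. Here's a minimal sketch in Python; the list of formats and their priority order are just illustrative assumptions, not a recommendation.

```python
from datetime import datetime, date

# Toy date parser: try candidate formats in a fixed priority order.
# The format list below is an arbitrary example -- real input data
# would dictate which conventions you support and which one wins.
FORMATS = ["%Y-%m-%d", "%d/%m/%Y", "%m/%d/%Y", "%B %d, %Y"]

def parse_date(text: str) -> date:
    text = text.strip()  # messy case 1: stray whitespace
    for fmt in FORMATS:
        try:
            return datetime.strptime(text, fmt).date()
        except ValueError:
            continue
    raise ValueError(f"unrecognised date: {text!r}")

print(parse_date("2024-03-05"))      # ISO format
print(parse_date(" 05/03/2024 "))    # messy case 2: ambiguous --
                                     # parsed day-first here only
                                     # because %d/%m/%Y is tried first
print(parse_date("March 5, 2024"))   # written-out month
```

The ambiguity in the second call is exactly the kind of judgment call such a question surfaces: "05/03/2024" is March 5th or May 3rd depending on locale, and the code can only pick one.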
What do you want them to write if not algorithms? Maybe we have a different expectation of what "algorithm" means? The term is pretty generic, but I feel captures more of the cross language stuff.
Compare a "How does javascript handle equality?" question which falls under "trivia", vs "Implement a hash table", which isn't too hard to do for a naive solution, and demonstrates that you know how the basic data structures you work with day to day work.
>But what's the content they should be interviewed on? You say no to quicksort. How about date parsing? That shows whether you can handle messy cases.
But why? The vast majority of programmers don't ever need to write code to parse a date. It's just another test completely irrelevant to the job.
>Compare a "How does javascript handle equality?" question which falls under "trivia", vs "Implement a hash table", which isn't too hard to do for a naive solution, and demonstrates that you know how the basic data structures you work with day to day work.
Both are pretty bad interview questions. Is this a JavaScript position? If they don't know how it handles equality, do you really think that's the end of the world for the interviewee? Does that make them a bad programmer? It's something you could explain to them during the interview in under 5 minutes (and you should; it eases the pressure on the interviewee). So really, what's the worth of asking it?
Likewise, "implement a hash table": pointless. Ask them "Do you know how a hash table works under the hood?" Chances are they might not know exactly how to implement one, but they do understand roughly how it works. If they don't know, describe a simple implementation and quiz them on why it's built this way: do they understand the "why" even if they don't understand the "how"? Most competent programmers can implement a hash table given an explanation of how it works, which is why I don't think it's a good question to ask someone during an interview. It's just a memory test, not a test of how good they are as a programmer, how well they'd fit into a team, how good they'd be at designing a large system, or how good they'd be at weighing up the pros and cons of different solutions.
This is why interviews should be discussions, not tests.
> vs "Implement a hash table", which isn't too hard to do for a naive solution, and demonstrates that you know how the basic data structures you work with day to day work.
I disagree, anyone can write out an algorithm from memory. That's not hard, and does not prove that you know what you're doing.
I've done my small share of interviewing, and I always did a little bit of whiteboard coding. But the objective was never to see whether the candidate remembered algorithms; it was to see whether they could execute code in their head.
Because I've encountered "programmers" who clearly couldn't execute code in their head, and whose idea of programming was copy-pasting something and heuristically modifying it until it did what it was supposed to do, instead of knowing what the code does. And you really want to make sure you screen for this group so that they fail the interview.
We have a medium-difficulty coding question which can translate into any language. It's pretty open-ended, in that there are a few different ways to tackle the problem. It's not algorithmic, but it definitely forces the candidate to flex their coding and problem solving skills. From watching how the candidate proceeds, I get a very good idea of their skill level and how they react when they hit a roadblock: are they able to debug? Do they get overly frustrated?
From this relatively simple question, I get a lot of insight into the candidate's skill level, i.e. how well they write code, and it has nothing to do with algorithms.
HBO's Silicon Valley is dramatized, but I can't be the only one who's coded from the convention hall floor/hotel, or had to set up servers "live". Every time a site gets unexpectedly popular (/. effect; HN hug of death), dies, and comes back, there's someone at the other end who's probably close to tears trying to get their site back up ASAP.
That doesn't change the fact that it's the exception, and that programming almost never looks like that.
The real problem is that take-home problems are trivially gameable: it's obvious to an unscrupulous candidate who isn't a well-seasoned coder that it would be well worth paying someone else to do their take-home-interview homework. (Sure, that would come out eventually, but that's only one of the problems I have with take-home assignments.)
Some places have "open book" interviews, where it's like "here's a laptop, do this thing, and google/Stack Overflow/devdocs your way to victory", which is at least better than the older "how much trivia do you remember about this API?"-style questions. But I have no idea how to go further and make that unbound by time when an employee is on the other side of the table listening to the interviewee's thought process, if only because it makes it very hard for the interviewer to plan their own time. (Especially considering maker vs. manager time.)
Have you ever actually tried a take home exercise? Or is this just based on your perceived concerns?
In the years I have been doing take-home exercises with candidates, I have yet to see a single instance of a great take-home leading to a terrible on-site or hire in terms of technical skill. There have definitely been great take-homes from candidates we later chose not to hire, but usually due to challenges interacting with the product team, or in some cases too much fixation on technical perfection with no focus on business realities, etc. If there are a large number of candidates paying others to do their take-homes for them, they should be asking for refunds.
Second, even if this is a problem, there's a very simple solution. For candidates who do well on the take-home, either on a call or at the on-site, do a live code review with the candidate and ask them to explain a bit more about why they wrote the code the way they did. Ask them about other approaches, or what they'd do if they had more time. Candidates will be significantly more comfortable talking about code they wrote themselves. It'll be quite clear if the candidate didn't write it.
I had a company give me a take-home exam with a link to the git repo and basic instructions on implementing a simple algorithm, as well as tests and FFI bindings. That took a couple of hours; then I had two 1-hour remote interviews discussing the code. I thought the process was pretty good: even though I was annoyed at having to do the take-home bit, it served to demonstrate that I could navigate the project source code, build files, learn the FFI interface, etc. The algorithm itself was fairly simple to implement with a little thought, although they probably didn't care if I looked up how to solve the problem on Wikipedia. Personally, I think that being able to figure out where the tests were, how to add FFI, etc., was one of their biggest requirements.
Anyway, I thought it was a robust but not too time-consuming interview process. I think such an interview could use a reasonably sized open source project for this, instead of their internal code base.
Your argument is flawed in that it assumes knowing algorithms has any correlation with doing well on the job. It doesn't really matter that it's language agnostic if it doesn't test anything useful.
If I were interviewing a Python programmer, I would also know Python. I would ask them about a project they've done in detail, see if they are BSing, and gauge their ability and experience from that.