I recently read "I will not do your tech interview." by Ike Ellis, and it got me thinking about the interviews I have been in, both as an interviewer and an interviewee.

The article resonated with me initially as I thought about an in-person panel interview I once sat through. The interview consisted mostly of brain teasers and had very little technical substance. For example, I was asked, "If you had to determine how many cars were in the world, how would you do it?" To this day I think this is one of the dumbest questions I have ever been asked in an interview. My initial response was that I would google it. This only upset the interviewer, because in their opinion the whole purpose of the question was to see how I worked through problems. My appeal to them was that it was a question I would never encounter while working for them, and it was far outside my wheelhouse, as I knew very little about information measurement and market research. The interview reached its pinnacle when I was asked the infamous "Eight Balls" question, most notable for its use by Goldman Sachs when conducting interviews. Needless to say, I left that interview and called the recruiter to tell them I was no longer interested in the position. I have no idea if they would have offered it to me, but the bottom line was that they had devalued their company during the interview and I no longer had any desire or interest to work with them.

So Ellis's article got me thinking about the other side of the table and what I have done when I've been responsible for interviewing potential candidates. When I first started conducting interviews for development positions, I compiled a list of questions that I felt assessed the core technologies behind our product. They weren't product-specific, and quite frankly they were pretty simple. One question involved two pieces of paper with tabular data on them. The first part asked the candidate to create a relationship between table A and table B, with me ultimately looking for them to create a mapping table. The second part asked them to write a SQL query that joined the data together into a single result set. Answers were coded on a dry-erase board, the least practical environment for any developer to write code.

I was once asked by a supervisor why I valued questions like the above, and at first I replied, "Do you really want to work with someone who can't write a simple SQL query?" In hindsight, turning the question back on him fundamentally missed the goal I was trying to accomplish. I was charged with finding the best talent available for our company, and rather than defend the process on its merits I surrendered to the question. The root of my problem lay in evaluating, or even understanding, what "best" meant for our company. I simply didn't know what we valued in developers, and that lack of understanding created a flawed interview process.

I don't believe that "best" is the same at Company A as it is at Company B, or that when Company C is hiring for position R it's the same as when they are hiring for position S. That needs to be said, because it's too easy to simplify the solution to "best" as hiring the most knowledgeable senior developer in the given talent pool.

Over time my questions proved to me that there aren't nearly as many senior developers around as management wanted to hire. So how do you fill gaps in technical leadership on a team? When you can't hire them, you make them (this assumes company buy-in on that philosophy). This in turn shifted my thought process from a quiz-style assessment to a project-style assessment. Perhaps I have my work with Project Foundry to thank for this revelation, but the bottom line is that I reached a point where I felt I would gain better assessment data by looking at practical, project-based work instead of the results of a quiz.

My project-based interview process involved a specification, deliberately written as poorly as many of the ones I encountered at the time, and some starting code to work from. My unrealized dream for this process was to couple it with a BitBucket repository so I could witness the progression of work through commit history. Eventually I also wanted candidates to deploy a working product to OpenShift, because that's ultimately what developers do: they deploy the code they write. Here's the bottom line: with this process I hired three individuals. They all had different solutions to the problem, and they were able to work on it without me hovering over them throughout the project. One was a junior developer who far exceeded the expectations set for him in the project, and that turned out to be a trend in his career thereafter. Another quickly moved up to a managerial role mentoring the younger developers on the team. His in-person interview was memorable because of the awesome answers he gave when I asked him about his design decisions on the project. He knew his stuff and the project had established that, so his interview turned into a friendly chat about coding, like you might find on a Friday at 4 when the cases of beer were being opened at the dev shop. The third remained a core code producer on the team up until the time I left that position. I considered all three of them victories for the new process.

I do believe that skill assessment for developers is necessary. What I disagree with is the way we conduct those assessments. When I took the position I described earlier, I had to pass a timed online quiz to be offered the job. I didn't and still don't see any value in that form of assessment. Those quizzes eliminate feedback loops and dumb down computer science to multiple choice, which simply doesn't happen on real projects. My own questions involving code on a dry-erase board also jeopardize a hiring manager's ability to find good talent. Most developers simply don't code in front of other people, and even fewer are confined by time the way these interviews are. Our developers work on projects, from a spec or a ticket, and that's where they are forced to be problem solvers and artisans. If we want to see how they will succeed in our companies, we should give them realistic scenarios in which to demonstrate success.

Coming full circle, we're back to the question of brain teasers. What value do they hold? It sounds like even Google, once notorious for these, has given up on using them. I have to confess I don't usually excel at these challenges. Sometimes I downright freeze as I struggle to comprehend what it is my interviewer wants to see me do. It's like I'm trying to defuse a bomb that needs four gallons of water and all I have is a three-gallon and a five-gallon gas can to do it with (Die Hard 3). I wonder if it might be helpful to look at other industries and take a page from their book. For example, how do plumbers interview? It starts with verifying that they have a license, which is itself bound to concrete hours of specific work. Furthermore, that license carries its own set of project-oriented assessments that are renewed and re-evaluated. What about nurses? They too are evaluated based on on-the-ground experience doing actual nursing work. There are verifications and accreditations based on working experience that come into play. How do you decide to hire a contractor to work on your house? You look at his past work, the projects he's completed, and you use the referrals from that work to decide whether or not he can be successful on your project. You don't ask a plumber for an algorithm he'll never implement; you ask him to fix a toilet, which quite frankly makes a whole lot more sense.

Things are getting easier for those interviewing developers. Thanks to tools like GitHub and BitBucket you can actually see what developers are working on. You can also see the progression of their thought process in commit history, a highly valuable assessment tool in my opinion. After all, how much of what we do boils down to refactoring existing work? What better way to see that skill than to run git log? Combine those tools with Cloud9, OpenShift, and even AWS, and you have easy ways to set up environments and projects that produce hearty evaluation data for potential candidates: data that corresponds to actual projects, and projects that mirror what candidates will actually be doing in their role. Lastly, a project-oriented interview process is more appealing to me as a developer, too. Technology jobs are plentiful, but technology workers are scarce. That means that as much as I need to sell myself as a product to you, you need to sell your company as a product to me. Your first exposure is through your interview process, so why not make it something that excites me as a developer to come and be a part of your team?