We’ve been actively recruiting for some time, and I previously posted about the framework we’re using to assess people. At the bottom of that post I alluded to the “programming challenges” we developed to get an idea of people’s technical chops. They have turned into much more than that, and I wanted to write about them because I absolutely rely on them; they are the best recruiting tool I have ever used. I should give appropriate credit at this point: I say “we developed” them, but they are really the work of Andrew Datars, my VP of Architecture. I can only claim to have planted the seed; Andrew did all the really hard work of bringing them into being.
Aside from the back-patting, the other reason I wanted to write about them was because their positioning is as important as their content. For posterity, and because I think it’ll help with the explanation, here they are:
Empathica new developer challenge
Empathica .NET developer challenge
Empathica .NET quality engineer challenge
I should also state up-front that we use TDD in our development work and are an Agile XP shop. These things are important to the story because we are hiring into two roles, Developer and Quality Engineer. We also hold true to the belief that we recruit against innate skills rather than learnt ones, and therefore place a much higher value on a person’s capacity than we do on their precise experience – although experience and knowledge are obviously highly valuable once acquired.
Given our hiring philosophy, we realised we needed a way to objectively assess a person from a technical perspective and within a technical context. We also wanted to give some flavour of the work we do to people whose programming skills lie completely outside our domain, or who have little programming experience at all – and it turns out that is a pretty decent-sized pool of people. We are a .NET shop and so wrote one challenge for that, a second that repeats it in a technology-agnostic form, and a third for quality engineers based on a publicly available code base from Microsoft.
The process goes like this…
1) have the person in for a face-to-face interview
2) send them away for 10 days or so with one of the challenges
3) bring them back in and review what they have done
There’s a lot we learn through this process which has little to do with the technology:
- Do they accept the challenge and with what sort of attitude?
- How long do they take to come back with an answer?
- What does their code look like (style, separation of concerns, factoring etc.)?
- How do they respond to criticism of their code?
- How do they interact with us as developers?
- How well did they understand the requirements?
- How do they think through issues and debug the code?
On top of which there is the code itself and the finished application. We position the challenges not so much as a technical test but as a topic around which we can jointly work when they come in for a technical assessment. The objective is only partly about assessing their technical skills, and almost not at all about their knowledge and experience. Instead we want to simulate working with the person on a concrete problem in a technology we use and a context which is close to our reality.
We find this gives us an exceptionally good read on the person. We allow enough time in the “interview” for them to overcome their nerves, which is an important consideration, especially if the person has only limited exposure to the technology.
We further request that they bring along a code base to which they have contributed significantly, ideally a hobby project to avoid NDA issues, and we spend the second half of the interview talking through and understanding their code.
This last part is important. We found that we could be left wondering whether problems we saw in their approach or code in the challenge were to do with a fundamental lack of understanding of coding, or just unfamiliarity with the technology of the challenge we set. Having them talk to us on their turf was a good way of finding that out. It also gives other valuable insights such as how they are at expressing concepts to people with no domain knowledge, how motivated they are to code in their spare time, how curious they are about a problem, what sort of business sense they have, and the picture they have of where technical competence sits in the commercial world.
When it comes right down to it, we end up not caring too much about the technical aspects of the challenge; the human factors are much more relevant, and much harder to extract through a normal interview process. We try to position it as being less about the technology and more about the opportunity to work together on some code, but as a candidate it is probably hard to see it as anything other than a technical test – which of course it is.
We have hired 3 people so far through this method and have a further 10 or so in our pipeline. The results are stunning and we are in a hiring groove which is transforming our technical organisation.
If you are reading this and want to talk to me about a job please feel free to contact me by email at firstname.lastname@example.org and make sure you mention this article and the challenges.