We used to have an interview process that included a take-home project.
The project was:
- you have two files: listings.json and products.json
- listings.json lists ~20,000 Amazon listings for electronics with fields like title, brand and price
- products.json lists ~1,000 digital cameras, with fields like brand, family, model
- your job is to write a script that, for each entry in listings.json, emits the best match (or no match!) from products.json
Your solution could be as naive or as fancy as you wanted. The main point was to have something for the next stage.
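For a sense of what a deliberately naive submission might look like, here is a minimal sketch. The field names and matching heuristic are illustrative assumptions, not the actual data schema or any candidate's real solution:

```python
# Naive matcher sketch: a product matches a listing when both its brand
# and its model appear as substrings of the listing title.
# Field names ("title", "brand", "model") are assumed for illustration.
def best_match(listing, products):
    """Return the best product for a listing, or None for no match."""
    title = listing.get("title", "").lower()
    candidates = [
        p for p in products
        if p["brand"].lower() in title and p["model"].lower() in title
    ]
    # Prefer the longest model string so "D300S" beats "D300"
    # when both happen to appear in the title.
    return max(candidates, key=lambda p: len(p["model"]), default=None)

# Tiny illustrative dataset standing in for products.json / listings.json.
products = [
    {"brand": "Nikon", "family": "Coolpix", "model": "S220"},
    {"brand": "Canon", "family": "PowerShot", "model": "SX130"},
]
listings = [
    {"title": "Nikon Coolpix S220 10MP Digital Camera", "price": "89.99"},
    {"title": "Generic camera case for point-and-shoot models", "price": "9.99"},
]
for listing in listings:
    m = best_match(listing, products)
    print(listing["title"], "->", m["model"] if m else "no match")
```

A matcher this simple produces exactly the false positives and false negatives the next stage needed (e.g. accessory listings that mention a camera model), which was the point.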
We'd run your submission, and use the output to show you some false positives and false negatives. Then we'd ask you to debug why each false positive or false negative happened, explain it to us, propose how you'd fix it, and identify any trade-offs in your proposed fix.
Eventually, we wanted to offer a non-take-home-project interview. We already had a bunch of existing solutions from employees, so we used those to run a stripped down version of the interview that just focused on the code reading/debugging/proposing fixes part.
I think both of these interview approaches were pretty effective at giving candidates a natural environment to demonstrate skills that they'd actually use on the job -- debugging, collaboration, predicting downstream implications of changes, etc.
That's a great example that fits almost 1:1 with something we've been doing. I'll propose this as a way to avoid having to give take-home projects. Thanks for sharing!