Let’s assume we want to hire engineers who are “smart guys who get things done”.

Translating this expression to corporate-speak frequently seen in job descriptions, “smart” means “quick learners”, while “get things done” means “excellent problem solvers”.

Here is my preferred interview routine.

Before the Interview

  • the hiring manager takes the job description and translates it into a list of areas of expertise to be evaluated during the interview
  • the hiring manager talks to the interview team. Each interviewer picks an area of specialty in which to question the interviewee and prepares problems to ask.
    • remind the team which questions they are not allowed to ask (in the US this includes age, religion, family status and many others)
  • The hiring manager arranges interview tools – usually a whiteboard and a computer – and sends calendar invites. Allocating 1 hour (including a break) per interviewer is usual – sometimes a longer slot may be necessary. A panel of 4 interviewers is typical – one of those being the hiring manager.

During the Interview

We would like to identify and hire engineers with top technical abilities. One can think of an engineer’s technical ability as

ability(t) = learningSpeed * t + experience

where ability(t) is the person’s technical capability over time, experience is the experience the person already has at the time of the interview, and learningSpeed is how fast the person can learn.

The best hires have both learningSpeed and experience high – these individuals are likely star performers, highly prized and hard to find.

Next best are those with high learningSpeed and low experience – “high promise”, also known as “potential star performers” – and those with low learningSpeed and high experience, who fall under the category of “experienced professionals”.
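The linear model above is easy to play with in code. Here is a minimal sketch – the candidate profiles and numbers below are made-up illustrations on an arbitrary 0–10 scale, not calibrated values:

```python
def ability(learning_speed: float, experience: float, t: float) -> float:
    """Projected technical ability t years after the hire date."""
    return learning_speed * t + experience

# Hypothetical candidate profiles (illustrative numbers only):
star         = ability(learning_speed=2.0, experience=8.0, t=2)  # high/high
high_promise = ability(learning_speed=2.0, experience=3.0, t=2)  # high/low
experienced  = ability(learning_speed=0.5, experience=8.0, t=2)  # low/high

# Two years in, the "high promise" hire has already closed much of the
# gap with the "experienced professional" – and keeps closing it.
```

The model rewards learningSpeed over a long enough time horizon, which is why quick learners with little experience still rank as the second-best hires.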

With this in mind, during the interview:

  • Each interviewer asks problems, in the Google and Microsoft style, as opposed to chatting about previous experience.
    • It is acceptable to chat about previous experience to map out the breadth of the interviewee’s background, yet problem solving is a far higher priority and should consume most of the time.
    • Example: ask a software development candidate to code solutions to problems.
    • Example: ask a digital design candidate to design a circuit, simulate it and synthesize it.
  • Each interviewer should always ask the same problems of every interviewee – this makes it easy to compare interviewees “apples to apples” and to collect statistics for grading, explained below.

Now, let’s look at how we can assess learning speed vs experience.

Assessing learningSpeed – Quick Learner Ability

  • At least one interviewer – usually the hiring manager – should ask a common-sense problem-solving question, Microsoft style. The common-sense problem should not require previous knowledge of, or experience in, any particular technical area. Success likely indicates the candidate is a “quick learner”, ideally regardless of the candidate’s experience and area of expertise.
  • Ask a problem specifically outside the candidate’s expertise – something the candidate does not know. Keep the problem simple. Let the candidate take time, Google tutorials and find a solution without any help.
    • For example, while interviewing a web software developer, ask him/her to build and run an Android app that simply displays Hello World – nothing more complicated than that.
    • Make sure the candidate indeed has no clue about the subject. Make it clear to the candidate that the reason you are asking a question he/she does not know is to test the candidate’s learning abilities, as required by the job.

Assessing experience – Expertise

The job description sets out expectations – what the successful candidate must be able to do, what kinds of problems to solve and at what difficulty level – junior, intermediate or senior. Pick your interview problems to be representative of the job requirements and the associated difficulty.

  • Example: a web software engineer should be given problems on front-end web development (write an Angular client doing this and that), back-end web development (write a Node.js backend with endpoints), databases (connect Node.js to MongoDB, define data schemas if applicable), collaboration tools – version control (create a repository, create a branch, make a pull request), issue tracking – JavaScript, etc.
  • Example: a printed circuit board hardware engineer should be asked problems on schematic design (design a certain circuit and draw the schematic), electronic circuits, board layout (lay out a sample circuit), understanding of noise in circuits, etc.
  • Example: a machine learning/deep learning engineer should be given problems on probability and statistics, classical machine learning, deep learning (network types, key methods, popular models, solving training issues), college algebra fundamentals, popular tools including TensorFlow, Python coding, familiarity with the latest publications and latest popular techniques (e.g. reinforcement learning), etc.
  • Example: a digital design engineer should be given problems on RTL coding (code Verilog implementing a certain function), digital design fundamentals (state machines, etc.), simulation (simulate RTL and gate netlists), synthesis and timing closure (identify paths not meeting timing), optimization for area and power, design for FPGA vs. ASIC (re-design RTL code to make it implementable, how to cross clock domains), Linux/Python/C coding, hardware micro-architecture (architect a larger design), SystemC modeling, design for test, ASIC routability, etc.

Behavioral Assessment

Here we have to check a host of the candidate’s behavioral aspects – good work attitude, ethics, integrity. Is he/she a good team player? Can the candidate handle feedback? Can the candidate handle stress? Is the candidate open-minded, or does he/she have the not-invented-here syndrome (likes only his/her own ideas, rejects others’ ideas)? Is there an ego problem? Are there potential sexual harassment tendencies? Make sure the candidate is civil, not rude, and communicates well – can explain him/herself clearly, convey points and so on.

Lastly, check if the candidate fits your particular company’s culture – the way the company does things. For example, if working long hours is part of your company’s culture, check if the candidate is willing to work long hours if hired.

The best way to check all these aspects is to observe how the candidate solves interview problems. Explain the problem to the candidate – is the candidate listening carefully? Can the candidate explain his/her proposed solution to the problem? How does the candidate react to your critique of his/her solution? How does the candidate treat each interviewer? Does the candidate chat and get along with the team during lunch?

When interviewing senior candidates, make sure to have a junior interviewer on the team – afterwards, ask whether the senior candidate treated the junior interviewer with due civility, as opposed to ignoring him/her.

After the Interview

  • each interviewer scores the candidate’s answers, calculates an aggregated score – see below – writes up feedback and chooses a recommendation: hire, hold or pass. This is done before discussing feedback with other team members, to avoid potential bias.
  • Next, team members share their hire/hold/pass recommendations with each other. If all recommendations are unanimously “hire”, usually make an offer. If the recommendations are less than stellar – pass on the candidate.
  • If the recommendations are largely to hire, but one or two interviewers recommend holding or passing – the hiring manager calls a meeting to “calibrate” the team and to troubleshoot poor interviewing.
    • Calibration means making sure all interviewers use similar expectations about the level of expertise required for the job. For example, if the opening is for a senior analog design engineer, excellent knowledge of analog circuit design is a must. An interviewer recommending “hire” for a weak candidate has to be instructed to recommend “pass” for candidates like that.
    • Poor interviewing means the interviewer did not ask the right questions or did not ask hard-enough problems – resulting in the wrong impression that the candidate is great, while other interviewers correctly saw the shortcomings.
  • The hiring manager double-checks with each interviewer on the interviewee’s attitude – will everyone on the team like working with the candidate? If any team member reports a questionable attitude, usually reject the candidate.
  • Before making an offer, the hiring manager also asks the team to compare the candidate in question to previous candidates. Don’t make offers to barely-passing candidates – only to top-ranking ones.

Candidate Scoring

  • The interviewer grades problem answers on a scale of 0 to 10 and keeps records of grades for the candidates interviewed so far. Having interviewed a handful of candidates, these records become statistics – the interviewer can plot a distribution of grades for each question and see which candidates answered that question poorly, which answered at an average level – and, most importantly, whose answer was well above average, perhaps an outlier.
  • To decide on a hiring recommendation, grades for individual problems should be combined into a single score. This can be done by assigning each problem a weight and summing up – a weighted sum of the candidate’s grades.
  • The choice of weights is decided between the interviewer and the hiring manager to tailor scoring to the job description – assign larger weights to core, must-have skills, and less weight to auxiliary, nice-to-have skills. Keep the sum of the weights at 1, so that the resulting aggregated score ranges from 0 to 10.
  • Lastly, take the aggregated scores of previously-interviewed candidates and of the candidate at hand, plot the scores as a distribution, and see where the candidate at hand falls in that distribution. If the candidate’s aggregated score is exceptionally high and shows as an outlier – recommend hiring.
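The whole scoring pipeline fits in a few lines of Python. The weights, problem names and grades below are hypothetical placeholders, and the “two standard deviations above the mean” outlier rule is one simple choice among many:

```python
from statistics import mean, stdev

def aggregate(grades: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted sum of per-problem grades; weights must sum to 1,
    so the result stays on the same 0-10 scale as the grades."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[p] * grades[p] for p in weights)

# Hypothetical weights tailored to the job description:
weights = {"frontend": 0.4, "backend": 0.4, "databases": 0.2}

# Grades (0-10) recorded for previously-interviewed candidates:
history = [
    {"frontend": 5, "backend": 6, "databases": 4},
    {"frontend": 6, "backend": 5, "databases": 7},
    {"frontend": 4, "backend": 4, "databases": 5},
]
# Grades for the candidate at hand:
current = {"frontend": 9, "backend": 9, "databases": 8}

past_scores = [aggregate(g, weights) for g in history]
score = aggregate(current, weights)

# Crude outlier rule: more than two standard deviations above the mean
# of previously-interviewed candidates.
is_outlier = score > mean(past_scores) + 2 * stdev(past_scores)
```

With real data you would accumulate many more than three past candidates before the mean and standard deviation mean much, and you might prefer plotting the full distribution over any single-threshold rule.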