Rethinking how we interview in Microsoft’s Developer Division

By John Montgomery

A couple of years ago, I had a series of small epiphanies. I’d just talked with my team about how we were going to change the program manager role — the behaviors we were going to work toward. Things like less focus on the backlog and more on the business; less emphasis on “knowing” and more on “learning and questioning”; and more focus on engaging with customers 1:1 at scale and less on aggregate data. We wanted to bring people into the team who would help us change to this culture, but we were still asking the same interview questions and using the same interview style. So we rethought how we did interviews and came up with something that works for us.

Now that we’ve been doing it for a while, we thought it’d be worth sharing some of what we did and what we learned.

The first epiphany came after the rollout of the change to the PM role, when we noticed we were still asking the same interview questions we’d asked for the last decade or more. The questions didn’t make sense when we were trying to find people who’d bring different skills and viewpoints into the team. (Note: when I started at Microsoft, we were still asking questions about why manhole covers are round, how many ping pong balls would fill a 747, and how to reverse a linked list. In 20 years here, I’ve yet to write code to reverse a linked list — copy-paste, anyone? — or to fill a 747 with any kind of ball.)

More than that, sometimes two interviewers would ask the same basic question unintentionally. Even when we would coordinate interview topic areas, we’d lean on the same core catalog of questions across interviewers. Some questions — the behavior-based questions — weren’t terrible, but we weren’t being particularly effective in how we employed them.

The second epiphany came during a meeting. Ideas were flying fast, people were stepping on each other’s sentences trying to get their idea into the conversation, and we were quickly building up to some momentous decision. At least, I’m sure it was going to be momentous — I was distracted by a customer issue and was IMing with the account manager and engineering team to get a customer unstuck. That’s another story. Anyway, the meeting had built to a head, and one of the participants — a PM on my team who is both exceptionally smart and quite quiet — said, in effect, “I just searched the internet for information about our topic and, yeah, this idea isn’t going to work.” She was nicer about it than that. But the aha moment for me was that not everyone does well in those fast-paced brainstorming sessions. A lot of people (including me) prefer to sit with a cup of coffee and some data and think things through. More than that, I can recall almost no time in my career when we made a major decision without stepping away for a little while and looking at the idea with fresh eyes, fresh data, and fresh customer research.

But that’s what most interviews were: fast-paced, how-quickly-can-you-come-up-with-a-solution-to-a-problem-you’ve-never-seen situations.

The third epiphany came when I was talking with a couple of our engineering teams about how they were bringing people into their teams. Developer Division does a lot of open source work (.NET Core, VS Code, and TypeScript among the many projects). Our dev teams had taken to working with candidates to fix a bug or implement a feature as part of the interview process. It was a collaborative effort, with the candidate and the team working together to solve a real problem.

Since “writing is thinking,” I wrote an email to myself about how interviews could work differently for my team. I then shared the ideas with a lot of people around the team and we started to iterate. People like Karen Ng, Amanda Silver, Cindy Alvarez, Nathan Halstead, Anthony Cangialosi, Jeff McAffer, Jessica Rich, Travis Lowdermilk and many others participated, iterated, and tested.

When we were ready to roll it out, we started small and continued to learn, to iterate, and then to expand. Now this framework (which we call “the alternative interview framework” because none of us is particularly gifted at naming things) is our standard practice — one we keep refining and learning from — and it works pretty well for us.

Here are some of the things we did differently.

To start with, we let the candidate know a few days in advance what the interview day will look like and what problem we’ll be working on. We give them time to do their own research and to think about it. It’s not like going into work every day is a surprise, so why should an interview be?

Second, we run through a real problem the team is trying to solve — improving satisfaction, increasing retention, boosting usage of a service or feature. The fact that it’s a real problem that we’re working on helps foster a collaborative conversation.

Third, we give the candidate access to the same information we’re working from, and during the interview they are free to search the internet or ask for more data. We often supply the candidate with our customer research, usage data, designs and mock-ups — most everything we have.

Fourth, we make the interviews interactive. We’re not peppering the candidate with questions. We have a problem to solve together, so let’s work the way we’ll work when you’re here and we’re working on this particular problem.

Fifth, we follow a single scenario/problem throughout the day and take the candidate on a journey similar to the one PMs go through: starting with the customer or business problem, understanding the customer’s job-to-be-done, designing the solution, bringing the solution into customers’ hands, and ultimately getting them to use and love it. Each interview focuses on a different phase of that process.

Sixth, we pair interviewers up. Rather than holding one-on-one interviews, we bring two people from the team into each interview. Our original motive was to train more interviewers, but having two people in the room had other benefits. Not only was the conversation more dynamic with multiple collaborators, it also gave us multiple perspectives on the same conversation. Not everyone hears a conversation the same way, so pairing also gave us a check on unconscious bias.

Seventh, we hold feedback between interviewers until the end of the day. We wanted each interviewer to judge the candidate on the merits of their conversation alone — not on the opinions of interviewers who came before them. We tell interviewers not to signal to others whether they are leaning toward recommending a hire. They hand off the candidate to the next pair of interviewers with a summary of what was learned in the previous session. At the end of the day, everyone simultaneously makes their recommendation and explains a bit about what they saw or heard that led to that conclusion.

Eighth, at the end of each interview loop, we not only discuss what we learned in our time with the candidate, but what worked or didn’t work in the process. We feed that back into the process so it gets better.

I’m sure there were some other aspects of the process that I’m forgetting, but those eight were the big ones.

We were worried that candidates would be nervous: two interviewers, a real problem that moved along through the day with real data. We shouldn’t have been. Pretty much every candidate gave unsolicited feedback that this system was unique and really helped them understand our business and the team. Even candidates who didn’t receive an offer liked the interview process and understood why we didn’t make one.

We learned that we had some gaps. For example, our PM team is remarkably technical — many of the PMs check code into production products. This makes sense for us: our customers are developers, so it’s helpful to have the kind of customer understanding that comes from creating software. But we didn’t have a good place in this process to go deep on a candidate’s technical skills, so we added an interview segment for a more technical interaction.

We learned that this process is hard on standard interview logistics. As one example, since the candidate is working on the same problem and writing things on white boards that they’ll need later, we need to hold a dedicated space (a conference room or focus room) for the candidate, and the interviewers go to the candidate.

We learned that interviewing is “expensive.” By having two people in each interview, we were doubling the time and people commitment and dramatically increasing the scheduling complexity. However, after some initial complaints about the expense, everyone on the team began to see the benefits, like more people on the team meeting our potential new hires. So, we were willing to pay the price.

Ultimately, the goal of a hiring process is to bring great people into the team or company — to make sure they’re a fit and will succeed, and to create a great experience for them so they want to join. One candidate who had competing offers from us and a couple of other large Seattle tech companies chose our team specifically because she liked this process so much. She happened to be one of our early candidates during the experimentation phase, and she’s still here and doing really well, as are many of the candidates we hired this way. So I’d say we’re doing pretty well. We’re still learning, but the outcome so far has surpassed our expectations.
