Building an Engineering Team by Ammon Bartram and Harj Taggar
As the slides are loading: there is no topic that should occupy your minds more as you build your company than bringing on the team that's going to make your company successful as you move forward. Harj and Ammon from Triple Byte, YC alumni, are going to talk about building, in today's day and age, perhaps the key portion of that team: the engineering team. So please welcome Harj.
Alright, thanks for having us, everyone. So I'm Harj, I'm one of the co-founders of Triple Byte along with Ammon. Previously, I was a partner at Y Combinator, and part of the inspiration for starting Triple Byte was noticing how, after graduating Y Combinator and raising the first investment round, everyone's number one problem was hiring, specifically hiring engineers, because that's the hardest hiring challenge.
So now we're working on Triple Byte, a hiring marketplace that engineers use to find new places to work. Ammon and I have gathered lots of data on what works well when it comes to hiring engineers. I've personally focused a lot on spending time with companies, helping them think through their strategies for finding engineers and, obviously, getting the most out of Triple Byte. Ammon has spent a lot of time thinking through the details of how you evaluate an engineer's skill and answer the question of whether they are a good engineer or not.
So we're going to share that, with a bit of a divide-and-conquer strategy going on here. The four main topics we're going to cover are: where to look for engineers and when you should start thinking about using recruiters, which is what I'm going to start with. Then we'll talk about evaluating technical skills, which Ammon is going to cover, and then I'll finish up by talking through the process of making offers and closing, which is getting people to actually join your team.
But before we start on any of that, I want to just issue a warning and make sure you're well prepared for the fact that hiring really well truly sucks. It's an incredibly painful process for many reasons, which I'll describe in detail. The first is it takes a lot of time. It takes a lot of time just to convince someone good to even have a conversation with you. As a founder, as you know, time is a very scarce resource. There are bugs to fix, customers to close, various things going on, and hiring will never feel like it's your topmost urgent priority.
It's very easy to procrastinate and push it back, but if you do that for too long, you won't scale and you won't grow your startup, and someone else will come along and take the market. Hiring involves a lot of repetitive work. Actually, as Tyler was giving his presentation, I was talking to Jeff about this back there: there are a lot of similarities between sales and hiring, and between both of those and fundraising. A large part of what you do as a startup founder is effectively selling all of the time.
And selling, as Tyler pointed out, involves a lot of repetitive work. This means hiring will involve lots of messaging and taking lots of coffee meetings, lots of phone calls, lots of interviews, and most of those will result in a dead end and be a complete waste of your time. But you have to keep going, and finally, you will get your heart broken. You will inevitably end up getting rejected by people you really wanted to hire, who would have been the perfect fit to help you hit your growth goals.
But it turns out they were never really that serious about leaving their comfortable job at a big company to join your exciting but risky startup, so be prepared for all of this. As you're thinking through building a hiring process, I encourage you to think about this as a funnel that you're creating that has three parts to it. The top of the funnel is sourcing, and that's finding people who could be a good fit. The second is screening, and that's answering the question of whether you want to hire this person or not.
The final part of the funnel is closing: making the offer and getting the offer accepted. I'm going to start by talking through some strategies for building the top of your hiring funnel. These are the five places I'd recommend that you look for making engineering hires. I'll talk through the pros and cons of each of these; they are personal networks, hiring marketplaces, LinkedIn and GitHub, job boards, and meetups.
I'll talk about how you can get the most out of each of these, and this list is sorted in order of where I think you should focus most of your attention and energy down to where you should focus the least. We start with personal networks. In my opinion, personal networks are by far the best place to hire, especially when you're early and making your first few hires.
The reason is anytime you're deciding if you want to hire someone, you're essentially asking yourself two questions: One, does this person have the skills that you need for them to do the job? Two, can you personally work effectively with this person? When you're a big company, you can mostly focus on answering just the first question because you're large enough. There are enough people, there are enough teams that likely someone on one team somewhere will be able to work effectively with anyone.
But when you're small, that's not true. Whether you can work effectively with someone or not is a big determinant of your success, and if you hire the wrong person early on, that could literally be fatal. So when you hire someone that you've worked with previously, or someone who's worked with a person you trust, you de-risk the chance that you won't be able to work well together, which is a big thing to consider early on.
That probably sounds like somewhat obvious advice, and yet I'm surprised by how often founders still don't really use their personal networks effectively when they're hiring. I think there are two reasons for this. The first is they don't use a process to exhaustively search through everyone they could potentially hire, and the second is they don't actually make the ask.
That usually comes down to being afraid of being rejected by your friends. It's actually somewhat easier to be rejected by a complete stranger than to ask your best friend to come join you and have them say, "I'm not sure the idea is that great." You can also worry about what happens to the friendship if the startup doesn't work out.
There's just more that goes into it when you're talking to someone you know personally than when you're talking to someone you don't. But the truth is you just have to suck it up and do it. If you want your startup to be successful, hiring from people you know is a tremendously valuable resource, and you just have to make that ask.
So I'd recommend you follow a strict process here. Start by making a list of every good engineer you know, whether you think they're available or not; that's actually irrelevant. It doesn't matter if they just sold their company for a billion dollars; put them on the list. Then ping every single person on that list to meet up, and commit to asking them if they would join you, however crazy an idea you think that is, however unlikely. Commit to making the ask. If they say no or they're hesitant, be a little devious and say, "Will you at least come by the office and see what we're working on?"
And if the office is your apartment, that's totally fine too. Just keep pushing until you've at least shown them something that you've done. Keep working on convincing them. If it doesn't work out, if they say no and you get a definitive no, then ask them if they were in your position, who would they try and hire? Make a list and go out and repeat this exact same process with them.
And this process just never ends. I know founders of public companies who still do this on a daily basis. This is just a key habit you have to embed in yourself as a startup founder. As your company scales and grows and you start putting a team together, you want to start tapping into the personal networks of your team.
The way I'd recommend doing this is team events where people brainstorm potential hires, commonly referred to as a sourcing party. The way I'd recommend going about this: get everyone together, send out a shared spreadsheet, and describe the role that you're hiring for. So if it's an engineering role, describe in detail who you're looking for, who example candidates are, and what skills and qualities you'd be excited about, and then literally have everyone spend 30 to 45 minutes going through their LinkedIn or their Facebook or whatever, right there and then, thinking of everyone that could be a fit and putting them into the spreadsheet.
Once that's done, I will then personally follow up with anyone on that list who seems like a good candidate. We've made several really great hires doing this. It works really, really well, and you can make it a fun thing to do; we do it at the end of the week, just before our Friday all-hands food and drinks. You can also offer referral bonuses to your team to incentivize them to do this.
So, really make sure that you're sticking to an exhaustive process, you're making the ask of people you know, and then as you scale, tap into the personal networks of your team. Once you’re sure that you've exhausted your personal network for leads, the next place I'd start looking would be hiring marketplaces. Hiring marketplaces are actually relatively new; they've become more popular over the last few years as it's become harder to hire engineers by using traditional methods like reaching out on LinkedIn or GitHub.
The way I think about hiring marketplaces is they actually work a lot like dating sites. The idea is there are engineers who create profiles and companies that create profiles, and both are advertising their best selves. You message each other and you figure out if it's worth meeting up in person, and you know if it all works out, you make a hire. The dynamics of the marketplaces are such that the demand for good engineering talent far exceeds supply, and typically it’s the companies that are being a lot more proactive in terms of reaching out first to the candidates.
The candidates are getting multiple inquiries, and the engineers are the ones that are choosing who they want to speak to and who they don't. A big benefit of using a marketplace, especially in the early stages, is that they can help you hire very quickly because most candidates on the marketplaces are actively looking for a place to move right now. It's very quick to get on a phone call with them and start pitching, and if you run a good closing process, you can significantly reduce the amount of time you'll spend as a founder on hiring, which is obviously great.
The downsides, though, are that they tend to be quite competitive. Engineers are being reached out to by multiple companies at the same time, so you'll have to be very effective at convincing them to join if you want to make hires. The second is that they can be expensive, so most marketplaces will work on a fee-per-hire basis, which can be 15 to 20 percent of the first-year salary. That's cheaper than a recruiting agency, but still a significant cost if you're in an early-stage startup.
I'm obviously biased, since I run a hiring marketplace, but I'd say the three main ones that come up in conversation, at least when we're pitching customers, are Triple Byte, Hired, and Vettery. They're all free to get started with, so you're welcome to try them. I'd say the way we differentiate ourselves is essentially by having better candidates, and we measure that by the percentage of candidates that companies interview through Triple Byte and then make an offer to.
That tends to be twice the rate of other sources. Just as a general note: hiring is a funnel, and you're optimizing that funnel, so you should pay attention to what percentage of candidates make it through each step. When you're early, you won't have that many candidates, so you can't be that scientific about it, but start capturing that data and build it into the habits you have when you're thinking about hiring.
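As a minimal sketch of what tracking those step-by-step percentages might look like (the stage names and counts here are invented for illustration, not numbers from the talk):

```python
# Hypothetical funnel counts for one hiring period (illustrative only).
funnel = [
    ("sourced",        400),
    ("recruiter call",  80),
    ("phone screen",    40),
    ("onsite",          15),
    ("offer",            6),
    ("hired",            4),
]

def conversion_rates(stages):
    """Return (stage, conversion from previous stage, overall conversion) tuples."""
    top = stages[0][1]
    rates = []
    # Pair each stage with the one before it to get step-by-step conversion.
    for (name, count), (_, prev) in zip(stages[1:], stages):
        rates.append((name, count / prev, count / top))
    return rates

for name, step, overall in conversion_rates(funnel):
    print(f"{name:15s} {step:5.0%} of previous stage, {overall:5.1%} overall")
```

Even a tiny table like this makes it obvious which step of the funnel is leaking the most candidates.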
The third source I recommend is LinkedIn and GitHub. These are effectively the biggest online directories of engineers in the world. Most of the hiring done at big companies is through teams of technical recruiters reaching out to engineers on LinkedIn or GitHub, finding the ones that fit certain keyword criteria and sending them cold messages.
They go for a very high-volume approach here: technical recruiters may well send over 100 messages a day just in the hope of getting a few replies. A dynamic that's occurred, especially over the last few years, is that there are more technical recruiters and more messages going out on these platforms, so response rates are dropping for everyone. This means that for early-stage startups in particular, it's going to take a lot of your time sending a lot of messages in order to get a few interested candidates.
So making this work for you requires, in my opinion, not playing the high-volume game like a big company's recruiting team, but instead spending the time actually researching and reading the details of profiles: reading through someone's LinkedIn, looking through their GitHub, looking at the details of the work they've done, and sending a smaller number of personalized, targeted messages. Emphasize in the message why this person specifically would be a good fit for your company, and give them clear evidence that you've read their profile and you're interested in them as an individual, as opposed to sending a spam message.
But that doesn't mean the message has to be super long; I still advise keeping it short and concise. The key is that there's proof you've read their profile. A final note here: send emails instead of LinkedIn messages. If you sign up for LinkedIn Recruiter Lite, which is about $120 a month, and use Connectifier, a Chrome plug-in that makes it really easy to pull email addresses from anyone's LinkedIn profile, you'll consistently see much higher response rates through email than through LinkedIn messages. So definitely, definitely go for that.
Okay, the fourth place I'd try looking for engineers would be job postings on job boards. The two main ones for startups are Stack Overflow Jobs and AngelList. I haven't included Hacker News jobs there because it's only available to YC companies, but it has a particularly high quality of engineer; I'd actually rank it second on this list.
Job boards in general do suffer from a quantity-over-quality problem. What's good about them is you don't have to spend a huge amount of time posting to one. The downside is that the time suck comes later: most of the applicants you get will be vastly underqualified, and you'll get a lot of applications from people who aren't even really software engineers. It will take a lot of your time reading through all these resumes and applications to find the one or two good applicants.
So, to maximize your return and maximize the number of good applicants you do get, I'd recommend focusing on making your job listings unique and interesting. Bear in mind that the majority of job descriptions on the internet are written by someone in a recruiting or marketing department that's using corporate boilerplate language that isn't going to especially appeal to an engineering audience, right?
So as a startup founder, you can experiment with bringing a bit of your personality through in the job listing. One thing you might try is writing in the first person about the personal story of why you started the company and why you're excited about the mission, really getting that excitement and passion across. Other things you could try: is there anything unique about the culture? Is there anything specific about the technical challenges or product challenges you're facing? If you put in more details, that again will stand out, because big companies tend to be very generic and vague when they're talking about what you actually get to work on.
The final source I'll talk about is physically meeting people in person at meetups. I don't think these are actually going to be very effective; they're a long shot. The numbers don't really work out: in-person meetups don't have that many people in attendance, and the truth is, most of the time people are there for the free food and drink more than to actively find somewhere to work. It's unlikely that you'll find a really qualified candidate who's actively looking to move and excited about your company, and you also personally need to be very effective at talking to strangers and convincing them of things for this to work at all.
I've included it on there because I do know startup founders who have had success through meetups, but they are few and far between. If you do try this approach, I'd recommend focusing on technical meetups. By that I mean the local Clojure programming group, where people bring laptops and work on things together, is likely a better source of engineers than Dreamforce.
The final thing you could try here is hosting meetups at your own office. You can combine this with personal-network hiring and use it as an excuse to have friends, or their friends who are engineers, come by the office. At Triple Byte, for example, one of our engineers is a fanatical Emacs user, and he hosts the Bay Area Emacs meetup at our office. It hasn't worked for us for hiring, but it is a good way to meet and build a network of good engineers that could prove invaluable at some point in the future, so it's definitely worth considering.
Okay, now I'm going to talk about when you should think about using recruiters. The truth is there's no hard rule on when you should hire a technical recruiter. I've seen companies of fewer than ten people hire one; I've seen companies wait until they're 50 people or more. So treat what follows as my opinions, rules of thumb rather than hard rules.
First, I think you should absolutely wait until you've hired at least your first engineer before you consider bringing on a recruiter in any capacity. The reason is that general startup advice applies here: when you're hiring, it's good to do the job yourself for a bit, so you feel the pain and understand the details of what makes someone good at that role at your startup in particular, before you go and hire someone for it. You'll be better able to assess them.
The second reason is that it's like sales: it's good for you to go out and pitch and convince people to join your startup so you understand for yourself what message works and what doesn't. As a startup founder you are always selling, and you never know when you might bump into someone who could be a really great hire. If you've already practiced the pitch for convincing engineers to join you, you'll have it ready to go.
So I'd recommend making sure you always make your first hire yourself before you try to delegate this to a technical recruiter. Second, I'd expect to have a good hiring cadence, somewhere around hiring an engineer a month for six months or so, before you start bringing on a recruiter. Otherwise, it's likely they'll quickly run out of work to do.
And then finally, as a rule of thumb, if you're spending more than 50% of your time sourcing—that's all the things I mentioned before—and doing initial phone calls and screens and getting people to come and meet you in person, over 50% is probably about the time where you want to start thinking about bringing on help because 50% is about the amount of time you want to be spending on hiring.
Recruiters themselves come in roughly three types. You have contract recruiters, who you pay by the hour, and who can do anything from just messaging on LinkedIn all the way through to doing initial phone screens. Then you have in-house recruiters: hiring a full-time technical recruiter who works as a member of your team. The final type would be agencies: essentially teams of salespeople who are pitching lots of engineers on LinkedIn or wherever they can, and then sending their resumes out to as many companies as they can.
They tend to charge 25 to 30 percent of the first-year salary if you hire an engineer they sent you. They do tend to be fairly high-touch, so they're quite involved in giving you information that will help you close an engineer, but they're also sending that information to multiple companies at the same time. My recommendation here: when you get to the point where you feel it's time to bring on help, start with a contract recruiter and have them focus on just the sourcing piece.
So have them focus on reaching out to engineers on LinkedIn and GitHub, and their main deliverable for you should be filling your calendar with calls with promising candidates. You're still doing the pitching and convincing yourself. When that becomes too much work for just you, then I'd consider hiring a full-time in-house recruiter and training them to do the pitch.
Cool, so then they're handling both the sourcing and the initial calls, and setting up the on-site interview process for you. To sum up, as a startup hiring plan, if I were just getting going and building an engineering team, I would start by making sure you've exhausted your personal network: spend lots of time taking people out for lunch and coffee, making the ask.
Experiment with the hiring marketplaces; your mileage will vary on these depending on how effectively you can pitch your company, but even if you don't make hires from them, you'll get valuable experience pitching real engineers and real candidates and learning what resonates about your company to that audience. Third, spend some amount of time doing personalized and targeted outreach to engineers on LinkedIn and GitHub, making those messages really personalized.
Finally, treat job boards and meetups (meeting people in person) as a background process you run, where you're not expecting to make any hires but you're building a general pipeline that could be valuable in the future. Cool, that's the first part of this. Ammon's now going to talk about how to screen and evaluate the technical skills of engineers, then I'll jump back in to wrap up with making offers and closing.
Awesome, thank you, Harj. So I'm Ammon, Harj's co-founder at Triple Byte. Before this, I started Socialcam with Michael Seibel, and I was an early employee at Twitch. I'm going to talk about just the screening step: how to identify skilled engineers for your company.
So the first question here is just, you know, why you should believe me about any of this. One answer is that I've done a lot of interviews: since starting Triple Byte, I've interviewed over a thousand engineers personally. But I think a better answer is that Triple Byte has a pretty special vantage point. We're able to see how the same candidate does in interviews at multiple different companies, and that gives us a dataset that I think no one else has.
And you know that data is where the advice I'm going to give today comes from. Before I jump in, I want to just go over the basic hiring process that most companies use, and this is actually pretty standardized. So probably 95% of tech companies use these basic steps to screen candidates. So first is a resume screen; someone applies to a company, they send in a resume, and a recruiter looks over the resume and decides if this looks like someone who's a basically good fit.
Then a recruiter call: this is usually a 30-minute phone call where the recruiter asks about the candidate's background, judges culture fit, and makes sure they're interested in the company. Then a technical phone screen: between 30 minutes and an hour with an engineer, usually solving a single programming problem, sometimes something like FizzBuzz or a little harder, often done over a synchronized text editor of some sort.
Then, optionally, sometimes a take-home project: a substantial project the candidate completes on their own time and then sends in to the company to be evaluated. Then the onsite interview: they come into the office and do between three and six one-hour sessions with engineers at the company, each covering individual problems.
Then finally, a decision meeting. Usually, the next day after the candidate has gone home, everyone who interviewed the candidate and the hiring manager gets together in a room and talks about sort of their perceptions and makes, you know, they as a group make a hire/no hire decision.
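For reference, the FizzBuzz problem mentioned above as a typical phone-screen question is small enough to show in full; one common phrasing and a straightforward solution:

```python
def fizzbuzz(n):
    """Classic screening problem: for 1..n, print "Fizz" for multiples
    of 3, "Buzz" for multiples of 5, "FizzBuzz" for multiples of both,
    and the number itself otherwise."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print("\n".join(fizzbuzz(15)))
```

The point of a question this easy isn't to challenge strong candidates; it's a cheap filter that a surprising fraction of applicants fail.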
Some stats on this: companies make offers to between two and eight percent of all the engineers who apply. But interestingly, exactly where in the process that drop-off happens is very different between companies. We see companies where 75% of people who apply get screened out at the first step, on a culture-fit call. Then we see companies where almost all applicants make it through to the final interview, and that's where all the weeding out happens.
About ninety-five percent of people who are hired work out; about five percent of technical hires result in someone being fired within a few months. From the candidates’ side, what we see is a distribution of success in interviews. The top, you know, few percentage points of programmers by skill receive job offers after most of their interviews, but then the bulk of programmers are somewhere in the middle, where they receive job offers after somewhere between, you know, fifteen to thirty percent of the interviews they do.
An interesting point is that no one passes all their interviews; there are no magical engineers who receive offers after every interview they do. This gets at what I think is the major challenge when designing any interview process, and that is inconsistency: noise in the interview process. The question is: is your interview fundamentally repeatable and meaningful?
If you could somehow rerun interviews with your co-workers, your employees, or everyone who passed an interview in the last year, how many of them would pass again? It's a pretty scary question. An interesting thing is that we've been able to add some data on this. What I did was calculate a stat called the inter-rater reliability between all of the interviewers at all of the companies on Triple Byte.
What that means is this: this is the statistical measure of the extent to which different interviewers tend to agree about which candidates are best. It’s on a scale of zero to one, where zero would be no agreement, or the amount of agreement you would expect to see, you know, in random data from chance alone, and one would be perfect agreement. What I found was an agreement of just over 0.1.
So the first point is that's obviously much closer to no agreement than to perfect agreement. For some context, I calculated the same stat on a dataset of online movie reviews, and what I got was very similar, actually slightly higher, agreement. So it ends up that interviewers agree about which engineers are best at about the same rate that Netflix viewers agree about which movies are best. And that's scary! Interviews are fundamentally noisy, more noisy than most hiring managers want to believe.
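The talk doesn't say which reliability statistic was used, so as an illustrative assumption, here is Cohen's kappa, one standard inter-rater agreement measure for two raters, which matches the description above: roughly 0 for chance-level agreement, 1 for perfect agreement:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters who each labeled the same candidates
    (e.g. "hire"/"no hire"). ~0 means chance-level agreement, 1 perfect."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Fraction of candidates the two raters labeled identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, given each rater's label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / n ** 2
    if expected == 1:        # degenerate case: both raters always say the same thing
        return 1.0
    return (observed - expected) / (1 - expected)
```

With made-up labels, `cohens_kappa(["hire", "no", "hire", "no"], ["hire", "no", "no", "no"])` gives 0.5: well above chance, but far from the repeatability most hiring managers assume their process has.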
So the first question here is: if interviews are this noisy, why do them at all? Why can't we just use something like trial periods to test engineers? And I think that's actually a great idea. If you can work with someone for a week, you can almost certainly get a much better sense of whether they're a good employee than you could from an interview.
The problem is that most engineers don't actually want to do trial periods. We did some research at Triple Byte, and it turns out that only 20% of engineers are willing to do trial periods. And there's actually some adverse selection there: most of the best programmers are in the 80% who would prefer to do a standard technical interview, because it takes less of their time and is faster.
So I think trial employment is an excellent option, and you can offer it as an option, but if you don't want to scare away most good programmers, you still have to run a standard, traditional interview process. So what I'm going to talk about today is specific advice you can use to try to reduce the noise in a traditional interview, and I'm going to go over seven points.
So, point one: the first way to reduce the noise in an interview is to decide what skills matter for your company. There are a lot of different ways that a programmer can be skilled. For example, someone can be very productive, or they can be slow and careful, write great tests, and make sure they don't commit bugs. Someone can be strong in math and computer science, or they could have deep knowledge about the internals of the Linux kernel, scheduling, and real-time operating systems.
And if you don't decide as a founder which of those skills matter for your company, then your interviewers are going to decide that for you. They're going to come up with the questions they ask in the interview process, and they will fail people who answer those questions poorly, whether or not those are areas that should matter for your company. This is actually a major source of noise in interviews.
Every engineer has this bias where they think the things they know best are the most important skills one could know. In the absence of specific direction from above about what to look for, they will fail people over areas that are not important for your company. So my first piece of advice is that you should go through and ask yourself these questions before you start hiring.
The first is: do you need fast, iterative programmers, or someone careful and rigorous? Do you want to hire someone who's motivated by solving hard technical problems, or someone who's motivated by building beautiful products for users? Is it important that someone comes in with skill in a particular technology (pick your language), or can you hire a smart person and let them learn your tech stack on the job?
Is academic computer science or algorithm ability important for you, or is that an irrelevant skill? And is there any other specific expertise you need in the people you're hiring? It's actually fine to answer "both" to some of these questions; you don't have to specify only a single profile of person you're hiring. The important thing is to decide what matters, even if that's multiple profiles.
That sets you up to design the rest of your process and make sure that you're not failing people for being bad in irrelevant areas. Okay, point number two: the second way you can reduce noise in interviews is to use structured interviews. To define this, a free-form unstructured interview is an interview where an interviewer gets in a room with a candidate and they ask questions and they follow their intuition.
Based on the answer the candidate gives, they ask follow-ups. They try to get a sense of what the candidate feels like, whether that person would be a good fit at the company, and at the end they make a global yes/no decision based on that entire interview. That contrasts with a structured interview, where the interviewer comes into the room with a question they're going to ask and defined criteria they're trying to evaluate.
The very interesting point here is that everyone thinks the freeform interview feels better. Almost all interviewers prefer freeform interviews and think they are more accurate, and many candidates, if you ask them, actually say they prefer to be interviewed that way. But the thing is, this is completely opposite to all of the available data.
Structured interviews are simply more predictive; they are better at predicting success on the job, and there's no excuse not to use them. We should all be using structured interviews. So the first part of what that means is that you have to hold the process constant. The goal of an interview is essentially to measure differences between candidates, and if you don't hold the rest of your process constant between candidates, you're introducing noise.
So there's simply no excuse for not asking every candidate for the same job the exact same set of questions. I think the reason this is not more common is that interviewers themselves tend to find it boring. And I can say, yeah, it does suck a bit, but you have to do it.
So, the second point on structured interviews is that you want to give your interviewers defined criteria to evaluate. Rather than putting them in the room and saying "decide if this person is going to be a good fit for the company," say "we care about coding productivity and knowledge of back-end web systems, so your goal is to ask this question in the interview and grade the candidate on coding productivity and knowledge of back-end web systems."
There's actually some great research on this. It turns out that a lot of the worst biases that come up in interviews are made worse when the interviewer is trying to make a global decision. When someone is making a global decision like "does this person feel like a good employee?", that's where things like whether the candidate looks like someone you knew in the past, or their race or gender, come into play.
Interviewers are much better at ignoring those attributes when they're given defined criteria to evaluate. The final point is that you want unified decision-making. This applies mostly to larger companies, but you want to make sure that one person or one group of people is involved in all the final decisions. So rather than viewing interviewers as people making decisions, view them as people taking notes and grading candidates on criteria, and all those notes and grades go to a centralized person or group who makes the final decision.
The idea, again, is that the goal is consistency, and it's far easier to be consistent if one person is making all the decisions. Point three for how to reduce noise is to just use better interview questions. So I have some tips on that. The first is that you want to avoid questions that require a leap of insight from the candidate, and instead use questions where there's a gradual ramp of steps they can follow that leads to a solution.
The general rule of thumb I like is to ask yourself, "Can this question be given away?" If the solution hinges on a single fact that a candidate could know in advance, perhaps communicated to them by a friend or by anything they read on Glassdoor, then it's probably a bad question.
An example of that is the classic question: "Imagine you're standing at the bottom of a flight of stairs, and at every step you can either take a single step up one stair or a double step up two stairs; how many unique paths are there to the top of the flight of stairs?" It turns out that the solution is the Fibonacci sequence. So, kind of strangely, if someone knows that fact, they obviously have the solution.
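To see why this question is so easy to give away, here's a sketch (in Python, which the talk doesn't specify): once you know the path counts are Fibonacci numbers, the whole answer collapses into a three-line loop.

```python
# Staircase-counting question: from the bottom you may climb one or two
# steps at a time; count the distinct paths to the top of an n-step flight.
# The counts follow the Fibonacci sequence, which is the single giveaway fact.
def count_paths(n: int) -> int:
    a, b = 1, 1  # number of paths to reach step 0 and step 1
    for _ in range(n - 1):
        a, b = b, a + b  # paths(k) = paths(k - 1) + paths(k - 2)
    return b
```

A candidate who has been told the trick writes this immediately; one who hasn't may spend the whole session rediscovering the recurrence.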
If they don't know it, they may well struggle, thinking about it and going down some rabbit hole. That's exactly what makes it a poor question. An example of a good question is, "Can you please implement the game Connect Four for me?" There you have a series of steps; each one is relatively straightforward but still leads to a solution, and there's nothing a friend could tell them in ten minutes that would give them a massively unfair advantage at implementing Connect Four.
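A rough sketch of what a Connect Four answer might build up to, assuming the standard 6-row, 7-column board (the talk doesn't specify a language or board size); each piece of it, representing the board, dropping a piece, scanning for a win, is an easy step on its own:

```python
# Minimal Connect Four sketch: a grid of cells, gravity-based drops,
# and a scan for four matching pieces in a row in any direction.
ROWS, COLS = 6, 7

def new_board():
    """Empty board; '.' marks a free cell."""
    return [["." for _ in range(COLS)] for _ in range(ROWS)]

def drop(board, col, piece):
    """Drop `piece` into `col`; return the row it lands in, or None if full."""
    for row in range(ROWS - 1, -1, -1):
        if board[row][col] == ".":
            board[row][col] = piece
            return row
    return None

def wins(board, piece):
    """Check every cell and direction for four `piece` in a row."""
    directions = [(0, 1), (1, 0), (1, 1), (1, -1)]
    for r in range(ROWS):
        for c in range(COLS):
            for dr, dc in directions:
                if all(
                    0 <= r + i * dr < ROWS
                    and 0 <= c + i * dc < COLS
                    and board[r + i * dr][c + i * dc] == piece
                    for i in range(4)
                ):
                    return True
    return False
```

Even if a candidate stalls on, say, the diagonal win check, the board and drop logic still give the interviewer plenty of signal, which is the point of a multi-step question.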
Another idea here, and this is related, is that you want multi-step problems. They tend to be the problems that don't hinge on a single insight, but also, candidates will often get stuck in an interview, even good candidates, and if your problem has multiple steps, you can give them a hint. You can help them out on one portion, and there's still enough left for them to go on, do well, and demonstrate skill.
Whereas if your problem is all in one nugget of difficulty and they can't solve that, then once you have to help them, they've basically failed or done poorly on that section. You also want to avoid specialized knowledge. If your goal is to assess general programming ability, you probably want to ask questions that involve things like lists, hash tables, and strings, rather than questions that involve tries (prefix trees).
Unless you've decided that you really care about data structure knowledge and your goal is to make sure they know about tries; in that case, it's totally fine to ask about them. But if you're measuring something else, then if you ask about tries, some portion of your candidates will be familiar with them and others won't, and that will introduce noise.
So, in general, I think it's good to stick with the classic, most basic computer science concepts. Another tip is to budget the candidate about three times the time it takes you to solve a problem. So if you come up with a question and solve it yourself in ten minutes, that's probably a good question for a 30-minute interview session. The reason is that it's far easier to be the interviewer than it is to be the candidate.
It's far easier to ask the question, and we tend to downplay how hard the questions actually are. We actually did some research on this: we went through all the questions we ask at Triplebyte and looked at which were most correlated with candidates going on to succeed, and it turns out that the most effective questions tend to be much easier than our intuition going in predicted.
So it turns out that most interviewers think the optimal question is quite a bit harder than it actually is. You also want to make sure that you ask four or more questions in an interview. The idea here is that each individual question carries a certain amount of noise: has the candidate seen the question before, did they get lucky in answering it? If you ask more questions, you'll get a more consistent signal out.
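To make that averaging intuition concrete, here's a toy simulation (illustrative only, not Triplebyte data): if you model each question score as the candidate's true skill plus independent noise, the spread of the averaged score shrinks roughly as one over the square root of the number of questions.

```python
# Toy model of interview noise: each question score is the candidate's
# true skill plus independent Gaussian noise. Averaging more questions
# narrows the spread of the measured score.
import random
import statistics

def measured_skill(true_skill: float, n_questions: int, noise: float = 1.0) -> float:
    """Average of n noisy question scores for one simulated interview."""
    scores = [true_skill + random.gauss(0, noise) for _ in range(n_questions)]
    return sum(scores) / n_questions

random.seed(0)
# Spread of the measured score across 2000 simulated interviews.
spread_1 = statistics.stdev(measured_skill(5.0, 1) for _ in range(2000))
spread_4 = statistics.stdev(measured_skill(5.0, 4) for _ in range(2000))
# With four questions, the spread is roughly half that of a single question.
```

The numbers here are made up; the point is just the shape of the effect, which is why four questions beat one even when each question is individually noisy.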
One type of question we like a lot at Triplebyte is a question where we give the candidate a problem, but rather than asking them to devise a solution, we tell them the solution, and their goal is to take that idea and implement it in code. So we give them an algorithm, and we see if they can implement that algorithm rather than requiring them to devise it.
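As a hypothetical example of this question style (not one of Triplebyte's actual questions): the interviewer states Kadane's maximum-subarray rule in words, "keep a running best sum ending at the current element, restarting whenever extending would be worse than starting fresh," and the candidate only has to translate it into code.

```python
# Hypothetical "implement a stated algorithm" prompt: the interviewer
# describes Kadane's algorithm in words; the candidate translates it.
def max_subarray_sum(nums):
    """Largest sum of any contiguous, non-empty slice of nums."""
    best = best_here = nums[0]
    for x in nums[1:]:
        # Per the stated rule: either extend the previous run or restart at x.
        best_here = max(x, best_here + x)
        best = max(best, best_here)
    return best
```

This separates the skill being measured (turning a clear idea into working code) from the luck of whether the candidate happens to already know the trick.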
A fourth way to reduce noise is to ignore credentials during your interview. By credentials, I mean things like did someone study at a well-known school or has someone worked at a well-known company. I'm not claiming that credentials are meaningless; credentials are important, right?
On average, the group of people who are ex-Google are indeed better engineers than the group of people who have not worked at Google. So it's totally legitimate to take that fact into account when deciding whether to hire someone. However, it shouldn't color your measurement of their actual programming skill.
So my advice here is to make sure you're not biased by the fact that someone comes from Google when you're narrowly evaluating programming skill. I recommend you hide that credential from the interviewers, because we've found that interviewers are actually pretty biased by this. If they know someone has strong credentials, they're more likely to interpret the results of the coding screen in a positive fashion, or to think, "Yeah, they didn't know that answer, but I'm sure it was a temporary slip-up."
So, anonymize credentials from your interviewers and let them assess programming ability; then, when making the final decision in the decision meeting, consider credentials alongside performance in the interview. This will help you find programmers who are skilled but lack traditional credentials, and those are the undervalued people in the market.
As a startup, if you can get good at finding those people, that is a big advantage. Okay, point five is that you want to think about the false negative rate in your interview. A false negative is when someone fails your interview who could have gone on to do the job well. The opposite, a false positive, is when someone passes who shouldn't have: you hire someone who then goes on to do poorly and probably has to be fired. Both false positives and false negatives are very costly. If you hire a bad person and have to fire them, that's terrible; it hurts morale for the team and it's also very expensive in actual money.
But if you're a startup and you're really hungry, you're held back by not having engineers. If you pass up a person who could have joined your team and been productive, that's also very expensive. So I'm making a pretty subtle point here, but I think there's a cognitive bias where we're very aware of false positives: when we hire a bad person, we feel that pain for a month or more, best case. False negatives, on the other hand, are invisible; we never find out about the good people we turned away.
In our experience at Triplebyte, companies generally have too much faith that the folks who failed their process must have been bad engineers who couldn't have done the job, and that's empirically not the case. As I said earlier, no engineer will pass all interviews. A significant portion of people who fail interviews go on to be employed very productively at other companies, so I recommend that anyone designing a process think about the false negative rate and give it some weight when setting hiring bars.
One of my goals at Triplebyte is to actually get to a point where we can measure this rate, because no one knows what it is. No one knows what the false negative rate of an interview is, because to measure it you'd have to hire people randomly and see how they perform, which is very expensive. My goal is to get to where we can do that.
Let's see. Okay, point six is that you want to calibrate on the maximum skill that each candidate shows, rather than their average skill or their minimum skill. Someone comes into an interview, and they're very strong in one area and weaker in others; what matters most is what they were strongest in. Everyone can look stupid. If you ask me the right question, I will definitely look very stupid; that's true for everyone out there.
So the problem is that someone might go through an interview, do very well on things that would be useful for the company, but look stupid on one question, and if the interviewer gives a blocking "no" because that person looked stupid, that's introducing noise into the process. Now again, if someone does poorly in an area that is important for the job, totally fail them. But be open to the fact that everyone looks stupid sometimes, and don't fail someone just because they looked stupid on one portion of the interview.
Okay, a final point here is that you want to think about the candidate experience when designing an interview. You want to make sure that every candidate who goes through likes you and likes your company. This is true for a few reasons. One is just that if they enjoy the process, they have a higher probability of accepting an offer you make, so this will help in the closing step.
But an interesting point is that it will also actually make your screening itself more accurate because stress has a big impact on performance. A high percentage of candidates get very stressed in interviews and underperform their actual peak ability. So some tips here to help you de-stress include just letting everyone bring in their own laptop and work in their own environment.
Let them use their own language and their own tools; they will be much more productive and less stressed. Then coach your interviewers in some soft skills: being friendly, providing breaks for the candidates. When a candidate is doing poorly on a section, train interviewers to intercede in a way that isn't stressful or insulting to the candidate.
The rule of thumb here, which comes from the old Joel on Software blog, is that you want every candidate, no matter how they do, to finish your interview wanting to join your company. Even people who fail your interview badly: you want them to end the interview excited about the opportunity.
And the final point is that you want to avoid hazing. This is rare, but it results in some of the worst interview horror stories: interviewers treating the interview as some sort of ritual of acceptance into a group. If this happens, it's terrible, and as a hiring manager you should stay totally away from it.
Okay, those are my points. I want to emphasize that I'm not saying you should lower the bar for who you hire. I think if you follow this advice, you will get a more accurate signal. You can then set your bar as high as you want on that signal, but at that point you'll be making better hires.
So just to summarize: I recommend that the first step is to decide what skills matter for your organization and make sure you're screening on things you actually care about. Then design a structured interview around those skills, come up with ways to assess each skill, and set the criteria for the interviewers so they're less likely to be biased by outside factors.
Then you want to use good interview questions with multiple parts that don't require leaps of insight, you want to hide credentials from the interviewers because they introduce noise into the process, you want to think about the false negative cost as well as the false positive cost in your interviews, and you want to calibrate around the maximum skill that each candidate shows.
While doing all that, you want to try to provide a positive experience for the candidate. So those are my tips so far. I think this all applies to both big and small companies, so I'm going to go over a few points here that I think are specific to, let's say, Series A and smaller companies.
One point here is: what if you're so small that you don't have the scale to standardize your process? You're a seed-stage startup interviewing your first few people, and you obviously can't run an extremely standardized process when only a handful of candidates have seen your questions. That's a totally real point, but I think it's still worth trying to run structured interviews.
Your playbook will simply be a Google Doc with some notes you've written down, but it's still helpful to think about what skills matter and try to design questions that assess those skills. That will still reduce the bias in the interview.
Let's say you're having trouble sourcing: you're a Series A startup and your number one problem is that you can't get enough qualified applicants. In that scenario, it's fairly obvious that a false negative costs you more; if you're struggling to get people to apply to your company, screening out someone who could have been good is very expensive, more expensive than it is for a large company. But the flip side is that if you're a small company, hiring a bad person is also way more expensive. So it's not clear that the ratio of those two costs changes, and I think you still have to care a lot about both.
What you can do is be less aggressive about screening folks out early. So if you're an early-stage company struggling to source candidates, I recommend being less aggressive about screening people out before your onsites. Pass more folks through and accept a lower final interview success rate in exchange for better screening of all the applicants.
Let's say you're a small startup and you don't know what skills matter; you're not sure whether you want to hire someone who's very CS-focused or someone who's very web-focused. There's no crisp answer here; there are plenty of examples of billion-dollar companies that have taken various routes. My personal advice, from Triplebyte and from Socialcam and from Twitch when it was small, is that I believe strongly that the two most important skills for the first few hires are productivity and ownership: being able to take a project, figure out how it should be built, and just make that happen.
I recommend optimizing for that, potentially at the expense of code quality. I think the first few hires at a company should accept that perhaps they're writing crappy code, but by God, they're writing it quickly, and they're getting stuff done. I'd say that was the case during the early stages of all the companies I've been involved with, and I think that's normal and probably good.
So let's say you're hiring for an area where you yourself don't have technical expertise: you're a web developer and you need to hire an iOS engineer. What do you do? Well, sure, you can use Triplebyte to help you do that, or you can call friends with that expertise if you have them. But a trick that I've used in the past is to just ask candidates to explain things.
So if you're a web developer and you ask the iOS engineer a question, they give an answer, and you have no idea if it's a good answer or a bad answer. What you can do is just ask them to explain. You say, "Oh, that's interesting, can you explain why that's a good idea?" For the most part, if someone truly is a skilled iOS engineer, they should be capable of explaining their answer in terms that a web developer can understand.
It's not always the case; sometimes someone comes along who's good at iOS but a bad communicator, and this might fail them unfairly, but it's a pretty good trick for hiring outside your area of expertise. Okay, that's basically it; I want to end with an ask for you guys.
There's an exercise we've been developing that we use when training new interviewers at Triplebyte, and I want to see if I can get you guys to try it and email me to let me know how it goes, because I'm curious. But this works more broadly.
The idea is that interviewers tend to systematically underestimate how noisy interviews are. They overestimate their own ability to distinguish good engineers from bad engineers. Part of becoming a better interviewer is becoming a little more humble and more aware of your limitations, and this exercise is designed to highlight that.
So what I want you to do is to do a mock interview with one of your coworkers. So have one of your coworkers interview you where they're role-playing. You know, they're asking questions; they're the interviewer, and you're role-playing a candidate, giving answers.
I want you to tell them in advance that you're going to give bad answers to some of the questions; you're going to roleplay a candidate who makes mistakes. Then, during the interview, as they ask questions, I want you to give a bad answer half the time, roleplaying a poor answer, and the other half of the time give it your best shot and actually try to give the best possible answer you know.
After the interview is done, do a debrief session where your coworker goes through all the mistakes you made and gives you feedback. What's great here is that they don't know which answers were intentionally bad and which were you trying your best.
If they fail to highlight a mistake you made on purpose, they're going to look a little bad, so they're fully incentivized to be completely honest and rigorous in pointing out the flaws they saw in the interview. And for experienced engineers, it's probably been a while since someone really brutally critiqued how good you are at answering interview questions.
What will happen is that they will point out legitimate flaws: even when you're trying to give your best answer, they will point out things you said that were wrong or could have been better. That's very humbling, and I think this is an exercise you can give yourself to have that experience of being humbled and having flaws in your own performance pointed out.
"Yes, got it! You talked about a 95%-plus success rate on one of the slides. Given that interviewers tend not to have very high reliability, it seems hard to believe that it's 95%, and it also doesn't match my experience that 95% of hires are successful. So what do you think the real success rate is, where after you hire someone they stay and perform?"
"Yep. So this comes from some of the surveys we did of companies. Sorry, to repeat the question: my slide said that 95% of hires are successful, and that doesn't mesh with the experience of being a hiring manager, where quite a few hires are not great. So the question is, what's the actual rate at which candidates end up being top performers? And yeah, the question is totally right.
So, that 95% number comes from some surveys we did, where we asked companies, of all their hires, what percentage got fired, and we also asked what percentage were their top performers. So about 5% actually get fired, an additional 30% are people who stick around but are not particularly great employees, and then about 10% are in the bucket that companies said were their best hires and top performers."
"Thank you!" [Applause]