Hiring Engineers with Ammon Bartram
Hey guys, today we have Ammon Bartram, co-founder of Socialcam and Triplebyte, and he is here to talk to us about hiring. Could you give us a quick intro about what you've worked on?
Cool! So, I joined Justin.tv fresh out of school in 2009. It was just 25 folks, and I went through the roller coaster of the early days of Justin.tv. There, I worked mostly on the video system, and I got my first taste of hiring there—toward the end, we were hiring pretty aggressively. That was my first realization of how much noise there is in the hiring process.
Okay.
And then I was part of the spin-off of Socialcam, which was a video sharing app. I did that for about three and a half years, and we were acquired by Autodesk in 2012. I worked there until 2014, then took a bit of time off and started Triplebyte.
And so, Triplebyte, just for context for people, can you explain?
Sure! We're a recruiting startup; we help startups hire engineers. Engineers apply to us, and we do a full interview with them. We then take the engineers who pass and match them with companies where they have a high probability of doing well.
Cool! So people ask us a million questions about hiring and recruiting in general. Let's assume that you're, you know, an early-stage startup. What should companies be looking for in an engineer?
There's not a crisp answer to that. The crispest answer I can give is that you have to decide what it is you want to look for, and then you have to effectively look for that. There's an important root issue here with the status quo: most companies think they are trying to hire good engineers. That's what they say to themselves, and they don't realize that company A's definition of a good engineer is significantly different from company B's.
What you have is a situation where everyone has this definition in mind, and they're all different, and this is a big source of noise. For example, say one company thinks engineers need to be very fast and productive and be able to show that in an interview, and a different company thinks engineers need to be very intellectual, able to deeply understand computer science topics and talk crisply about them. What happens is that all of the awesomely productive engineers who are very practical, but not necessarily strong academically, get rejected by the second company. And all of the very academic engineers who could totally solve your hard problems, but maybe aren't quite as flashy in that way, get rejected by the first.
So what comes out of that is a lot of noise. At a larger stage, I think the obvious answer is that you want both those types of people, and so it's about building a process that can identify different types of skill more broadly. Smaller companies are more often in a situation where you may well actually only want one of those profiles.
You may need someone who's going to come in, be productive, and bang out code. If that's the case, you need to realize that it's not important that everyone you hire be strong in computer science. If you're a company facing security issues, where really clean, precise code and solving hard problems matter to you, then even at an early stage it probably makes sense to have a process that skews in that direction.
Do you have general advice for people who come to you at Triplebyte, or just to you as a friendly advisor, for diagnosing what kind of engineer is the right engineer for their company? What do you tell people?
It's funny; we're rarely asked that, actually. I think most people have strong preconceptions, so we're more often in the situation of broadening people's vision of what a skilled engineer can be.
But to go back to the question: the mistake people tend to fall into is that when they're interviewing an engineer, they tend to ask about the things that they themselves are best at. There's this overlap between the things you're best at and the things you think are most important.
Every engineer thinks that the things they know are the core of the discipline, and so you ask everyone you interview those questions. You end up hiring engineers who have those skills; they join your team, they in turn interview with the same type of questions, and so the whole organization can grow in a direction that might not make sense.
Now, it's complicated, because there are plenty of examples of companies with sharply defined engineering cultures that have worked out very well. Google, for example, intentionally or unintentionally, grew very much in a computer-scientist direction, and that's obviously worked out very well for them.
And there are companies today—I don't want to name names—that take the complete opposite approach, a very practical, productivity-focused approach, and they also seem to be excellent companies.
So there's no clean answer; there are cases on both sides. As you're hiring your first employees, I think you just need to try to decide what's holding you back.
So, say I'm trying to vet a pool of engineers and they all fit the rubric that I've created, right? But maybe one of them did a bootcamp and has some projects, and one of them went to a great school and has a CS degree. How should I think about credentials and experience—bootcamp versus CS degree?
I don't think they're all that different, actually. Okay, that's obviously a forceful statement. I think experience matters more than where you got your education. Someone fresh out of a CS program who doesn't have internships is still essentially a junior engineer.
They may have a more academic slant to what they've studied than someone out of a bootcamp, but both of those people lack real-world experience. The big distinction is between them and someone who has worked in the industry for five years and can own projects, and failing to calibrate for that is the big source of bias.
What you can most easily measure in an interview is the ability to think through small problems quickly; that's really what interviews evaluate. But the skill you need in an employee is the ability to solve large, sprawling problems well over a long period of time.
There's obviously a correlation there—we use interviews as a proxy for evaluating actual skill because the correlation exists—but the correlation is not perfect. An interesting observation is that people fresh out of university and bootcamps, in many cases because they've been practicing, are actually better at the kind of problems that get asked in interviews than your very senior engineer with eight to ten years of experience at a large company.
What the senior engineer typically has is experience making architectural decisions, owning projects, gathering requirements, carrying out the whole process.
How do we evaluate for that? It's super hard. What ends up happening, and this is honestly unfair, is that experience gets used as a proxy for it, and this is something folks get a lot of benefit from. It's very strange, actually: if you have five years of experience, it's just easier to pass the interview and get a job offer.
A priori, people think that senior engineers have to perform better than junior ones to get an offer, and that's actually not generally true. The bar for getting a job offer goes down as your resume looks more impressive. And it's not irrational on the part of the companies; it's just the reality.
Sure, okay. And so when you're screening these people for Triplebyte, for example, what are you looking at? What are you having them do?
The approach that we take is to evaluate as much as we can in isolation and be aware of what we're evaluating. So we explicitly evaluate programming—just pure productivity, given a relatively specced-out problem.
For example, we might describe an algorithm that solves a problem. It's not super mathy; it's just the set of steps they have to implement. Can the candidate take that and put it into well-structured code?
Interestingly, junior folks often do better than senior folks on that sort of problem. We then separately do an evaluation of academic computer science skills: is the engineer knowledgeable about computer science and about that approach to problem-solving?
Then—and this is one thing we do that's a bit distinctive, actually—we do a debugging section. We give the candidate a large codebase that has some bugs, and we ask them to spend some time digging into the codebase and then try to find and fix those bugs. I think it does a great job of evaluating experienced engineers, because debugging is a skill that comes from experience and is often missed by more traditional interviews.
And then finally, we do a system design section: here are the requirements; design a production web system to satisfy them. Now I'm going to change the requirements—how do you adapt your design? How do you talk about trade-offs?
And all that's done remotely because the person's at home, right?
Yes, we do this all over Google Hangouts.
Okay, and what's a reasonable amount of time for someone to go through one of these exercises? Or are they all wildly different?
Our interviews are about two hours in length, and we spend about 30 minutes on each section.
Okay, cool. And you find this is a very strong data set in terms of correlating with how successful they are?
Yeah. We've done about 2,000 interviews over the last year and a half, and so we've been able to drill in on the parts that are most predictive and cut the time down—often shorten it. If you're starting from scratch, you'd probably run it at about twice that length to get the same signal.
Okay. And having gone through all these interviews at this point, was there anything you thought was really important in the beginning, or something that's very common in the Valley that many people think is important, that isn't really?
Too much reliance on individual questions. The classic interview format at tech companies is a number of 45-minute to one-hour sessions. Engineers pick the questions themselves, and they're usually these little nuggets of difficulty.
These are often pejoratively called brain teasers, though almost no one actually asks true brain teasers; they're more like legitimate programming problems. But they are those nuggets of difficulty—like, given a string of parentheses, how do you determine whether they're well-matched?
Or, given multiple types of brackets, how do you determine whether they match? That's a classic and good question. But it turns out there's just a huge amount of noise. If you take a bunch of candidates and, in a controlled setting, have them all answer three or four of these questions, you'll see some correlation, but way less correlation than you would think.
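(For reference, here's a minimal sketch of that bracket-matching question in Python, using the standard stack approach—this is our illustration, not a question any particular company asks:)

```python
def is_well_matched(s: str) -> bool:
    """Check whether every bracket in s is closed in the right order."""
    pairs = {')': '(', ']': '[', '}': '{'}
    stack = []
    for ch in s:
        if ch in '([{':
            stack.append(ch)
        elif ch in pairs:
            # A closer must match the most recently opened bracket.
            if not stack or stack.pop() != pairs[ch]:
                return False
    return not stack  # anything left open is a mismatch

assert is_well_matched('([]{})')
assert not is_well_matched('([)]')
```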
And companies don't see this—honestly, I believed it at my previous companies. You have this question, you ask everyone the question, you see this variation, and you assume: oh, the people who answered this question well must be smart, good programmers, and the people who got totally tripped up must not be.
But if you actually inspect that, you see there's this huge amount of noise. We have a pretty incredible lens on this, because we evaluate candidates pretty rigorously and then send them to multiple companies and see what happens. We get detailed feedback.
Is that feedback from the actual interview process, or then once they're placed, you actually know as well how they're doing?
Both. But here I'm talking about the interview process.
So we send engineers to companies, they do their interviews there, and we get feedback. It's pretty incredible how much disagreement there is.
A candidate who does really well at one company—gets told they're the best person the company has seen in months, a rockstar—goes on to fail an interview somewhere else. An interesting stat we dug up: I compared the rate of agreement between interviewers at companies with a dataset of users reviewing movies online, and the numbers were essentially the same.
The inter-rater agreement was essentially equivalent. So knowing that an engineer did well at one company gives you about as much information about whether that engineer is skilled as knowing whether the New York Times film critic rated 12 Years a Slave as excellent or terrible.
So, okay, maybe you don't have an answer to this, but say I'm really good at brain teasers—where should I interview?
Probably at a larger company. This is all complicated, but there's a certain amount of sense to it. Brain-teaser-style questions always introduce noise, but we've found that bigger companies rely more on that type of interview, and they do that partly for rational reasons.
Bigger companies care more about measuring your innate ability and less about measuring whether you can jump into their particular codebase and be productive on day one. It's way more likely that a smaller company will say: we're using Ruby on Rails; we need a really strong Rails developer; come take this interview that evaluates how well you can work on our actual codebase.
Whereas a big company—Facebook, Dropbox, Apple, Google—is more likely to say: we care about smart people. Within the confines of the noise of their process, they're trying to identify intelligence rather than specific experience. They also have the capacity to train people.
Right, they can afford to train. Whereas a small company, in a way, can't.
Okay. And then, back to one of the earlier questions: what about skills that people don't realize are strongly correlated with being a successful engineer?
Relatively easy problem-solving. We have found that asking pretty easy interview questions is often more predictive than asking harder interview questions.
To break this down: there are two sources of signal from asking a question. You get signal on whether the candidate comes up with the right answer, and you get signal on how much they struggle—how hard is it for them to solve the problem?
We score both of these things for all of our questions, and we've done this for, again, thousands of candidates. So we can go in and look at how much the individual score on any one question correlates with how the candidate does on the job.
What we've found is that, as you'd expect, getting a question right is correlated with being a good engineer, and being able to answer a question easily is correlated with being a good engineer. But there are also, of course, false negatives: there are great engineers who fail to answer a given question, and great engineers who struggle with it.
But if you look at actual predictive ability—not just the correlation with getting the answer right—the sweet spot in question difficulty is far lower on the scale than most people intuitively think.
Can you give a couple examples of what those easy questions might be?
Yeah, sure. Asking something like: we want you to create a checkers game. No AI, nothing large—just a class that has a board, the board has a grid, there are pieces, and the pieces move around. It's a really pretty mundane, straightforward task, but how well they do on it ends up being a more stable predictor of engineering skill than something like: here's a sentence that consists of a list of words all glued together, and, given a dictionary, find the optimal way to break it apart into words.
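(A bare-bones sketch of what that first, "easy" checkers exercise might look like—class and method names are our own illustration:)

```python
class Piece:
    def __init__(self, color: str):
        self.color = color  # 'red' or 'black'
        self.king = False

class Board:
    SIZE = 8

    def __init__(self):
        # grid[row][col] holds a Piece or None
        self.grid = [[None] * self.SIZE for _ in range(self.SIZE)]

    def place(self, piece: Piece, row: int, col: int) -> None:
        self.grid[row][col] = piece

    def move(self, src, dst) -> None:
        r, c = src
        piece = self.grid[r][c]
        if piece is None:
            raise ValueError('no piece at source square')
        dr, dc = dst
        if self.grid[dr][dc] is not None:
            raise ValueError('destination occupied')
        self.grid[r][c] = None
        self.grid[dr][dc] = piece

board = Board()
board.place(Piece('red'), 2, 1)
board.move((2, 1), (3, 2))
```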
That second problem is a graph search problem that can be optimized with memoization.
Getting the second problem right carries more information than getting the first problem right, but it comes with a much higher false-negative rate.
And so the first problem ends up being a better general predictor of engineering skill.
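(And here's a rough sketch of that second, harder problem, with memoization doing the optimization mentioned above—"optimal" here means fewest words, which is our illustrative choice:)

```python
from functools import lru_cache

def segment(sentence, dictionary):
    """Split a glued-together sentence into dictionary words,
    preferring the segmentation with the fewest words."""

    @lru_cache(maxsize=None)
    def best(i):
        # Best (fewest-words) segmentation of sentence[i:], or None.
        if i == len(sentence):
            return ()
        options = []
        for j in range(i + 1, len(sentence) + 1):
            if sentence[i:j] in dictionary:
                rest = best(j)  # memoized: each suffix is solved once
                if rest is not None:
                    options.append((sentence[i:j],) + rest)
        return min(options, key=len) if options else None

    result = best(0)
    return list(result) if result is not None else None

words = {'pen', 'pine', 'apple', 'pineapple', 'penapple'}
print(segment('pineapplepenapple', words))  # ['pineapple', 'penapple']
```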
If I'm getting ready to interview at a new company, do you recommend people train in any particular way? Or, since you're going for that sweet spot of easy questions, do you just have to be good enough to do it? What do you tell people?
In general—if I'm going to prep to do some interviews, what would I do? I guess there are two questions there: one is for companies that are doing a good job interviewing, and one is for the status quo in general.
And the advice is somewhat different for new grads and for experienced folks.
Okay, so let's break it apart. Yeah, new grads.
For new grads, I would say: make sure you're solid on the classics—breadth-first search, hash tables, the core of classical computer science. A surprising percentage of interview questions end up being slightly obscured applications of those. Especially hash tables and breadth-first search: those two things by themselves probably represent 40% of the questions asked at most companies.
And so you need to know those—though actually, many new grads are already pretty solid on that, because they've been drilled on it throughout school.
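(As a refresher on those two staples, here's a compact sketch of a breadth-first search that uses a hash table—a Python dict—to track visited nodes; the example graph is made up:)

```python
from collections import deque

def bfs_distances(graph, start):
    """Breadth-first search: shortest hop-count from start to every
    reachable node. The dict doubles as the 'visited' hash table."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in graph.get(node, []):
            if neighbor not in dist:  # O(1) hash lookup
                dist[neighbor] = dist[node] + 1
                queue.append(neighbor)
    return dist

graph = {'a': ['b', 'c'], 'b': ['d'], 'c': ['d'], 'd': []}
print(bfs_distances(graph, 'a'))  # {'a': 0, 'b': 1, 'c': 1, 'd': 2}
```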
The second thing is practicing writing code under stress. Working out a big problem over time is very different from: you have ten or thirty minutes, here's a marker, write on a whiteboard—or even, here's a laptop, program on it.
Those skills correlate, but you can improve your performance by practicing. So totally put in 30 minutes a day: find some questions online, give yourself a time limit, and try to solve them effectively.
And what resources can people look to? Anything in particular?
I mean, the classic ones, right? Cracking the Coding Interview has a pretty good list of questions—the other advice in that book I don't think applies to startups very much, but the questions are good. And then there are a bunch of sites online that have lists; Interview Cake is one that I've seen that I think is high quality.
An interesting aside, though: most companies actually want you to do these things—they would prefer that their candidates prep. We're the opposite: we try to design our interview in such a way that prep has no impact, because we don't really want to be measuring whether you've been cramming on algorithms. But what many companies want to measure is max skill, max potential.
So they would actually much rather see you in a state where the material is fresh in your mind than in a state where you have the potential to understand it but have forgotten it.
Yeah. An interesting trend in the industry is companies being more upfront about what they're asking. Facebook, for example, has started providing a sort of interview prep class to everyone who applies.
It goes over the material. Part of me finds that encouraging, because it's moving in a better direction, but it's also discouraging, because it really sucks that you have to take a class.
I just wonder if it's filtering for a certain type of person, right? Someone who's like, "I don't know what to do"—well, if you apply here, you can take the class. Let me hold your hand the whole way through your life.
Right, yeah.
Okay, okay. So say I am going to interview at a bigger company—is there a way to prep to do well on the brain-teaser stuff?
Practice! Again, "brain teaser" gets thrown around describing these questions, but true brain teasers are pretty rare. Some companies have probably asked things like how many golf balls fit in a 747, but that's really very rare. Much more common is an application of a computer science idea to a practical problem.
There's still often a leap of insight required, and in many cases I think those are bad interview questions. A company should try very hard not to ask questions where there's one thing that has to be grasped, and until it's grasped, the problem feels impossible.
But practical applications of computer science topics represent a significant majority of the questions big companies ask.
So, when I was in college, I interviewed at one of those big management consulting firms, and they did have all those questions that people prep for. I didn't get the job—I think I did okay, except on the stupid, you know, ping-pong ball question.
Don't feel bad. One interesting thing we see is that the engineers who do the best at companies go on to pass about 80% of their interviews, but not 100%. Almost no one passes more than eight out of ten interviews at companies.
So one big bit of advice to everyone is: if you fail an interview, it's really not a referendum on your skill.
I'm very happy to have not gone in that direction.
So, what about the role of side projects—someone's portfolio? Are there certain types of side projects that are attractive to companies across the board? Or is it like: say I'm applying for a job at Stripe and I did some payments-type project—would that be more attractive to them?
So across the board, are there things that are interesting?
This may be an unwelcome fact, but I think it's the right call by companies: companies don't actually pay very much attention to side projects, except at the screening stage.
At the resume screen, a company has to decide whether to engage with the person at all. There's some adverse selection bias in who applies to companies, and there's this big stream of candidates.
So at any company there's this huge queue coming in, and they have to decide somehow. They do that based on resume screens, and it comes down to pretty dumb credential stuff: whether you've worked at a top company, whether you went to a top school, or, in some cases, whether you have an impressive project.
So an impressive project helps a lot there. But projects are very rarely given weight in the actual interview, and I think that's probably the right decision.
People who have side projects sometimes feel bad about this, but the reason it's the right decision is that most engineers don't have side projects. Most engineers have been working at a company, it's all proprietary code, and there's very little they can show.
You don't want to penalize people in that situation. And having a consistent, fair process is the first goal—the big problem with interviews is that the process is not applied consistently.
Once you've made it consistent, then you can optimize it. Having a side process where you look at projects introduces noise, and it's also just really hard to do. You can't tell if someone spent a weekend on a project or has been working on it for the last ten years.
We have literally seen both pretty regularly: candidates whose project was thrown together over a weekend, and candidates for whom it's been their abiding passion for the last ten years. Let alone telling who actually contributed what.
And things like code quality—it's actually startlingly hard to look at a big body of code and decide whether the programmer who wrote it is skilled.
So for all those reasons, side projects are mostly useful at the screening stage. If you're applying for jobs and being screened out at the resume stage, doing these projects probably helps.
And doing projects is a great way to increase your skills, which will be reflected in better performance in interviews.
But, I don’t think projects have a very big role in the actual interview.
So what else should I think about if I am being screened out? Say I'm getting a callback from one out of ten applications. What should I do?
Side projects, you know. Otherwise…
Yes, that's where projects help. It just sucks, right? It's not malice on the part of the companies—there are just so many applicants, so they use these crude filters. That's the big thing we're focused on: trying to figure out how to directly measure skill so that we don't have to rely on filters like where someone's worked or what school they went to.
And what about things like location? Let's say I live in Salt Lake City, and I'm interested in getting a job, possibly at Facebook. Should I put San Francisco on my resume and just fly out for an interview? Do you have general advice in that area?
Big companies don't care at all where you're based. They hire people by the hundreds every week.
Smaller companies do show a slight preference for local candidates, so if your goal is to work at a small startup, you're probably at a 10–20% advantage if you're based in the Bay Area.
Okay, cool. So from the company's side, there are a million different interview methods people go for. Say candidates go through Triplebyte, they get screened, and they're going to do an interview—whiteboarding, pair programming, all that stuff. How do you feel about it all?
Different formats can work. Let me give a bit of an overview here.
As I mentioned earlier, the core problem is this tension between the skill that can be measured in an interview—solving small problems quickly—and the skill that matters as a programmer: solving large problems over a long period of time.
So the first approach you can take in interviewing is just saying: okay, we're not going to do it; we're going to do trial employment or something like that.
And that totally works. If you work with someone for a week, you have a far better read on their skill than anyone can get during a three- to four-hour interview.
The problem is that there's a pretty strong bias in who's willing to do trial employment—an adverse bias. Many of the best programmers have lots of options, and if your company requires that everyone do a trial employment period, most of them are just going to say no.
And obviously, anyone who currently has a job or a family, or is far away, can't leave for a week. And of course, you're committing a week of your own time, so you need some filter before the trial employment anyway.
So in the end we're left with something like the famous line: democracy is the worst form of government except for all the others. Interviews are the worst way to evaluate engineers, except for all the other options.
It's fundamentally inaccurate, but you still have to do it, so the goal is to make it as accurate as possible.
Once you're on that page, we see two sources of noise. There's noise that comes from companies being inconsistent—we talked about that a bit earlier.
It's still too often the case that engineers are responsible for coming up with their own questions, which means you're asking every candidate different questions and then coming to a gut call. That's a far larger source of noise than anyone really realizes.
Pick any company that has that process: if you could somehow have their engineers re-interview their own colleagues in a blind fashion, they would likely have something like a 50% pass rate—half of the engineers already at the company would be screened out.
So the solution there is just to be really consistent. Make sure you're asking everyone the same questions, and make sure you're evaluating them in the same way.
I think that's more important than what you're actually asking. So the first step is to be consistent; the second step is to tweak it over time based on the results you see.
Once you're doing that, the other source of noise we see is companies looking for different things. As in the example earlier: one company looks only for super academic engineers, another for practical engineers.
You have companies that think all skilled engineers need to know how operating systems work. You have a company that only wants people with compiled-language experience; you have a company that hates compiled languages.
You have companies that want people who've used enterprise languages, and companies that count enterprise languages against you. Yeah.
And so I think the important thing is to untangle which of those are conscious decisions you're making about who you want to hire.
If you're a banking company and you need to be focused on QA processes and safe code, it probably makes sense to reject someone for being too much of a cowboy. If you're a social media company and your goal is to move really fast—
—maybe you decide to have a move-fast-and-break-things culture, and you want to hire the cowboy. Those are logical decisions. But very often companies are making these kinds of decisions almost by accident.
It takes some introspection: deciding, okay, here's who we want to hire, and then designing the process to look for those people. In your examples, whiteboard coding tends to skew toward the academic.
It tends to give preference to people who are really good at breaking their thoughts down in a structured, academic way and expressing them in a small amount of code; you often have people who are actually really productive, excellent programmers who look really bad in a whiteboard interview.
If you're not looking for academic skills, it probably makes more sense to put people on an actual computer and see how they work in a real environment.
Okay. My underlying question is: could you engineer the perfect interview? But I wonder, what does the interview for a job at Triplebyte look like? I imagine you designed it, right?
Yeah, okay. Well, first of all, all our candidates go through our regular process—we hire people out of our regular stream.
We compete against other companies for them—candidates interview with us as well as with the other companies on our platform—which is kind of fun. So first they go through the regular process, and from that we already have a pretty strong sense of where they stand in those areas.
And then, following my general advice, we've decided which skills we preference. I think we preference a couple of things.
Data analysis is pretty key to our business, so we preference people being comfortable and familiar talking and thinking about data. That skews a bit more academic, I think, than what many companies hire for.
And because we're in the business of evaluating engineers really broadly, we preference breadth of knowledge to a greater degree than most companies need to.
And what does that mean in practice? What questions would I be looking at?
Everyone first goes through our standard process, which gives us a pretty good read on the core things: productive programming, general knowledge of computer science, debugging, system design. Then we do an additional on-site follow-up with the candidate that goes much more in depth into data—or, if we're hiring for a different role (we sometimes hire folks who won't be working with data), into that area instead; for a front-end developer, much more in depth into front-end development.
So: here's a spec for a front-end project; you have two hours—build it. Or if they're going to be a back-end specialist, here's a back-end spec.
Okay, sure. And so, as an engineer, should I be paying attention to every new thing that's coming out? Is that going to matter when I'm interviewing, or should I only pay a medium amount of attention?
There's an interesting longer-term answer here. One class of people we see, interestingly, are people who faced that same decision ten years ago, chose not to keep up, and are now facing a serious change.
These folks are maybe still using, let's say, CGI; they don't understand modern web stacks, and they're in a weak position in interviews.
So to answer your question: day to day, it's not so important. Very few companies, even the smaller ones, directly evaluate flashy new tech.
However, if you make that decision today, don't keep up to date, and end up totally behind ten years from now, you probably are going to pay a price.
Yeah, I mean, especially if you're actually interested in starting your own thing at some point—being on the edge really, really matters.
Okay, maybe this is kind of difficult to answer, but I wonder about employee retention—engineer retention. Are there any qualities you can vet for? I think the average tenure is something like 18 months. Are there qualities that correlate with longer-term employment?
I don't have hard data on this, so this is going to be a little bit off the cuff.
Yeah, just the obvious things. Candidates who are excited about the mission and the actual company have a higher probability of staying than candidates who are chasing a nicer paycheck. Though there are counterexamples—sometimes awesome engineers are looking for a place to really commit—so I want to be fair.
This is all complicated, but I'll just say: look for engineers who are excited about the company and the job.
Okay, cool. So, kind of wrapping up: are there any books or resources that, if I'm an engineering manager about to run a bunch of interviews, I should really dig into and can get a lot out of?
I have not actually found any books that I think are very useful. Maybe that's arrogant, but interviewing is a field where it's so easy to say things that sound profound but are not true.
I truly believe that something like 80% of what's written about interviewing doesn't actually hold up.
For example—and it sounds like a really good idea, and a lot of engineers love to write it—there's the claim that interviews don't make any sense at all, and you should just look at the work someone has done in the past.
We tested this a bunch. We tried scoring engineers by having them talk about past projects, even going as far as a full hour going in depth into a project's technical details.
When we scored just that conversation, the ability to spin a tale ended up dominating actual engineering rigor, and it was far less predictive of job performance than giving them a relatively simple programming task.
And that kind of sucks—I don't really like that that's the case. You can find so many articles on the net saying it's stupid that we ask engineers to do these interviews; why don't we just have them talk about their past experience?
Yeah—but if you test it, it doesn't hold up.
And so, for the sake of keeping things standard: what do you tell people to do when they're conducting an interview?
Well, yeah: standardize. And you need to be really careful about giving people help inconsistently.
Certain candidates are a lot better at eliciting help—at getting the interviewer to help them without the interviewer necessarily realizing it. This is something we've battled with a bunch, actually, so we standardize the help we give. Again, we're doing thousands of interviews, so it's easier for us to do this.
We have a common decision tree of all the different ways a candidate can go and what help they are or aren't allowed to get. It's a big source of noise, and outside of doing a thousand interviews and standardizing—
—I'm not sure I have a really good fix for it. But be aware that some really charismatic candidates will do this.
Okay, so what's a common way I might ask for help without you even realizing that I'm getting help?
One is simply being brave enough to ask—being really friendly.
Another is saying something with confidence that's only sort of right, and then pausing. The interviewer's natural instinct is to jump in and correct the error.
As the interviewer, it's really easy to do that and not realize that you're, like, steering the person through the problem.
So if you're going out to interviews, you should do exactly that—there's the blog post on how to prep for an interview, right? I recommend learning to do it expertly.
Oh yeah! I want to add a side note to that, though—the negative side of it—which is that interviews can turn into hazing.
An interview isn't just evaluation; it's also this rite of entry into a company. Some companies have developed a culture around the interviews being hard and unpleasant, and as the interviewer, it's really easy to forget how much harder it is for the person answering the question.
It's so much easier to feel smart when you're asking the questions, and sometimes candidates get really flustered answering them. It's easy to get frustrated as the interviewer—the thing seems obvious, it's right in front of them, they're missing it, they're wasting your time—and you can get a bit angry inside.
It's just really important to stay away from the hazing—from taking that anger out on them. And I'm generally against cutting interviews short, too.
Except in the case where the candidate is in real pain, I think it's not worth doing. You save some time, but you damage your reputation.
It's basically just embarrassing for them.
Okay, so definitely staying away from the hazing, staying away from rudeness.
So does that just mean like crazy brain teasers? Does that mean like cutting them off in conversation? What does that mean?
Yeah, it means all those things. No crazy brain teasers; don't be mean—don't get slightly angry and aggressive in how you respond to their answers because you're frustrated by how poorly they're doing.
A trick that we use that I think helps: in the case where a candidate is totally failing the interview, flip a switch in your brain and go from evaluation mode to teaching mode.
Your full goal now is just to explain, as friendly and positive as you'd be with anyone else. This happens when the person has already essentially failed the problem, if not the interview.
Sure—you've already made the call in your head: this person is not passing. So you spend the remaining 15 minutes being friendly and explaining the answer to the question, rather than continuing to try to elicit an answer from them.
And what about the dynamics? Do you advise one-on-one interviews, or how many people per interviewee?
Multiple panelists definitely create more stress. We max out at two-to-one, and the second person is there because training is important.
If you're trying to keep the process consistent, you need consistency across interviewers, which means people need to watch each other's interviews. One interviewer plus one shadower is enough to do that.
You know, going beyond that increases the stress, and I don’t think it really helps.
Cool! So if people want to follow up with you and ask you questions, how can they reach out to you?
Sure! My email is ammon@triplebyte.com.
That's A-M-M-O-N.
Thanks, man!
Thank you!