The Sequel Show

Building a better future for interviewing and evaluating technical talent w/ David Corea, senior analytics manager at CoderPad

Episode Summary

This week’s episode is a great one if you or your team is focused on hiring data practitioners this year (like most teams). For EP 23, I had the chance to sit down with David Corea, senior analytics manager at CoderPad, a technical assessment platform aimed at improving how we interview technical candidates. While CoderPad isn’t specific to interviewing data practitioners, it’s a great example of one of the many ways the data industry can learn from the best practices of the software engineering world. In the episode, we look at how CoderPad uses data to improve its product, how we as interviewers can minimize interview stress, how to measure learning in an organization obsessed with progress, and the importance of the five whys.

Episode Notes

Some of our topic highlights include:

As always, I'd love to hear your thoughts on the episode over on Twitter @borisjabes.

Want to discuss the best practices we covered in this episode? Come hang out in The Operational Analytics Club, where all your favorite data leaders gather to sharpen their skills. 

Know someone that you think would be an awesome guest on The Show (hint: you can totally nominate yourself)? Reach out to our content and community team.

Resources:

Music by the talented Joe Stevens.

Episode Transcription

Boris Jabes, CEO and Co-founder, Census:
Hey folks, this is Boris Jabes and you're listening to the Sequel Show.

Boris Jabes, CEO and Co-founder, Census:
This week, we focused on a topic that's likely important to a lot of you, which is how to get better at hiring technical talent. I spoke with David Corea, the senior analytics manager at CoderPad, all about this. Their company builds a product that makes evaluating candidates easier, more collaborative, and more equitable. And we discuss things like why running technical interviews seems so difficult, how to interview for a data team, even what we should do as interviewers to help candidates. And of course, this is all about data. So we talked about what analytics teams should do to track and improve the hiring process, which I think is something that gets lost in a lot of our conversations about how data teams can help the company. So this was a really fun conversation. I hope you enjoy it as much as I did.

Boris Jabes, CEO and Co-founder, Census:
Hey, David, nice to see you. Why don't you tell folks who are listening kind of the short story, who you are, what you do, where you're at?

David Corea, Sr. Analytics Manager, CoderPad:
Yeah. Okay. Cool. Well, I'm David and I'm the senior analytics manager at CoderPad. I live and breathe technical interviews and how to make them fun and modern and a generally pleasant experience. And I live and breathe data all day every day. I mean, I guess that's why we're talking, at the end of the day, isn't it?

Boris Jabes, CEO and Co-founder, Census:
Sure. You've already made me think of two immediate questions. So what is a technical interview? What does that encompass?

David Corea, Sr. Analytics Manager, CoderPad:
Yeah, that's a great question to lead with too. Imagine you're interviewing somebody and you want to get a good assessment of technical skills, and we focus on developers. When I articulate technical interviewing today, we think about: can this person solve the problems that they claim they can solve, in whatever language matters to the business or is meant for this role? And traditionally, that question tends to get answered with, "Here's how I would do it," and it's a description. And in the last 10 years, it's evolved to the whiteboard interview, where someone will write up something in whiteboard marker and then it's just back and forth, chatting over a whiteboard. But neither of those experiences that I've described actually captures what day to day work looks like for a developer.

David Corea, Sr. Analytics Manager, CoderPad:
And so when I say technical interviewing, I mean: can we assess, or can anyone genuinely assess, what a person's hard skills will actually look like in day to day work? And what I live and breathe every day, and what the rest of us at CoderPad think about, is how can we best support making that easy for an interviewer and fun for someone who's actually going through that interview.

Boris Jabes, CEO and Co-founder, Census:
Wow. That's no small task, how to make an interview fun. I like that that's a core goal of CoderPad, though. That's nice.

David Corea, Sr. Analytics Manager, CoderPad:
We like it. And I guess it's part of the culture. You have to want to make interviews not be scary, and feel very much like what day to day life is like for a dev. Or for someone generally in software, is probably what I should say, because we also support languages that maybe aren't just for engineers: there's Python, there's Haskell, but there's also SQL, and that could be a marketer.

Boris Jabes, CEO and Co-founder, Census:
Hold on, wait, are you telling me you put non-engineer titled people through CoderPad interviews, for example, at CoderPad? If you hired an analytics engineer, are you putting them through a CoderPad interview?

David Corea, Sr. Analytics Manager, CoderPad:
We could. And granted, at CoderPad, we would, and we do see customers of ours do the same thing. Arguably, I would encourage folks to consider not putting an analytics engineer, to use that example, through the whiteboard interview, because what are they doing on a day to day basis? They're writing SQL, and they're using some language or some tool to make data go from point A to point B in some meaningful way that supports the business or supports stakeholders. And the whiteboard interview, the casual conversation, doesn't demonstrate that the person going through the interview can actually do that. So the intention is: open up a pad, pick SQL, and then, without bias, you can see right then and there, can this person do it?

Boris Jabes, CEO and Co-founder, Census:
Yeah. Do you think the issue is that people don't use this approach or they do it, but they do it wrong? That most companies, when they try to interview people that have, as you pointed out hard skill requirements, let's call it software engineer, analytics engineer, something with engineering in the title for now let's say, are they not doing hard skill tests? Or are they doing them poorly?

David Corea, Sr. Analytics Manager, CoderPad:
Well, I don't like to think of it in negative terms; they could be doing it better. How about that? And that's genuinely how I feel, because you can assess someone on, can you write good SQL or can you write good JavaScript, by forcing someone to write up something on a whiteboard or a piece of paper, but that's not what the day to day is like. And in fact, that's just not what the workflow is like. So why not write code on a keyboard and talk about it together in a mutual environment, where the code is in front of me and: can it solve the problem? Yes.

David Corea, Sr. Analytics Manager, CoderPad:
So are they doing it wrong? Well, that depends on your perspective of what is correct and what is wrong. Mine is: if you're forcing a developer who normally works in an editor, where they write code with a keyboard, to demonstrate that ability with completely different tools in an already stressful environment, could you be doing a better job of assessing this person? That's on the interviewer side. And on the candidate side, that's not a recipe for making someone feel more comfortable. So I would argue that keeping the developer in an environment that's familiar...

Boris Jabes, CEO and Co-founder, Census:
Yeah.

David Corea, Sr. Analytics Manager, CoderPad:
Empowers that person to show off how good they really are.

Boris Jabes, CEO and Co-founder, Census:
Yeah. I think your point about interviews being kind of adversarial, when that doesn't help anyone. It doesn't help the company either. Unless you're the kind of company that thrives on adversarial culture, which I'm sure there are a few companies for whom that is actually core, maybe super internally competitive companies. But it's kind of a tragedy of the commons, I would say, that interviewers and interviewees are adversarial, because neither of them wants that. Everyone wants you to do your best work. I'm trying to see you at your best, not at your worst. That's really not my goal. I mean, again, maybe in one interview you want to test out how they do under terrible pressure, but I'm kind of curious: what have you seen that causes an interview to be more fun, less difficult? So I think you said one thing, which is if I get to use the tools I'm comfortable with, I'm going to be more comfortable. I think that's fair. I think I agree with that. Is there anything else?

David Corea, Sr. Analytics Manager, CoderPad:
That's the big one for me. Well, it's funny, because you use the word adversarial and that gets my gears turning. Would I be best friends with the person I interview? Would I want to hang out with the person after the fact? And in practicality, every single interview typically involves a hiring manager or some colleague, or someone who could be a colleague, and then there's me, or there's the candidate. And the goal is, it's a two way street. The candidate's also assessing: would I want to work with these people? And I think the point you make is a good one. If an interview is not a comfortable environment, we're not setting up either the interviewer or the candidate to make a good call about whether or not this is genuinely a place where they could do their best work. It's just very much in the nature of an interview. It's a bit of a stressful situation.

Boris Jabes, CEO and Co-founder, Census:
Exactly.

David Corea, Sr. Analytics Manager, CoderPad:
So what have I seen? Well with CoderPad, we don't see the interviews, but we-

Boris Jabes, CEO and Co-founder, Census:
You mean, wait, CoderPad's not just getting all the best engineers who interview at other places and sending them over. That's not part of the system? Kidding, kidding, kidding. Just kidding. Just kidding.

David Corea, Sr. Analytics Manager, CoderPad:
I think what we hear from customers that do use us, and through the surveys that we get, is that candidates are genuinely happier. They rate the experience as having been something that they liked and enjoyed. And interviewers are happy for a couple of reasons, but the one that comes to mind is that interview prep is easier, and I'd love to talk about that. And two, it generally just makes it feel less scary when there's something common between the candidate and the interviewer, which is the editor, if it's a live interview, because there's also the idea of a take home project. But in the live sense, where there are two people actively talking back and forth and reviewing, it's not something that's strange. That's what day to day work is like.

David Corea, Sr. Analytics Manager, CoderPad:
And so it becomes easier. Again, through what I hear on the grapevine and what we learn from how folks are just using the product, folks tend to be more content when the interview is more comfortable, and that's the goal. We want to make it so it's easier to assess: can this candidate do the job? And then for the candidate: do I want to work with these people?

Boris Jabes, CEO and Co-founder, Census:
Yeah. So we do something very unusual here when it comes to the final stages of assessing fit, which is we actually do a half day hackathon project with the prospect, with the candidate, in which we put two full-time engineers to work with them. So we're putting in more human hours than they are, in a way. And the goal is to be as collaborative as possible. It's really intended to give them a day in the life. It's the closest we have come to making you feel like you're in the office. And even then, when I ask candidates afterwards, they always say it was really cool, it was really unlike any other company. But they're still kind of stressed, because they're like, I'm being interviewed.

Boris Jabes, CEO and Co-founder, Census:
And really the goal, what I'm trying to get across, is: you already passed the technical bar. This is about just working together. And it is a two way street. We're trying to show you what we're like, because you can't hide in a four hour project. You really can't hide who you are. You can try, but we can't hide, and usually the candidate can't hide either. And that's kind of the goal. But I'm very into this idea of, what can I say, how do we prep the interviewee to understand that we're on the same side here? We're really trying to evaluate fit. And it's not about... If we don't fit, it's not because you're not good. At this point, it's really like a vibe: do we enjoy working together? Are we going to be, like you said, friends, et cetera. And yeah, I want to put them at ease as much as possible, but I don't know how to necessarily do that. You can explain all these things, but people are still stressed out.

David Corea, Sr. Analytics Manager, CoderPad:
It's a fact. I mean, the goal is to minimize the stress, and I'm willing to bet that as the four hour process you described goes on, by the tail end of it the candidate is probably feeling a whole heck of a lot better, to be near the end of a hackathon project, and might actually be bonding at that point. It's impossible to remove stress from the interview process, but I actually love what you described, because in the beginning, yeah, the stress levels might be where they are, but as you get towards the end of the process, I'm willing to bet that the group of folks is talking about grabbing lunch at the end of it, or there are high fives being thrown around.

Boris Jabes, CEO and Co-founder, Census:
They're starting to think about what this project could become someday in the future. And it's like, yeah. Oh yeah.

David Corea, Sr. Analytics Manager, CoderPad:
So I actually love it. The time invested is amazing. The question that we try to answer is: how do we minimize stress at scale? And we have some thoughts. Our product is built around what we hear from customers, and the features that are in CoderPad come from ideas like what you just shared. These are the conversations we love having; for our product team, this is what the day to day is like. And so questions that can be shared across an organization and get used over and over create this notion that we're mitigating how biased we might be with our own questions that we might ask, because we're all sharing and thinking about the same kinds of problems. Or, in the example of an analytics engineer, again, if all companies did the let's-share-half-a-day approach, that would be amazing.

David Corea, Sr. Analytics Manager, CoderPad:
I'm not going to say that. That would be fun; at the end of the day, relationships have been built. But in the example of the analytics engineer, having a common set of databases that engineer has access to helps to mitigate bias across the different interviews that you'll give to different candidates, because it's the same question. And the assessment becomes almost: did this person write SQL or Python or R or whatever when they fetched this data, queried it, and produced some insight? Are they similar? Did they arrive at the answer in the same way that the other candidates did? It becomes less about did they answer it the way I would want, and more about how did other people compare, how did people think, and so on.

Boris Jabes, CEO and Co-founder, Census:
Completely agree. Over the years of interviewing, I've found you definitely want to repeat questions, because then you find benchmarks; you're able to start to calibrate. I mean, my co-founders have been using a screening interview question for over a decade, and so the calibration they have on it is so high. They've been using the same question across three companies. It's like, they just know: this is a P90 answer.

Boris Jabes, CEO and Co-founder, Census:
But you also want it to be, to your point, I find you don't want it to be too closely tied to your company's domain knowledge. Because I've rarely found a candidate who can match us because we've been spending all day every day for years thinking about it, and they're from the outside. So I think this is where you get those interview questions that are much more abstract, like let's design a coffee machine or things like that where we're on a shared ground and it's not like I have knowledge that you don't, that puts you at a disadvantage.

David Corea, Sr. Analytics Manager, CoderPad:
It's a fair point.

Boris Jabes, CEO and Co-founder, Census:
That database the analytics engineer uses should not be full of CoderPad-isms. It should probably be something somewhat generally understandable.

David Corea, Sr. Analytics Manager, CoderPad:
So yeah. I mean, just again, full transparency, we try to make that easy for folks. The suggestions that we offer are sample questions and sample databases. In the world of SQL land, there's the classic employees database; in the world of machine learning land, there's the Iris data set. So there are the common things that, if you're in the field, you've probably seen something related to.
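For listeners who haven't seen this style of question, here's a minimal sketch of what a SQL screening exercise against a tiny "employees" database might look like. The schema, the rows, and the question ("highest-paid employee per department") are illustrative assumptions, not CoderPad's actual sample database; SQLite is used here only so the snippet is self-contained.

```python
import sqlite3

# Hypothetical miniature "employees" table, in the spirit of the classic
# sample database mentioned above (names and schema are made up).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE employees (
    id         INTEGER PRIMARY KEY,
    name       TEXT,
    department TEXT,
    salary     INTEGER
);
INSERT INTO employees (name, department, salary) VALUES
    ('Ada',   'Engineering', 120),
    ('Grace', 'Engineering', 140),
    ('Edgar', 'Marketing',    90),
    ('Joan',  'Marketing',   110);
""")

# A typical screening question: who is the highest-paid employee
# in each department? A correlated subquery is one standard answer.
rows = conn.execute("""
    SELECT e.department, e.name, e.salary
    FROM employees e
    WHERE e.salary = (SELECT MAX(salary)
                      FROM employees
                      WHERE department = e.department)
    ORDER BY e.department
""").fetchall()

print(rows)
# → [('Engineering', 'Grace', 140), ('Marketing', 'Joan', 110)]
```

Because every candidate sees the same data and the same question, their answers become directly comparable, which is exactly the bias-mitigation point made above.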

Boris Jabes, CEO and Co-founder, Census:
That's nice. You can benchmark industry wide in some way. That's kind of a neat aspect as well too.

David Corea, Sr. Analytics Manager, CoderPad:
That's kind of my aspiration for how we could use our data here. I would love to be able to, at a macro level, start talking about, oh, here's what hiring has looked like over the past five years and the trends that we're seeing, and that's coming. So on the subject of the rich data set that we have at CoderPad, it does get to be a very cool thing to talk about.

Boris Jabes, CEO and Co-founder, Census:
Yeah. You must have some fun data points. I'm going to make one up and you're going to tell me, I hope you have this or that you have it at your fingertips, which is what percentage of interviews do not end, or end abruptly?

David Corea, Sr. Analytics Manager, CoderPad:
Ooh, that is a good question. I don't have that at my fingertips. And when you say end abruptly, as in it's shorter than expected?

Boris Jabes, CEO and Co-founder, Census:
Yeah. I feel like the "did not complete" would be a really good, hard metric that something went very wrong. The "did not complete" rate should be near zero.

David Corea, Sr. Analytics Manager, CoderPad:
You hope.

Boris Jabes, CEO and Co-founder, Census:
You would hope. You'd hope.

David Corea, Sr. Analytics Manager, CoderPad:
Yeah. I hope that's as close to zero as possible, but I'm certain that it exists. I mean, and this is just my heartstrings talking, if you have an interviewer and a candidate and they're engaging in a conversation, it would be terrible if one or the other ended it early. But you know what, though? I haven't looked at it. I'm confident it's probably happened. I mean, there are horror stories with interviews.

Boris Jabes, CEO and Co-founder, Census:
Yeah, the law of large numbers. Like you said, you're working at scale here, so you'll see all variants have occurred. Well, okay. Then what are some data points people should be gathering? As someone who runs interviews, setting CoderPad aside for a second, what are analytics everyone should have in terms of their hiring process?

David Corea, Sr. Analytics Manager, CoderPad:
The thing that comes to mind for me is: how effective is a question that we're asking at assessing whether a candidate is up to snuff or not? I think of the famous story, this has actually been on my mind recently, that Google used to ask riddles and brain teasers and-

Boris Jabes, CEO and Co-founder, Census:
Yeah. So did Microsoft, where I started my career.

David Corea, Sr. Analytics Manager, CoderPad:
There you go. And famously, there was a lot of PR around "We will no longer do this," because we learned it doesn't help assess how skilled a candidate is at their job. And I think, again, if we could support at scale giving folks that insight, here's a bunch of questions that you've been asking and here's how candidates are rating them, off the top of my head, that's a data point that could be empowering across the board.

Boris Jabes, CEO and Co-founder, Census:
I mean, you talk about ML. In some ways, if a company is large enough, or maybe you can do cross company knowledge, but maybe not, you'd want to know if a question that people succeed at is actually a poor indicator of future performance, or vice versa: a question they did poorly on was a poor indicator of their future good performance. Because the ultimate metric is, are they a successful member of our team? And was this question informative of that? So some longitudinal data might be very interesting.

David Corea, Sr. Analytics Manager, CoderPad:
I mean, yeah, you're hitting the nail on the head. The idea is, there is some knowledge: if you're using the product, what did you ask candidate X, and then was candidate X ultimately successful? And again, because we have that at scale, that's going to become something that we'll be excited to explore in the future. But yeah, that's the sort of stuff we hear from customers, in fact: it would be great to know this.

Boris Jabes, CEO and Co-founder, Census:
Got it. So people, they ask interview questions and they're genuinely like, I don't know if this is a good interview question, basically?

David Corea, Sr. Analytics Manager, CoderPad:
It's a common theme actually. And I'm saying this a lot, but a lot of what drives what happens at CoderPad comes from people that are interviewing with CoderPad saying, I would love to learn more about what I'm doing right and what I'm doing wrong. Or, it would help me out if I had this library to play with, this software tool to play with. And a lot of that is what drives what we do next. So on the subject of being proactive with our data, certainly that's a guiding light, but at the end of the day, if it means nothing to customers, it'll become something operational we'll talk about internally. I'm over here talking about how I think it would be great to be able to rate how effective a question is, as an example. But if customers don't want it, then what's the point? Now, I'm giving you an example of something that I know people like, but-

Boris Jabes, CEO and Co-founder, Census:
I mean, I think you're setting yourself a very high bar. That's a question everyone wants to know, but that's a hard question. You're starting with something difficult. I like the kind of ambition you have running the data team there, to answer questions of that magnitude. But there's got to be... what are the run of the mill responsibilities of the data organization at CoderPad? What are the KPIs that you're tracking?

David Corea, Sr. Analytics Manager, CoderPad:
Sure. I mean, are people interviewing? And how are they rating the experience? So the basics.

Boris Jabes, CEO and Co-founder, Census:
Yeah.

David Corea, Sr. Analytics Manager, CoderPad:
The basics: how often is the product being used, and by whom, and are people happy? Those are super key metrics that anyone at CoderPad should be able to answer. And that's a regular conversation, whether it's at all hands or operational meetings. So, a much more run of the mill question. Granted, we like the hard questions too, and I like to think we're an aspirational and optimistic company. If we can do things that bring a little bit of delight to both the interviewer and the candidate, and make interviews a little bit less scary, then we're doing an amazing job.

Boris Jabes, CEO and Co-founder, Census:
Well, yeah, I agree with you, but I think even there you're underselling it. It's not just the optimism. I think, as a data professional, for people who come to work for you in the role of analytics engineer, data engineer, analyst, you name it, what's the motivation? What's the motivation to go in house versus a consultancy versus anything? And I think the idea that there are questions that the data team will eventually be able to answer that are fundamentally unanswered in the industry is super exciting. And I would be motivated to spend time on the team, even if, of course, day to day you have all sorts of other KPIs to compute. But to me, I am always heartened and disheartened, it's like a weird combo, that the world is nowhere near as sophisticated as we think. We've been hiring and interviewing and building the software industry for 40 years, and speaking to you, it's like, yeah, we don't really even know if that question is a good question.

David Corea, Sr. Analytics Manager, CoderPad:
It's a fact. I mean, that's kind of the fun and what makes the job difficult, but that's the nature of any scientific pursuit. You don't know if it's going to come out giving you something awesome and usable, the classic "actionable insight," or if it's a dud. But that's half the job: understanding what questions we can tackle.

Boris Jabes, CEO and Co-founder, Census:
By the way, that's a really interesting meta metric for a data organization: if every piece of insight you derive is actionable, you're probably not trying enough. You should be generating duds. If every swing you take is always a hit, then you're probably not taking enough swings, is the idea. Yeah. I hadn't thought about that until you just said it, but sometimes it's exploration. It's a bit of a scientific process, and we might uncover interesting truths. We might not.

David Corea, Sr. Analytics Manager, CoderPad:
That's been a common theme across different jobs I've been in, because I've been fortunate enough to sit on data teams at different classic Silicon Valley startups. And the common theme that you make me think about is: don't aim for an A+. Do well at your job, but make mistakes. And do research, carve out time to see if you can learn something. Even if it doesn't lead to anything actionable, learn something, and the worst thing that's going to happen is you've gained some knowledge that we can't use today.

Boris Jabes, CEO and Co-founder, Census:
Yeah, there are a lot of ways you can start to internalize, what's that? There's that famous Edison line, it's like 99 tries to get the light bulb, but all those failures are on the path toward the success. And I think the hard part as leaders, whether it's team leaders or company leaders, is how to foster that meandering process that may or may not be a constant set of hits. It's difficult. It's not an easy thing to do, but I think it's the correct mindset.

Boris Jabes, CEO and Co-founder, Census:
On the scientific process, and I think we've talked about this before in some of the conversations we've had: is it important for people on your team to understand, let's call it hypothesis testing? Should everyone be versed in some basics of experimentation, kind of the theory of an experiment?

David Corea, Sr. Analytics Manager, CoderPad:
I mean, to some extent I would hope so. As a person working in industry and not academia, I am motivated to practice some level of rigor, but I'm also motivated to do the minimum amount of work possible to get to an actionable insight that's helpful.

Boris Jabes, CEO and Co-founder, Census:
Sure.

David Corea, Sr. Analytics Manager, CoderPad:
Do what's appropriate, but don't spend three hours trying to answer how many users we have. That would be an enormous waste of time; we should know that with very little effort, to use a kind of convoluted example. But there's a balance. I would think data people, scientists, analysts, engineers, should be versed, to some extent, in thinking about: how can I be rigorous and follow some scientific method, to use what you're alluding to, to get to an answer? But at the same time, how can I quickly get to an answer that is actionable and meaningful, and is enough to empower the person that I'm trying to answer this question for?

David Corea, Sr. Analytics Manager, CoderPad:
So if the purpose is pure research, like maybe I'm an ML engineer, then gosh, your job is to be rigorous and scientific, and lean more academic than industrial, to use those two terms. But at the same time, if you are a data analyst or a BI analyst, your job is not to practice every last bit of statistical rigor that could maybe get you to the exact same result as if you were to just look at some trends over time. I lean toward the question and the context for the business being what should motivate that.

Boris Jabes, CEO and Co-founder, Census:
Yeah. I mean, I'm sure everyone on your team, based on how you're talking about it, is probably motivated by the core mission of the company: to make interviews better. And so hopefully you're keeping that in mind as you answer questions or search for insights. And you should have that in mind. I'm with you.

David Corea, Sr. Analytics Manager, CoderPad:
Yeah. And the thing that I encourage folks on my team to do, and what I hear from everyone else, is: have context for the business; understand why the question was asked. And I hope data teams all over do this. If you're just doing research for the sake of research, it's probably more an academic pursuit than it is helping the business move forward. And if that's the case, what are you doing working on a data team, if not to help the business be amazing?

Boris Jabes, CEO and Co-founder, Census:
I think the reason I bring it up is this: it's almost in the name of the role. Engineering tends to be, you're building; you're dealing with constraints, there are trade offs, but you're building systems. It's not an exploration. Engineering a bridge, to use that analogy, is strictly a matter of: what material do we need, what shape does it need to be, how do we do it, how do we put it in?

Boris Jabes, CEO and Co-founder, Census:
Whereas in data science, it says it right there in the name: it's a hypothesis exploration. If we were to do this to the app, would something change? And the answer might be nope. And so it's different in kind, and that's why I think it's different in how you would manage it. Because with engineers, you can really focus on outcomes, and even with an analytics engineer, you can almost say, did the model get built? Did the data flow? These are almost guaranteed to be achievable things. But data science or analysis, as you said, might hopefully produce actionable insights, but might not. And you have to manage that.

David Corea, Sr. Analytics Manager, CoderPad:
Yeah. That's kind of the challenge of the role, isn't it? But to our earlier point, if it ends up being a dud, that's okay. It's just a matter of fostering that it's okay in the culture. Granted, again, context is everything. I could imagine a case where a PM or any business stakeholder has made a gamble and has spent company money on a campaign, or on changing the app in some maybe non-trivial way, or the website. Like the shopping cart, if you're an e-com shop, that's a sacred thing. All companies should be playing with it, but if you're making a real change, then that merits a little bit of effort to have some understanding. So again, context is everything.

Boris Jabes, CEO and Co-founder, Census:
You're absolutely right. I mean, this raises the question of how one gains the right amount of context. Because someone might say, "Hey, I want to know X," but you don't realize, oh yeah, this is some very urgent, very important thing tied to this that, if we touch it, is going to go wrong. Versus, "I was just curious." Or you might be getting a question that is sufficiently removed from the shopping cart that you might not even realize it's tied to, we're about to make a product decision about the shopping cart that's going to potentially shake the whole company. And you're an analyst just going, I just answered the question; I didn't think about the repercussions. How should one get that context, especially given the way companies are organized? And I'm even curious, in your case, what teams and people report to you, and how do they report to the rest of the company? How do you ask the five whys? How do you get to the context?

David Corea, Sr. Analytics Manager, CoderPad:
Yeah. Well, I'll tell you what I do. I don't know that I know the secret sauce, but at 10,000 feet, the answer is to have some kind of alignment with the person that's forming the question. Understand: what's the nature of the question? What's the real goal here? So to be less abstract, a PM wants to change something on the website. We'll go back to the shopping cart example. I want to change the button from blue to orange, whatever. Convoluted, but again, like it's-

Boris Jabes, CEO and Co-founder, Census:
We all live in Google's shadow that the color changes, change everything, yeah.

David Corea, Sr. Analytics Manager, CoderPad:
Right. But yeah, arbitrary example, but it's going to be motivated by some PM saying, I think this will impact the business in some positive way and I'm going to gamble company resources and time. An educated gamble, but that is what it is at the end of the day. This might pay off into nothing. And the question that's often asked is: did this help, did this do harm, did this not do any harm? And that's kind of it. So having the context for the PM's concern, the PM's goals and the impact to the business provides context to the person on the data team that's trying to help with building that insight of: it did no harm, you get to make the call. Or: it absolutely tanked your conversion rate, so sorry, but it's got to be blue instead of orange. To again stick with my arbitrary, silly example.
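
The "did this help, did this do harm, did this do no harm" check David describes is, at its simplest, a two-proportion comparison between the control and variant arms of the test. A minimal sketch; the numbers, function name, and threshold here are illustrative, not from the episode:

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Compare conversion rates of control (A, blue) and variant (B, orange).

    Returns (rate difference, z-statistic); |z| above ~1.96 suggests the
    difference is unlikely to be noise at the 95% confidence level.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both arms convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return p_b - p_a, (p_b - p_a) / se

# 480/10,000 conversions on blue vs. 430/10,000 on orange (made-up numbers).
diff, z = two_proportion_ztest(480, 10_000, 430, 10_000)
# A clearly negative diff with a large |z| is the "sorry, it's got to be blue"
# verdict; a z near zero is "did no harm, you get to make the call".
```

In practice a data team would more likely reach for a library routine such as `statsmodels.stats.proportion.proportions_ztest` than hand-roll this, but the logic is the same.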

David Corea, Sr. Analytics Manager, CoderPad:
But how do you go about doing that? Every team's a little bit different, and I won't say that I've got a secret sauce for it, folks, but the truth is it comes down to communicating and listening at the end of the day. And being good in some way at that. Active listening, in the sense that I have attended some meetings, or I've had enough one-on-ones with the stakeholder that made this change and is claiming that this A/B test is going to lead to something positive, that I know what their concern is. I know what their goal is, because I've been in these conversations and I've heard it. I could say it back to you if I wanted: this is what that person's goal is. But that's hard to do. Why? Because we're humans.

Boris Jabes, CEO and Co-founder, Census:
Yeah. Humans specify things very poorly.

David Corea, Sr. Analytics Manager, CoderPad:
Yeah. Like me thinking about the way that I'm going to articulate my next sentence to you. It may sound different coming out of my mouth than it did in my head, and that's happening 24/7 if I'm talking. And that's not unique to me. The thing we just have to do our best with is come up with some model, and different companies do it in different ways, to try to control for the fact that we're human and we make mistakes when we listen and when we talk. So what do I mean by the model? Data teams can be embedded, where you've got someone that's dedicated as a data resource, but they live in engineering land or in algorithms land or in marketing land. Or you have that classic hub and spoke, where there's a centralized data team and there are people that go out and work with someone on a project, but come back, and they're a resource that's not guaranteed to that pod or team, whatever that might be. Those are the two that come off the top of my head.

Boris Jabes, CEO and Co-founder, Census:
I was going to say, the thing that you said almost crosscuts that. Given your point that English, or whatever language people speak, is imperfect, and that people's ability to express what they want is imperfect at an even deeper philosophical level, is there an onus on the data team to help kind of Socratically extract what you want? Who's responsible for explaining themselves better? Is it the PM who wants it, the marketer who wants it? Or is it the person on your team, and you go like, no, you have to figure out how to get them to say the right words. You have to extract that from them.

David Corea, Sr. Analytics Manager, CoderPad:
Yeah. This is where alignment becomes so important. I do think it falls on both parties, but I think there is an onus on, especially if it's an analyst working with business stakeholders. Understand the context of the question. I could hear something like, I want to know what user growth looked like after we changed the button color or what the conversion rate was after we changed the button color. But the real question underlying might be, I want to know if I'm hurting company metrics somehow. I want to know if this is improving company metrics somehow and conversion rate might have been what was articulated, but maybe there's also a concern that other people will have around units per transaction.

Boris Jabes, CEO and Co-founder, Census:
Yeah. All the second order effects. Exactly.

David Corea, Sr. Analytics Manager, CoderPad:
And so I think that onus does fall on everybody, but naturally the people, different stakeholders, will care about different things. There will be the people that care about, again, I'm sticking with e-comm and I work at a SaaS company [inaudible 00:34:06].

Boris Jabes, CEO and Co-founder, Census:
It's getting where it's safe for both of us, because neither of us works in e-comm. So we can kind of paint a picture of it.

David Corea, Sr. Analytics Manager, CoderPad:
There we go. I'm going to harp on e-com for a second, but there is some onus on the person on the data team, whether that's the analyst, the engineer, whatever, to ask probing questions, to ensure that there is as much context as there can be for what the underlying concerns really are. And again, in my arbitrary example, it might not just be one metric that the business is concerned about. It might have just been what the PM asks. And so, yeah.

Boris Jabes, CEO and Co-founder, Census:
I think I agree with you, by the way. I think if you have many owners, you have no owner. So yes, alignment is a team sport, but I think we as data practitioners, let's call it broadly, or data teams or data leaders, understand the need for precise language, precise definitions, better than the stakeholders. It's just a sad reality. And so I think where I would land, based on what you said, is we should paraphrase what the stakeholders are asking for in more precise language, so that people go, yes, that is what I meant, that is a correct interpretation, and you've tightened it up. And usually adding that precision is work. But asking the marketer, "No, but what do you mean? How did this affect the last seven days?" They might not even realize that, well, the shopping cart improved, but retention worsened.

David Corea, Sr. Analytics Manager, CoderPad:
Yeah. Right.

Boris Jabes, CEO and Co-founder, Census:
That might not even cross their mind. And so I think that to me is the unique advantage, purview and responsibility of the data org. We have a much broader view, both because we understand, let's say, the statistics side of it, but also the business. You have PM stakeholders, you have marketer stakeholders. They're not each other's stakeholders. In some ways it always comes back to the data team.

David Corea, Sr. Analytics Manager, CoderPad:
Right. And oftentimes as a result of that, and I love that you said it that way, because different people will own different metrics, but that doesn't mean that's the only thing the data team should be accountable for reporting and understanding. And so again, to your point, if shopping cart net conversion went up but retention tanked, and you have a high powered analyst, someone that's involved in the business and has some understanding of what the business' concerns are, that should almost be a thing that automatically pops up in conversation. Like, oh, okay, yeah, well, we should look at all these other metrics too. Philosophically, I do believe that's a good thing. That level of being embedded, and that knowledge a data person should have about the business, that's the mark of a good data professional.
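
The "conversion went up but retention tanked" scenario is what experimenters usually call guardrail metrics: alongside the stakeholder's primary metric, the data team watches a fixed set of second-order metrics for regressions. A minimal sketch; the metric names and thresholds are made up for illustration:

```python
def check_guardrails(results, guardrails):
    """Return the guardrail metrics that regressed past their threshold.

    results    maps metric name -> observed relative change (+0.04 = +4%).
    guardrails maps metric name -> the most negative change we tolerate.
    """
    return [name for name, floor in guardrails.items()
            if results.get(name, 0.0) < floor]

experiment = {"cart_conversion": +0.04, "retention_d30": -0.06, "units_per_txn": -0.01}
breached = check_guardrails(experiment, {"retention_d30": -0.02, "units_per_txn": -0.05})
# breached == ["retention_d30"]: the primary metric improved, but retention
# fell past its guardrail, so the change needs a second look before shipping.
```

The design point is that the guardrail list is owned by the data team, not by whichever PM happens to be running the experiment, which is exactly the broader view discussed above.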

Boris Jabes, CEO and Co-founder, Census:
That's interesting. I mean, you and I were talking about this a bit earlier, but there's a human element to that. So you're embedded, you're in the team, but kind of not in the team, because your manager's not the same manager, or whatever. And we live in a hard enough world in terms of how to bond as people in these last two years. So what are, let's call it tips and tricks, that you've seen work for embedded analysts to kind of get tighter with their colleagues? Because they don't technically share a manager. They know you're embedded. They know you're that person from the other team who's just here to help us.

David Corea, Sr. Analytics Manager, CoderPad:
Yeah. It's funny, it varies by company. The reason I say that is because we were also talking earlier about how CoderPad is a distributed company. And so-

Boris Jabes, CEO and Co-founder, Census:
From the beginning.

David Corea, Sr. Analytics Manager, CoderPad:
From the beginning, and so the casual shoulder tap, "Hey, let's go get pizza," that's not a thing if you're geographically distributed. We've got folks in France, we've got folks, quite literally, all over the United States, and that takes a different level of effort than if we're in the office and I tap someone on the shoulder. So in, we'll call it the traditional situation, where people are in the office, face time is a big deal. So I often say the tip and trick is: relationship management is part of the job.

Boris Jabes, CEO and Co-founder, Census:
So true. So true.

David Corea, Sr. Analytics Manager, CoderPad:
You're going to have a very hard time as a data person without trust, without a relationship where a stakeholder, say the marketing person, trusts you. And that can happen. Enough mistakes can get made that you need to manage that relationship.

David Corea, Sr. Analytics Manager, CoderPad:
So how do you do that? It depends across the board, but I am a huge proponent of the casual, let's take 30 minutes and just hang out and get lunch, and that's not work related. That is something totally off the grid. You're not necessarily doing work, but just the casual, I want you to know how I think, I want you to know how I operate, goes a long way when the pressure comes on and there's a need to understand a metric and talk about something. So one, I like just informal meetings. And then I like a lot of, "Hey, let's do some working sessions together. I want you to know that I care about this and I want to do my best to help you."

David Corea, Sr. Analytics Manager, CoderPad:
So what does that boil down to? I guess it's just meetings, and I don't encourage extra meetings, but at the same time, there is value to collaborative working sessions. Maybe that's the better way to put it, not to use more buzzwords. But it's going to be the case that a marketer, or someone on the product team, or someone on the engineering team, will build trust if there is mutual time spent understanding concerns and working through problems together, and I'm a huge proponent of that.

Boris Jabes, CEO and Co-founder, Census:
I mean, the way I think about it based on what you said is, within a discipline, say two engineers who need to collaborate, we have a shared, almost physical artifact: the code, the commit, the PR. And so that's an unfair advantage within a single discipline, to collaborate where there's a piece of shared context that does not require a meeting. Because we're both iterating on the same, effectively physical, thing that we both have the same understanding of, because two senior engineers understand code. That's just true, unless something's gone very wrong in your hiring process. But in the examples that you're talking about, it's two different disciplines. And so there's no shared substrate. Someone says, I want X, and apart from those words, that thing doesn't exist.

Boris Jabes, CEO and Co-founder, Census:
And you could say, well, a spreadsheet is a shared thing, but the ultimate thing that is always shared between any two humans, even if they don't share any other thing is the whiteboard/the talking. And so I think, yeah, there's more of an onus to resolve impedance mismatch on how you think and how you work, in this kind of cross discipline environment that you wouldn't have as much of in a single discipline.

David Corea, Sr. Analytics Manager, CoderPad:
I appreciate that, actually. You've got me thinking about it too. The data teams tend to operate in their own space. And I'm not going to use the term silo because that would mean that the data team's not doing its job, but it's true. There tends to be a separate repo with just SQL in it or some Python code that's around data transformations or something. Yeah. And it's not necessarily the app or on the marketing space. What's the common thing? It's the dashboard, it's the KPI and the conversations around those metrics.

Boris Jabes, CEO and Co-founder, Census:
Which is the outputs, not the inputs.

David Corea, Sr. Analytics Manager, CoderPad:
Right. Yeah, exactly. That is what I was going to get to. You've got those gears turning in my head. I feel like I'm going to go to sleep thinking about that now, which is not to be silly, but in all honesty, what are the common artifacts that the data teams have with other disciplines? And it tends to, you said it wonderfully, I'm going to walk away with that. This was a great conversation for that.

Boris Jabes, CEO and Co-founder, Census:
Flattered.

David Corea, Sr. Analytics Manager, CoderPad:
You must get that all the time.

Boris Jabes, CEO and Co-founder, Census:
No, by no means. I think it's like I have the advantage of listening to a lot of these and observing and looking at it from like I'm on my little perch over here. I can just watch, I'm not under pressure to deliver your KPI. And I mean, if you want my perspective, if your product is trying to make interviews better, my product is effectively trying to make the interface layer between these two sets of humans better. The people who run the business or run the product and the people who generate the data. And so it's like, I need to think about this a lot, because where is the limit of what my software can do versus what the people have to do?

Boris Jabes, CEO and Co-founder, Census:
And then I think there was a blog post years and years ago about I think it was something called like Slack is an Else Statement. So it's like you use this to collaborate or this to collaborate or this to collaborate. Basically if you can find a perfect functional collaboration tool, you will use it. Like CoderPad for an interview. And then in the absence of a bespoke tool to collaborate, you will just devolve into messaging, so Slack. And I think, yeah, data professionals, unfortunately their collaboration is with people who don't share the same tools. And so yeah, I think you end up at the metaphorical whiteboard and a lot of talking and a lot of hopefully writing down.

Boris Jabes, CEO and Co-founder, Census:
Do you have specs? When people request something, do you force it into a physical form? Just to put it in perspective, when I was at Microsoft years and years ago, because it's a machine, it's so many people, you create patterns and templates and processes. And so PMs would write these massive specs back in the day. This has all kind of shifted now. But back in the day, when software would ship once every three years, these specs were full-on documents that thought through all the implications before the code was written. They were like 50 pages, these massive things, and they had sections that were built in. You always had to think about, well, how's this going to affect internationalization? How's this going to affect performance? How's this going to affect security? It's like a checklist to make sure the kind of average PM made sure to think about these things. Is there something similar? Is there an intake document that data teams should provide to their stakeholders, of like, you want X, you must answer these 12 questions first?

David Corea, Sr. Analytics Manager, CoderPad:
Oh, yikes. I'm not that harsh. I say that with the fortune of the fact that we're not so large a company that level of process needs to be put in place. I spoke earlier to alignment. I think the better part of my job is not when I'm heads down trying to answer questions, it's when I'm talking to people and understanding what their concerns are. Everyone's got a metric that they're accountable for at this company. And that's not unique to us. My aim is to understand those goals and to care about that as much as the stakeholder does. So when forms come and there is one, but it's got all of like three questions, what do you need, when do you need it by, and give me some context.

Boris Jabes, CEO and Co-founder, Census:
Draw the rest of the owl here, please.

David Corea, Sr. Analytics Manager, CoderPad:
Yeah. The reason I asked for that, but the truth is when these questions come, they're not really surprises today because I have the good fortune of working on a team where I basically can say, I know everybody, and I'm talking to everybody. When we grow, and when we grow to the size of where I need that kind of process, my answer may change. Yeah. But I think right now I'm lucky enough that I don't have to pull that Microsoft card yet.

Boris Jabes, CEO and Co-founder, Census:
I mean, again, it's an issue of, it's not just size of company, it's kind of scale of impact. So it's like any code change at a large enough company. And by large, I don't mean number of employees, I mean, number of users, means that every second order effect becomes a big deal. And so editing the color of the shopping cart, eventually you're going to be like, well, let's just create the, "Have you thought about this? Have you thought about this?" It's the sad reality of companies as they grow, is you start to build rules and systems to allow for, I hate to say this, the average person to be able to perform well, rather than for the superstar to perform well. And I think the unique joy of a small organization of a startup is a high performance small team, is that you don't have to optimize for that. Just optimize for high performance and don't try to protect yourself from the kind of middle.

David Corea, Sr. Analytics Manager, CoderPad:
And just to go on that a little further, the other thing that really gets me excited about smaller companies especially is that it's okay to make a mistake. Even if changing the button color hurts something, there's always a way to reverse it. The important part is learning. And if you ask me, that's the coolest part about the job. I think that's why I tend to favor smaller companies. Just to keep on my point, I love the fact that mistakes can be okay, because it fosters a culture among folks in the data discipline to feel like mistakes are a good thing.

David Corea, Sr. Analytics Manager, CoderPad:
And not to say I promote them, but I believe that leads to great learning and can lead to a cooler experiment that happens next, or better conversation across all levels of an org. So you're right. Larger companies and I've seen it where there's meetings about what stakeholders to bring in that might be impacted by the button change. And there's value to that, especially when the org is large enough and there's enough users that would be impacted. I get that. It's just less fun, at least in my mind.

Boris Jabes, CEO and Co-founder, Census:
I could not agree with you more. Freedom to make mistakes, freedom to experiment: if you lose that, it's a little bit, in a minor way, soul crushing. And actually you've made me think of a really great, kind of meta, final question here. We talked earlier about some really grandiose KPIs that hopefully you can build at CoderPad someday, like: is an interview question good? Really dig into that. I think there's a real super KPI that all companies in the world would love, that I think is non-existent, nebulous, which is: what is our rate of learning? Because you're right, if you learn faster, that means you're going to achieve more things. It's almost like evolution 101. And so if you boil down almost all companies, who doesn't want to make sure they're moving and learning? In fact, learning is better than moving; one implies the other, theoretically. Unless you're purely research. But how would we measure this? How could you build a KPI that is the rate of learning of this organization?

David Corea, Sr. Analytics Manager, CoderPad:
I would love to have that answer. Where my mind goes is are you conducting experiments and making changes? And that would be very heavily driven by a collaborative effort with the product managers or product owners. And I think that the fact that it's necessarily a collaborative effort makes it something that could describe what's going on at a macro level, within the company. So rate of learning, gosh, I love that. Why? Because it's not exclusive to one discipline.

Boris Jabes, CEO and Co-founder, Census:
Exactly.

David Corea, Sr. Analytics Manager, CoderPad:
Yeah.

Boris Jabes, CEO and Co-founder, Census:
Yeah. I mean, as a programmer, it's almost like the ability to introspect. It's an organizational question. You say, are we running experiments? I agree with you, I think that's the slant. Number of experiments run is a great first pass; okay, that's how I would probably build this KPI. But what about performance reviews? What about org structure? How many experiments should we run on that? Is that a kind of learning? Because organizations have to learn in, like you said, it's a cross disciplinary concept. So it's hard. I don't even know if you can boil it down to one number, but we definitely seem to know, as humans, that this organization seems to learn a lot and move fast and become better, and this one does not. Which means it's somewhat quantifiable. Or is there a word for this, where humans are able to go, I know it when I see it, but I can't measure it?
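
The "number of experiments run" first pass could be counted like this; the data model, and the rule that only experiments ending in a documented decision count, are illustrative assumptions rather than anything from the episode:

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Experiment:
    quarter: str
    concluded: bool  # ended with a documented ship/revert/iterate decision?

def learning_rate(experiments):
    """First-pass 'rate of learning' KPI: concluded experiments per quarter.

    Only experiments that ended in a decision count; an abandoned test
    taught the org nothing it can reuse.
    """
    return dict(Counter(e.quarter for e in experiments if e.concluded))

history = [Experiment("2022-Q1", True), Experiment("2022-Q1", False),
           Experiment("2022-Q2", True), Experiment("2022-Q2", True)]
# learning_rate(history) == {"2022-Q1": 1, "2022-Q2": 2}
```

As the conversation goes on to note, a raw count like this is only a proxy; it says nothing about whether what was learned actually moved the business forward.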

David Corea, Sr. Analytics Manager, CoderPad:
That has plagued legal professionals for years. I know it when I see it.

Boris Jabes, CEO and Co-founder, Census:
Right, right. There's a famous ruling for this. Yeah, yeah, yeah. So I feel like a high learning environment, people kind of seem to know. You feel it individually, you feel it as an organization. It tends to be related to a high rate of change, but hopefully not change that is just perturbation. Change versus perturbation, those are two different words.

David Corea, Sr. Analytics Manager, CoderPad:
Right, right. It's got to lead to growth in some way. And I mean, investors struggle with this too. Is this moving me in a direction that will give me some kind of return? Or the folks that are steering the company, from execs to PMs and managers, they're asking the same question of their departments: are we making moves that are progressing my KPI? So I think you're right. It would be an amalgamation of multiple numbers that describe different motions within an org. But if we could boil it down to one number, rate of learning, I would probably say that correlates with, if not is the same thing as, rate of growth. And if you're moving into-

Boris Jabes, CEO and Co-founder, Census:
Sure. I think, yeah, of course, I think rate of growth is ultimately the, it's probably the closest output variable to the rate of learning as an input if you want, but the way I think about it is, let's say you make a bet, like, oh, let's try a billboard. Let's try a super bowl ad. A lot of times you'll see executives be like, let's do that. I'm willing to put the money in. I'm willing to take the bet, or use the word gamble. And what we all want on the other side is okay, a home run would be great, but if not, I would love to have learned what works and what does not so that we are now smarter. So that next time around other people are like, yeah, we'll do a super bowl ad, but we'll know what works and doesn't work. And that's valuable.

Boris Jabes, CEO and Co-founder, Census:
You're trying to capture that, and we know how to pose experiments, we kind of know how to test things, but we don't quantify the amount that we have learned. You know what people call this? It's process knowledge. It's when you try to recreate the semiconductor factories that we threw away in the US decades ago. It's not just the machines; you can go buy the machines. It's that someone who's been building semiconductors in a factory somewhere, at Intel or in China, for 10, 20 years, there's a process that's in that human's brain and nowhere else.

Boris Jabes, CEO and Co-founder, Census:
And it might be written down. This is one of those things: you can pull up the Apollo program and they wrote it all down, but you would not be able to operate the machine. It's the combo of what's written down and what's in the osmosis of people who have been doing it. And you're trying to quantify that. It's like, how much institutional knowledge are we generating in this organization that is unique and differentiated, and that helps us, obviously, towards our growth? Otherwise it's just knowledge.

David Corea, Sr. Analytics Manager, CoderPad:
Right, well, I think that's what goes through my mind. It's fine that you're learning, but is it learning that's helping? And so I can't help but also relate it to some cost, which is why my mind went towards learning that helps move a business in a direction that's fundamentally a good direction. And that doesn't have to be quantified by, say, dollars or users or whatever, but in some direction that helps move the business towards whatever its growth path is. There's got to be some cost factor that's accounted for. Is the learning, or the experiments, or whatever it is that we're trying, teaching us stuff that moves us forward? And if the answer is we're learning lots, but we're not going forward, then is that really learning?

Boris Jabes, CEO and Co-founder, Census:
No, you're right. Of course, you're completely correct. And now you are also qualified to be a venture capitalist, David. I think it's the short term/long term. It's like, you're willing to stagnate for, let's say a month or three months if it meant you learned something that would now help you grow faster eventually. But yes. If there's no growth ever coming, then you're right. Then you're just learning things and it's an academic exercise. It's not useful.

David Corea, Sr. Analytics Manager, CoderPad:
That's a great circle back to where we were prior. There's a place for rigor and there's a place for just trying stuff. And one of those gets you towards positive process knowledge.

Boris Jabes, CEO and Co-founder, Census:
Yeah. No, you're totally right. And I think some experiments will cause you to fail or die. And that's also bad. It's like, you want to be right at that edge of the boat keeps moving closer and closer to the destination and faster. And we can zigzag a little bit along the way. Well, listen, David, this went to a lot of different, interesting places. So thank you for a really edifying conversation.

David Corea, Sr. Analytics Manager, CoderPad:
That was a lot of fun. Yeah. Let's do it again.

Boris Jabes, CEO and Co-founder, Census:
Definitely. Definitely, David. Definitely. All right. Well listen, thanks for joining and yeah, we'll do this again for sure.

David Corea, Sr. Analytics Manager, CoderPad:
Fabulous.

Boris Jabes, CEO and Co-founder, Census:
Well folks, until next time, this is the Sequel Show. Special thanks to Joe Stevens for our theme song, and thanks to all of you for listening and supporting the show. If you haven't already, subscribe anywhere you listen to podcasts to get notified of future episodes.