Dollars to Donuts

Steve Portigal

  • 50 minutes 14 seconds
    43. Leanne Waldal returns

    In this episode of Dollars to Donuts I catch up with Leanne Waldal, five years after she first appeared on the podcast. She’s now a Principal in User Experience at ADP.

    A couple of years ago, I realized I know things. We all know things, but sometimes we go through life thinking there’s always something more for us to know, or we don’t know as much as others. A couple of years ago I was like, oh, I know some stuff. I could share it. If I think of myself at 23, 24 years old, I had people who were my age now who were telling me things that I listened to and got advice from. I’m that person now. I can be the person who like gives people advice or says, I don’t actually know everything, but here’s some things I learned over the years that might help you. It makes me feel good to do that. It boosts my confidence. It helps me feel like I can actually do something that’s not just my craft or not just my job for a paycheck or not just this, but I actually have something to offer. And that’s a great feeling. – Leanne Waldal

    Show Links

    Help other people find Dollars to Donuts by leaving a review on Apple Podcasts.

    Transcript

    Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization. I’m Steve Portigal. In this episode I catch up with Leanne Waldal, five years after she was first on Dollars to Donuts.

    But before we get to that, I updated my classic book, Interviewing Users, to incorporate what I’ve learned in 10 more years of being a researcher and in teaching other people. After the book came out, I spoke about user research with Karen Lynch on the Greenbook podcast. Here’s a short clip.

    Karen Lynch: I mean, there’s even some frameworks for creating a knowledge management platform for yourself, you know, how to have a database of your own research. So excellent applications for a smaller shop that might not have access to platforms and tools. But here’s a way you can kind of create your own hub, knowledge hub. You did a very—a good job, solid job, an important job of also providing different—you know, here are some forms. You gave structure to the field is what you did. You know, here’s some forms that you can look at for, you know, debriefing your interviews after you conduct them. Here are some kind of, like—I don’t want to call them templates. But here’s the framework that you can do for creating your discussion guide. Here’s some tools you can lose—use to synthesize your data. So you were—you’ve given some very tangible tools in this book for anybody who is really trying soup to nuts to go off on their own for the first time or just get to know the field that maybe they’ve been hired in, really practical, tangible things that researchers can borrow from. I mean, again, having been in the field for a long time, some of this was—you know, some of this was a part of my practice. But I’m like, “You know what? That’s a great debrief form. That one really stood out to me.” For example, Interview Debrief Form, where it’s not just take notes on what you just experienced, but it’s prompting your brain to think through what it means. So kudos to you on that. Is that just a practice of yours? Or did something kind of stimulate that thought that you might want to include those?

    Steve: Well, you know, there’s this interesting part of research where it’s collaborative and facilitative. I mean, it’s not just what, you know, I, as the researcher, or me, as part of the research team, learns. It’s, you know, the people that we’re working with. And so I have an obligation to them. Boy, that sounds like—it sounds very—more moralistic than I mean. Like, I can do a better job if I can help them learn something and take something away. But, if I also—if I hear what they’re taking away, especially—I’m, you know, I’m not the domain expert. I work as a consultant, so I come into an area that somebody else inhabits. And so they’re going to always see things in the research that I won’t see. It’s really helpful for me to understand what didn’t they hear that person say. Like, if there’s a gap in what they took away, then I now know I need to kind of emphasize that because it’s—there’s a takeaway that’s obvious to me that isn’t to them. So that needs to be surfaced as I share back. So I can get that out of a debrief. And, when I hear what they heard and what surprises them, I understand, yeah, how they’re framing the world, what’s relevant information. Like, I’m getting this indirect feedback. So, you know, I—like, it’s not my natural way of being to have a template for an activity. I’d rather just chat. And sometimes that suits me well, and sometimes I need to put a little more structure in it. So, I think, you know, writing up a debriefing guide—well, I think there’s—having something formal like a template or a tool you can use sort of reminds me that this is an important part of the process. I need to make time for it and mental space for it, and I need to tell the people I’m collaborating with that they should leave time for it. And guess what? This is a serious activity. I don’t—I’m not just trying to, like, get coffee with you and, “Hey… what did you think?” I’ve got a document here. So there’s a little bit of theater.
And I don’t mean that in an unkind way. But there’s a little bit of a, you know, a formality to it that reminds me to take it seriously and that shows my collaborators that I value what they have to say and that I, you know, I’ve got some format for that.

    That’s from the Greenbook podcast. I’ll put the whole episode in the show notes. And I hope you’ll check out this updated edition of Interviewing Users, and share it with all of the people in your network. You can help me and help the book by writing even a tiny review on Amazon.

    So, let’s get to my conversation with Leanne. She’s now a Principal in user experience at ADP.

    Well, Leanne, thank you for coming back to Dollars to Donuts after, I guess, five years

    Leanne Waldal: Good to be here.

    Steve: or so. It’s nice to talk to you again. It’s nice to talk to you again for this podcast. That introduction presumes that you and I have not spoken in the intervening five years,

    Leanne: We have.

    Steve: but we have.

    Leanne: Yes, indeed.

    Steve: We have. The secret is out. Let’s talk about some of the things that you’ve been up to professionally in the last five or so years. We can kind of start anywhere and go anywhere, but I think that’s one of the things we’d like to cover and just hear what some of your experiences have been and what you’re thinking about nowadays with user research.

    Leanne: Yeah, and the wonderful thing about the last five years is that it was about four years ago that the pandemic started. And when we spoke about five years ago, it was probably six months before the pandemic started, and it changed research. So I went and worked for a company and, using ethnographic research in Germany and France, helped them understand what was going on with the markets they were trying to sell into, to help teams really understand what they needed to do to better sell into those markets.

    And then I joined an agency. And at the beginning of 2020, I was selling a project to a company with my teammates in the San Francisco office. And we had planned out like many of us in early 2020, all the things that we were going to do in this project over the course of 2020. And then of course, the pandemic happened and we all went home. And so we had to shift the type of research we were doing from going out and talking to people in person and seeing them in person to doing everything over video and then also running some surveys.

    One interesting thing that happened in the summer of 2020 is humanity got tired of answering surveys. So I was doing mixed method research in the summer of 2020 with interviews with people about their experiences. And then I also wanted from those experiences to measure them at scale with a survey to find out, well, if I talk to 20 people and then I take these things and find out, you know, how do a thousand people respond to these experiences? It was a really easy target. It was millennials in California. It took like 35 days to get a thousand responses. And that was fascinating to me because it was really easy for us at that time to get people to talk to us on video. But I talked to the panel provider we were using at that time and they said, people are tired of answering surveys. This pandemic is wearing people out. We have all sorts of personal things we’re dealing with and we’re also terrified the world’s about to end.

    And so then in 2021, before the vaccines came out, we were doing some work with a startup at the agency I was at. And I realized that some of their early like Kickstarter supporters were in the San Francisco area. And I said, this is about motorcycles. Motorcycles are used outside. We could actually meet people in person. Like that’s the thing I miss the most. I’m an extrovert. For all the people who are introverts, the pandemic was just wonderful for them to a certain extent. Like lots of people I know who are introverts were happy to work from home and happy to be at home. Me, I miss people.

    And so I started up ethnographic research again in 2021 by meeting motorcyclists in a park. Got some great pictures of all of us in masks standing about eight feet apart from each other handing out water bottles and shouting across the grass to each other. When I left the agency, I went to work for a startup and introduced again ethnographic research to them because they had a mobile app but hadn’t really watched people use it out in the real world. And so I started just in the San Francisco Bay area to just run a pilot, like let’s go get some people using this and see how they use it. And then got the funding to go visit some customers and visit some consumers and some other people.

    So I was tromping around in the snow in Minnesota in like negative degrees, following somebody who was using our app, also tromping through the snow, which was super fun. And when that startup laid me off, I did some consulting. I did some consulting for a large company, also doing some ethnographic research, going into the offices of the people who use their products to show the pain points. And then I also started doing something I hadn’t done in a while, which was really fun: looking qualitatively at the pain points that were happening, and then going to the people who have numbers from revenue and numbers from usage analytics to figure out how much money is being lost because of the user experience, or how much money could be gained from cleaning up some of the user experience.

    And then I joined ADP in June of last year as a principal in user experience. And I’m leading all sorts of research, including recently some ethnographic research again with one of our clients, which is super fun, teaching people, teaching people and teaching people.

    Steve: I think we like to forget things that happened during the pandemic because, as you said, people were worried that everything was going to end. So that slow survey response or slow and low survey response, do you know, has that rebounded in the time since then?

    Leanne: It definitely has. So by the time I was at the startup, I think we did some ethnographic research, some interview research, and then we had, you know, six key things that we wanted to understand from a certain consumer population. I think we got all the responses we needed in like 11 days. So from what I’ve seen in that sort of consumer research, and in mapping survey research to qualitative research, we’ve gone back to getting people to respond more quickly. But if you remember the summer of 2020, we were terrified. Why would you, even if you’re offered a gift card to answer a 10-question survey, spend time on that when you’re trying to take care of your kids, dogs, and yourself?

    Steve: We were terrified and we also felt, I think, again, it’s hard to remember with any clarity, but there was this sense of trying to preserve some aspects of normalcy, like family and work and so on. But you could see why people would want to run a survey because that’s part of their job and they want to continue feeling normal, but it’s a really, it’s an interesting observation that people didn’t want to respond to surveys because of what everything that was going on.

    Leanne: Yeah. And as researchers, we lost some of our ability to understand people’s experiences because of the pandemic and being locked down. So we were limited to things like diary studies, surveys, and interviews.

    Steve: I remember having a lot of, we had a lot of complicated feelings about a lot of things, but watching that period where research was starting to pivot to remote and I don’t know, I felt like there was a certain gleefulness in being able to guide that. I don’t want to make it like a Coke versus Pepsi thing, but it seems like there is a remote versus in-person belief system or, you know, adherence and that when in-person became impossible, the folks that could give guidance and best practices on remote, it was hard to not see that guidance being offered because it was everywhere.

    Leanne: Oh, of course. It was amazing at that time for people who didn’t know how to do remote. Like, the agency I was at prided itself on doing contextual inquiry all the time, so we were teaching people how to do things more remotely. I always like to mix things up and tell people you can’t do all remote or all contextual or all quant. You need to mix it up to understand a human experience.

    Steve: But that doesn’t work with our desire to make a single declarative, this is the best way to do it, Leanne, what do you mean, what do you mean mix it up?

    Leanne: Oh, we all combine complexities in our experiences and how we do things. I find it really hard to stick just in one camp.

    Steve: And I think it’s interesting, you talk about being an extrovert kind of driving you to, if I understood, yes, extroverts suffered from lockdown, but that as a researcher who’s also extroverted, you really were, I think, creative in trying to find ways to do some portion of context and kind of get yourself out there and get to where people were.

    Leanne: Yeah, I felt really lucky to be working for this agency in San Francisco where we had an office where the windows opened. So we were coming back fully masked, with all the windows open, standing apart from each other, just to see each other. Do something on a whiteboard together, sit in a room and talk together. Because one thing about the difference between doing research remotely and seeing someone in person is you can’t see the Post-it notes around the monitor. We all got Post-it notes around our monitor. We all got things, you know, I’ve got stuff here today, to remind us of what to do.

    And if you have a workplace, you’ve also decorated your space. So, you know, we’re decorator crabs, basically. We bring in pictures of our family and someone brings us back, you know, a token from some trip and we put it on our desk. And you know, the team all goes out and does something fun together and we put a picture on our bulletin board and you can’t see that. And recently something interesting I noticed was that, which I hadn’t thought of before is that with this remote, like you and I just look at each other, you know, on video, if I shared my screen with you, you can only see one screen at a time. Well, a lot of people work with two or three screens now. So how do you in a remote world see how they’re comparing things across two screens? Like they could, I could share one screen with you and then another, but you couldn’t actually see my experience of, you know, comparing this Word doc with this Word doc. So that fascinated me. I was like, Oh, amazing. The world has also changed that monitors are cheap and we can have multiple now.

    Steve: I would have to ask whether or not you had multiple monitors and ask to see them.

    Leanne: Remotely, you might just assume your screen is your screen. Like, think about if you’re doing research with engineers. You know, you and I are both of a certain age, so you remember when we only had one monitor and it was like a big thing; there’s no way you could have two of them on a desk. Well, now if you go look at any software developer who’s working, they’ve got a couple of monitors or three monitors. They’ve got curved monitors. You know, we just sort of surround ourselves with all this stuff to look at. And you can only see that if you go see people in person. We don’t yet have a way, with Zoom or any other video technology, to be able to see across all of those monitors all at once. So I think that’s an interesting point of innovation for these video companies.

    Steve: And we’re still talking about research where the thing we would be looking at would be something that takes place on a screen.

    Leanne: So I want to be able to see what you’re doing. Also, you know, motorcyclists can’t show me their motorcycle. If I talk to you on video, maybe you can bring like your phone or something out to show me your motorcycle, but I can’t get that giddy experience you have talking about it in person.

    Steve: Sorry, did you say the giddy experience?

    Leanne: Yeah!

    Steve: Who’s giddy in that example?

    Leanne: The motorcyclists. Motorcyclists feel very, very strongly about their motorcycle. And I also noticed, because in the consulting gig I did before where I’m working now, I was doing some remote interviews. And then where I’m working now, I do remote interviews and go out and talk with people. And I’m noticing this distinct difference between how vulnerable someone will be with me and how much they’ll share with me in person versus on a screen. And that, if you think about it, just makes human sense. We can sense each other better and share things better when we’re in person than when we’re on a screen.

    Steve: When you and I did the previous episode of this podcast, I came to your office and we sat in a room holding microphones to do it.

    Leanne: Yeah, exactly. Yeah.

    Steve: Technically, I guess we could have done that again, but I think we rely on this technology. We’re using remote screen-based recording, blah, blah, blah, that has become the default, even when it’s not the only option.

    Leanne: Yeah, and I think it’s fine. This gives us opportunities to talk to people we can never talk to. But I think we have to make sure we remember the importance of doing things in person. Humans are people who sense things off of each other, and it’s important to make sure that’s there in the mix with everything we’re doing.

    Steve: And you alluded, off the top, to teaching, and you talked about going with your colleagues

    Leanne: Mm-hmm.

    Steve: and teammates and clients and so on into the field. We have a number of different parties that have a different experience.

    Leanne: Mm-hmm. Mm-hmm.

    Steve: So the motorcyclist who’s giddy, us the researcher, and I mean, I’ll just say like,

    Leanne: Mm-hmm.

    Steve: I need to be in the room with that giddy person. So I need something that I’m missing from the remote. But then there’s also the people that you’re taking out that you want to give them a sense

    Leanne: Mm-hmm.

    Steve: of their customers and their users. What’s been your experience over these few years in trying to do that for your clients and stakeholders?

    Leanne: Well, for that sort of work, bringing marketers, salespeople, engineers, product managers, designers, et cetera, out into the field for consumers or customers, B2B or B2C or B2B2C, it never fails to surprise them. I’ve been doing this for a long time, and it’s the same thing over and over. And I love that sort of teaching moment of bringing someone out to an experience and just saying, I’m going to run the camera. If you don’t know what to say, I have no problem keeping the conversation going and saying, tell me more, tell me more, show me more. What about that? You get to just observe, or you can participate. It’s all up to you. There’s no rules, really. And I’ve never had someone say that they didn’t or weren’t surprised by something that they saw or they heard or observed. And usually, the things people are most surprised by are the environments that people are in. Because if you only see me in this room, like right now, I’m in a phone booth in a WeWork. I probably look like I’m in a sauna. You have no idea what’s going on around me. And so to see the vast experience of someone’s house or of someone’s business or someone’s office, or following them along while they do something out in the world, usually people I’m with are as surprised by that as they are by the things in the product or the service or whatever it is they’re trying to understand the experience of. Well, for B2B, when I’ve worked in B2B,

    Steve: What is it that these folks are learning from these experiences?

    Leanne: I’m trying to help people understand the humanity of the people who are using the product. That it’s not just someone using a credit card to pay for a subscription, or it’s not just the buyer and their team who are at work and using your product in their work. It’s not just a job. There’s a human behind all that who’s got needs and has got friends at work or has colleagues or people that influence or don’t, or they have other products they’re using around your products that you can’t see from your usage analytics. So I’m trying to get them to see the giddiness of the motorcyclist, not just like, “I am this demographic and I’ve got this much money to buy this kind of product,” but look at how they feel about it. That’s what brand campaigns and marketing campaigns do. They get into our feelings. And so I try to get the product side of a company to understand that piece, that you can get more engagement in your product.

    You can get people to use it more if you get them to feel good about it. It’s like Kathy Sierra’s book, “Make Your Users Badass,” or something. I probably just mangled that. Something like that. You want your users and your customers to talk about you at a cocktail party after two drinks. You want them to remember you so well. And then that Maya Angelou quote about, “I don’t remember what you said or did. I remember how I felt.” That’s the human. And so in companies, marketers know that. Brand people know that. I try to help product and design people remember that because they know that too. We just don’t remember that when we’re focused so much on usage analytics. It’s like, yeah, usage analytics and revenue analytics are important, but they aren’t important in a vacuum. You have to also make people feel good and feel smart when they’re using your product. Yeah.

    Steve: Have you seen any shift in the appetite for the kind of, you know, understanding that you’re enabling people to gain?

    Leanne: Research and design appetite has gone up and down in the tech industry over the past 20, 30 years that I’ve been in it. Right now, there’s a lot of people out there saying nobody has appetite for us. I’m like, well, maybe they have an appetite for a certain type of work that we do. They don’t have an appetite for all the work that we do. And that appetite ebbs and flows, just like the appetite for anything else. It’s just humans and trends and businesses and business decisions. And my advice to everybody has always been, you can create an appetite for what you provide. It’s a sales technique. Understand what somebody needs; meet them where they are. Instead of being like, here are the five things I do, here’s the checklist, sort of like McDonald’s, like here are the things you can order from me. Oh, you don’t want any of these? Instead say, what are you looking for? What are you trying to achieve? What are you doing? How can I help?

    Steve: I want to clarify, or I’m trying to think about this as a question and not a statement, but I might just go with a statement. There’s a difference between, you know, when you say, find out what people need and then help them achieve it, that doesn’t mean if they don’t ask for contextual research, you don’t do contextual research. And maybe this is where the McDonald’s thing breaks down for me, or needs like another metaphor layer on top of the metaphor. Because if you ask what they need, no one needs research, they need the information or the decision or whatever they’re going to do about it. And so you have a lot of ways to get to that outcome. But if you say to them, hey, I do research, to your point, right, if you say, hey, I do research, do you want research? No, I don’t.

    Leanne: Yeah, exactly. Or also, like, waiting for someone to ask you for a certain kind of research. When someone who doesn’t know the breadth and the depth of what you can do with research asks you for something, they might not know what they need or want. They’re only asking based on their own knowledge. So I always find it’s better to figure out what kind of research to do, or how to prioritize it, or where to be, by hanging out with people and finding out, say, what are you working on? What’s coming up? What do we know? What do we not know? Here’s these analytics. Do we know why these analytics look this way? Well, we could find that out this way, or we could do this this way, and see which one of those things that I start proposing to them, based on what I heard them say they’re working on or what they’re trying to achieve, sort of sparks a conversation. And then research is desired and wanted and valued and invited to the table. But I see a lot of researchers, and I coach researchers on this, going in with, like, I’m going to run a survey on this, or we’re going to do some interviews, or I’m going to do this unmoderated. Nobody knows what unmoderated research is outside of the research community. So stay away from the methodology. Stay away from, like, I’m going to do this, and stick with, I’m your partner and I want to help you out.

    Steve: The labels that are always anchored in my mind are proactive and reactive. And you’re talking about being proactive. And I even get a picture as you talk that there’s a relationship and there’s something over time. And I kind of hang out with them and see what they’re thinking about and what they’re talking about. That is not a — this is sales, but it’s not a sales call, it’s a relationship-based sale.

    Leanne: Yeah. And that’s why, as you and I have talked before, I don’t want to go back to running a consulting firm. I don’t want to be a consultant unless I have to, you know, to get revenue in my life. I enjoy being part of a team to, like, nurture relationships and get collaborations going and work with people so we can do something together, because I don’t like being a lone wolf. I don’t like being a single point of failure. And, you know, I want to do things together.

    Steve: In some of the conversations we’ve had where you’ve said — made these comments before about, you know, the relationships, I get the sense, and you can correct me here, those relationships aren’t handed to you. You’re seeking them out.

    Leanne: The startup I last worked for, after I got laid off, customers of that startup were still texting me up to, like, three or four months after I left the company, because they didn’t know I’d left the company. Every time they texted me, I’d have to let them know. But I’d built such a good relationship with them that it was beyond just, like, I want to know how you use this product. It was, like, how did you start this business? And tell me about your family and this little town you live in. And where’s the best place to get lunch? And, you know, I would get invited over for dinner with their families. And you want that sort of relationship, which is very much like a sales tactic, because that’s what people who sell want. They want to get under your skin a little bit to, um, understand you better, because that helps them sell you something that you need. And that’s all about relationships and collaboration.

    Steve: What things, if any, are different in the sales relationships and collaboration that you want to nurture with users, customers, versus the colleagues, the people that you work with that you’re also wanting to do collaboration with?

    Leanne: Well, you have a different goal when you’re a salesperson. You’re trying to hit a number. You’re trying to get someone to buy something. My goal with anybody is, I want to help you out. I want to see if there’s something you need that I can help you with. I want to see if, you know, the types of skills that I have and the types of things I can do and the craft I have can help you. And if it can’t, I’m not going to, I’m not going to waste my time, you know, doing something if it’s not going to help you. I don’t know that all researchers or research teams see themselves as a service organization, but we basically are a service organization. We’re providing a service to and with people who hopefully will use, and are interested in using, the insights and the learnings they can get with us, from us, to build product, to design things, to market something, to sell something.

    Steve: Hearing you talk is really refreshing for me because I think it’s easy for us, or for me at least personally, to get sucked into all of this as adversarial. I don’t know, we hear a lot of stories from each other, you know, in this work, we tell a lot of adversarial stories, persuading, getting permission, convincing.

    Leanne: Do you pay attention to the subreddits on research? Because it’s full of that. Yeah. I just watch it and read it.

    Steve: And sales, I mean, being a consultant, you know, I have lots of peers where we talk about sales, and the more — even though I’ve been doing this for a long time, I still keep learning and relearning that sales is not a persuasion adversarial kind of work. It is a — it’s all the things that you’re talking about, it’s relationships and how can I be of value to you? And that way of being is — I think you live that way, and so you work that way.

    Leanne: Well, I learned it from running the consulting company. And then when I was at Dropbox and they were creating a sales team, I sort of like did a swap with the sales team. I was like, you teach me how to sell things because I actually want to know this more. Like I see, I see value in knowing these techniques. And, you know, you teach me that and give me access to people you’re selling to, and I’ll teach you about them. And that was a really valuable collaboration then. Because Dropbox was just getting into the enterprise. We were just starting to sell to larger customers. And I was working with all of the new enterprise account managers to basically learn what they did, but then also learn who they were selling to, and then feed that back to them to say, you know, I think you could sell this customer this, or this customer needs this.

    Steve: What kind of things did you learn for yourself about sales from that?

    Leanne: Well, that was when sort of like a chandelier went off. And I realized that like techniques we use in research to understand someone’s experience are the exact same thing sales people do. So a lot of researchers will be like, nobody does research like we do. Product managers don’t do it. They ask leading questions. Sales people are just trying to sell. And I’m like, hmm, I’m starting to see that. And this was back then. So about 10 years ago when I was like, oh, salespeople are actually doing a lot of research and product managers are actually doing a lot of research. Why does it matter how you ask the question? If what you’re trying to do is discover something or understand something, as long as you get to the end goal, the path you got there doesn’t really matter.

    Steve: And did you learn anything about — again, you’re using sales in this very elevated way to describe how you work with colleagues, for example. Is this the point at which you started to develop those skills further?

    Leanne: Yeah, that was the job where I grew up. I had run my own consulting firm for 17 years. So from when I was in my mid to late 20s until I was in my early 40s. And when I took my first job after that long consulting period, at Dropbox, I was like, oh, I’m going to learn how to be an adult inside of a corporation now. It doesn’t matter what age you are when you learn how to do that. So I went through that learning curve there.

    Steve: The idea of growing up is really — that’s a big one.

    Leanne: Well, I don’t think we all ever grow up. We just grow up in certain ways. It’s like, oh, now I know how to behave professionally. I didn’t know how to behave professionally before.

    Steve: I mean, you ran a consultancy for 17 years, so you at least were able to survive in that time.

    Leanne: But you know this. It’s a different sort of professionalism to service clients and manage clients and sell things than to learn how to behave and how to manage the politics and the relationships inside of a corporation. There’s this really great book called Orbiting the Giant Hairball, which I go back to every once in a while, because it’s just all about the humanity in these large corporations or in these mid-sized startups and how do people get along and work together and make decisions and collaborate and get things done. It’s very different from running your own thing.

    Steve: You mentioned just offhand, I think, that you talk to researchers and people are reaching out to you for advice, and I’d just love to hear kind of what that’s like, and I don’t know what’s coming up for people right now that you’re talking to them about.

    Leanne: So what’s coming up, particularly for researchers early in their career who maybe started doing research in 2020, is that they’ve never done contextual inquiry. They’ve never done ethnographic research. They’re working at a company where they don’t have a manager or a leader who’s encouraged them to do that. They started remote. A lot of these people went to college remote, graduated in 2020 or 2021, got a job as an early career researcher, and they’ve just never done this. And so I basically start with the basics, like here’s how you observe someone. The other thing I noticed, and I also talked with some people recently about what they’re seeing of people who are earlier in their career who went to school during the pandemic and now are coming out, is that it’s the same thing that those of us who are older went through in the pandemic.

    We lost a certain amount of social ability. And most of us, or a lot of us, got that back in the last year or two. We started meeting people in person again. We started going to dinner parties. We started going out to bars, went to concerts. Some of us went back to offices and figured out how to be in an office again. But there’s a lot of fear of, “I don’t know how to do this, and I’m scared to do this, and I’ve never done this before,” a lot of which I attribute to the pandemic. And I think those of us who do mentoring and coaching need to be aware of that and teach people how to do that. I know someone who is teaching people how to dress to go to an office, how to wash your hair, how to have conversations with people. And that’s something that’s really specific to this point in time that didn’t exist before. Say, back in 2016, what was I mentoring and coaching people about? It was more relationship coaching. Like, how do you get along with a product manager who disagrees with your research results? Or how do you have influence? Or how do you learn a new skill? Or you’re a junior researcher, you want to become a senior researcher, what do you need to do to show that so that you can get that promotion? I’m not hearing that so much anymore. I’m hearing more around, well, one, how do I get a job? But then also, how do I do these things that nobody ever taught me to do and I never had a chance to do before, and I want to try to do them now? Some of these people I’m coaching are the only researcher in a company. So they don’t even have someone to manage them or say, this is how we did it in the past, 2019 and earlier. So yeah, I just like to help people out that way. I sort of feel like the things I can do right now are improve relationships among different teams. And I can also help out people who are trying to grow up in their career.

    Steve: What else is coming up in these mentorship conversations you’re having?

    Leanne: Presentation skills. So a lot of people come to me and say, I’ve been giving presentations, you know, over Zoom for years, and now I’m being asked to give a presentation in person. And I thought, how do you not know how to give a presentation in person? I’m like, oh, you’ve never worked in an office where you had to stand up in a room in front of a table of eight people with a pointer. You know, remember how we used to plug like a USB thing into our laptops, and then we’d have a clicker and the presentation would show up on screen? Well, people are being asked to do that again. But that’s another soft skill that nobody’s taught them how to do. And one thing that is an advantage of presenting over video is you can have notes. You know, nobody can see that you’ve got your PowerPoint in presenter view with your notes, or nobody can see that you’ve got a notebook with, oh, this is what I say on this slide. When you’re in a room, you’re on stage. And I think a lot, a lot, a lot of us forgot how to be on stage, who used to be on stage. But some people have never been on stage. And now they’re being asked to be on stage. And they didn’t get that practice at college, because you always presented your papers and everything over video or in a small classroom. I’m really glad that my daughter started college in person and is in college in person. So she’s getting that social growth that you need before you start to turn into a young adult trying to get yourself into the workforce.

    Steve: It’s a fascinating observation that maybe is obvious to everyone, but I had never really thought of this. There’s a significant cohort of people in the workforce who don’t have a pre-pandemic norm to return to around research, presenting, business travel, any of those kinds of things.

    Leanne: Yeah. Just like professional skills, how to get along with people. Yeah.

    Steve: It’s interesting that at least the people that you’re in contact with have some awareness that they have a gap. Seems like that would be the first step to addressing it is knowing that it’s missing, that there’s a thing that is expected of you.

    Leanne: Yeah. Yeah, there are a couple people I mentor who come to me in panics, like, I have to do this, and I’ve never done it before. And I’m like, oh, it’ll be okay.

    Steve: Yeah, what do you tell someone who hasn’t done a business trip before, who hasn’t worked in an office? What’s the granularity of advice here?

    Leanne: I tell them that even in my early 50s, I still have new things I have to do every once in a while. And we can all do new things and hard things, and you will actually be okay. And then we can get down to brass tacks and go through the tactics. How do you pack? How do you memorize things or practice? A lot of people who give presentations over video don’t practice first, because you’ve got all these supports around you to do it. You’re wearing your sweatpants and you put on a button down shirt, but you’ve got sweatpants on. You’ve got all your Post-it notes nearby of what to say. And so without all of those supports, what are you going to do? Well, you have to practice more. You have to think ahead. You have to plan, make lists if that’s your thing. And I think it’s really surprised people that they need to do that.

    Steve: What’s the guidance, this is decontextualized, I guess, but what’s the guidance for a person who’s new to in-person research?

    Leanne: Well, I say, you know, go find someone to bring with you who’s done it before, so you and everybody else aren’t brand new to it all. You don’t all want to be new to doing in-person research on the group that’s going out to do it. See if you can find someone who’s done it before. Or find someone from marketing or sales who’s done something similar, who’s at least gone to visit customers or has gone to conferences and met with customers. Your first pancake, the first time you do anything, will always be a little rough. So it’s always good to have someone there who, you know, can advise you a little, or steer you in a slightly different direction when they see you going astray. And if you don’t have that, then, you know, just be gentle with yourself. Whatever you’re doing is the best you could possibly do.

    Steve: That’s the lesson for everything, right? Just be gentle with yourself. What do you get out of mentorship?

    Leanne: Well, a couple of years ago, I realized I know things. And, you know, we all know things, but sometimes we go through life thinking there’s always something more for us to know, or we don’t know as much as others. And it was a couple of years ago when I was like, oh, I know some stuff. I could share it. You know, maybe it would provide some value to people who, you know, like if I think of myself at 23, 24 years old, I had people who were my age now who were telling me things that I listened to and got advice from. And it just sort of popped into my head. I was like, oh, I’m that person now. I can be the person who like gives people advice or says like, you know, I don’t actually know everything, but here’s some things I learned over the years that might help you. So, and it, it makes me feel good to do that. It boosts my confidence. It helps me feel like, oh, I can actually do something that’s not just my craft or not just my job for a paycheck or not just this, but like I actually have something to offer. And that’s a great feeling. Yeah. And then you’re sort of surprised, like, oh, I actually know how to do this.

    Steve: I don’t know if this happens for you this way, but sometimes I don’t know what I know until I’m in a situation where I’m asked to help somebody out.

    Leanne: Yeah. And other people will say to me, like, well, Leanne, we see you as an expert and everything. I’m like, yeah, but I don’t see myself that way. Yeah. Yeah. And for me, it meant I was taking it on in a way that felt comfortable for me. Like I don’t need to be or want to be the expert who, you know, gets on stage everywhere or has the big title or anything like that.

    Steve: But to choose to be a mentor is to partially take on the role of the expert. The theme of some of your earlier points about creating good collaborations around research is understanding what somebody needs and how you can be helpful to them, and that you like to be helpful. And that seems to manifest in mentorship as well.

    Leanne: But I do like to help people. Yeah. And that, being a helpful person, is something that’s often attributed more to women than to men. When I was younger, I used to sort of push away things that were like, oh, that’s stereotypically female. And now I’m like, OK, so what if I’m doing something that’s stereotypically female? I like it. It makes me feel good. Makes me feel strong. People appreciate it. And so I’m going to own it. But there are those of us who are Gen X feminists who grew up in a time when you had to sort of reject things that were stereotypically female. So I’m starting to embrace more of it now.

    Steve: Is that Gen X at 25 and Gen X at 50 are approaching life differently?

    Leanne: Yes, I was bald at 25 with an attitude.

    Steve: There’s so many good follow-up questions to the statement, I was bald at 25.

    Leanne: I’d come out of the closet probably like, I don’t know, three years before, two years before, something like that, and declared myself to the world as a dyke. So I shaved my head and I wore everything rainbows. And yeah, it was fun. I look at that part of myself and I’m like, oh, that was awesome. I don’t need to do that now.

    Steve: The part of you that we’re talking about that likes to help, how did that part of you manifest when you shaved your head and wore rainbows?

    Leanne: Oh, it didn’t manifest itself at all then. I was angry at the world. Yeah. Yeah. It was not an easy world to come out in 25, 30 years ago. I was, let’s see, 25.

    Steve: When did you start your consultancy?

    Leanne: I was working for AT&T at that time. AT&T Wireless, because I’d worked for a startup cell phone company. I was in Seattle. I moved to San Francisco in 1996, and that improved my life immensely. So I worked for a little startup called Organic Online in 1996. It’s now a huge company, but at that time it was like 30 people. Left there and worked for a startup that was started by Howard Rheingold, if anybody remembers him. And they ran out of money, so I got laid off. And then people were asking me to do contract work for them. And at this time I wasn’t doing research at all. I was doing QA and server performance load testing. And so I was doing contract work and someone said, oh, you’re getting so much contract work, you could start a company. So I started a consulting company in the fall of 1997 at the age of 26, where I was very brazen and thought I could do anything. And yeah, just sold projects to all sorts of startups and tech companies. 1997, ’98, ’99, the money was falling off trees. And I started selling research when a company had asked if we could do it and I didn’t know what it was. And so I went and asked some friends and someone told me and I was like, oh, well, let’s figure this out. In the late ’90s in the web and tech industry, if you just basically said you knew how to do something, someone would pay you to do it and then you just found other people and figured out how to do it. Sort of fake it till you make it.

    Steve: Is there a point professionally where the seeds of what you’re talking about now, because I think you’re describing a way of being, that’s about finding out how to help people and doing that. Can you identify some of the seeds of that? When you were angry and wearing rainbows, that was not present there, but where did it start to emerge and how you worked?

    Leanne: Oh, yeah. When the NASDAQ crashed. So do you remember the NASDAQ? NASDAQ crashed in the spring of 2000, which was actually a bigger deal for those of us in the San Francisco Bay Area than the terrorist attack in 2001, because it had a bigger effect on our life here on the West Coast. I had to lay off a considerable number of people at my consulting company. And I had never done that before. I didn’t know how to do that. I was a terrible manager. People who worked for me then thought I was great, but I look back at that and I’m like, oh, compared to me now, having gone through management training and everything, I didn’t know what I was doing. And I wished at that time that I could do more for people who had relied on me for a source of income to pay their rent and pay their bills, to support them better when we just lost tons of clients and revenue streams all at once after the NASDAQ crashed. Yeah, it probably started then, because then the company turned mostly into a research company. We still did QA probably until about 2010 or so, but we were primarily doing research projects and I was primarily hiring people who were in research.

    And I had also gotten married and had a kid, and that changes your life. 2004 was when Gavin Newsom made marriage legal for 45 days in San Francisco, and my wife was pregnant. And so we got married when she was pregnant, told her dad he should bring a shotgun to city hall because it was a shotgun wedding, bought a house, got a letter from the California Supreme Court saying our marriage wasn’t valid so we had to run down and get domestically partnered before the kid was born. And for all of us, it’s usually things like that that happen during our life that help us gain a little more empathy for the experiences of other people. It’s just the act of being human and growing older and having experiences that makes you understand other people’s experiences. And oh, everybody holds pain and everybody holds things that they won’t tell you about, and we’re all here to help each other.

    Steve: And just hearing you talk about it, even thinking like the layoffs, how we handle endings, I’m reminded of something that you and I talked about, not on this podcast, but just years ago. And I was catching up with you during a period of time where, I guess you were leaving a job and you were thinking about how you wanted to leave everything. Would you mind kind of describing some of what you did and how you thought about it?

    Leanne: Sure. I had been laid off because a company was reorganizing and the organization that I was overseeing no longer existed. So there was actually no place for my role anymore. And I was given time. So it wasn’t like the startup that laid me off a little over a year ago where it was just like, you’re done today, turn your laptop in. It was like, here’s the package and you’ve got this many days and you can still come into the office but you don’t have to do work anymore. Being told that you don’t have to do anything anymore, I was like, lots of people depend on me here. I have lots of relationships I need to pass on to someone else. So the initial sort of angry or hurt response to being laid off, I was like, okay, I can have this angry or hurt response because I love this job. But I also understand the business decision behind it. And I also understand that there are people who will need things from me before I leave. When I met with one of my colleagues on the last day, I said, here’s the budget for the team, here are the things, I documented all the things that we were doing, here are all the people that we had relationships with, I already sent out email intros and made sure everyone had them. And what they said to me was that they’d never seen someone leave that way before. And I said, well, why would you want to burn bridges?

    The tech community is really small. And just like when I’ve been harassed or mistreated at a company, I haven’t been the person to want to sue the company because this tech community is so small and I don’t want to be that person. So I want to take other ways to file a complaint or make something known or make sure that things are documented. And so it’s the same thing with leaving, whether it’s my choice to leave, when I’ve made a choice to leave a company, I’ve done the same thing. I make sure everything’s documented. Everybody has a relationship pass off and everyone’s going to be taken care of. So like when I’ve left companies, you know, and given like 60 days notice, I’ve made sure that whoever is reporting to me, they knew who they’re going to report to next. We had like closed off our relationship. You know, I told them how to get in touch with me outside of the company. You know, if they want any coaching or mentoring in the future, because as long as I’m working, I want to make sure I maintain good relationships with everybody. I think that’s the most important part of work, is relationships and colleagues.

    Steve: And I think when we talked about it, you said, these people, you may manage them in another job, or they may manage you in another job, or they may interview you for another job, or you may interview them for another job.

    Leanne: Yeah, exactly. Yeah. I mean, I have had people who worked for me in the past who, let’s just say, misbehaved, and then applied for a job on my team, you know, at another company I was at. But it’s just like that saying: I don’t remember what you said or what you did, I remember how you made me feel. I look at their resume in a stack and I’m like, no, you did something that really harmed my agency, or you did something that was really professionally inappropriate. You know, I can’t have you working for me anymore. That company that laid me off, the next company I was at, I was working with the sales team and they wanted to sell to that company that had laid me off. And I said, sure, I know people there. I can help set up something. So we set up this whole lunch. We were going to lunch at a restaurant in San Francisco, and waiting in line to be seated was the person who had, like, laid me off. And, you know, we gave each other a hug and I was like, you know, we’re okay. There doesn’t need to be an adversarial relationship when there are business decisions made and when we all live in the same town and we’re all in the same tech community.

    But people do set things up that way sometimes. And I think the harm it does is to yourself. It’s sort of like, I tell people, you have a choice of what you’re going to do with the feelings you have right now. You’re feeling frustrated. You’re feeling overwhelmed. You’re feeling disrespected, undervalued, whatever. Okay. Those are feelings. What are you going to do about it? Like you have control over your actions and your next steps and what you say and what you do. This is no longer about research. This is about being a human in a workplace.

    Steve: Yeah, which is the core of research.

    Leanne: Yeah. Well, and it’s also how to show value in research is to be that human who’s professional and can manage situations and keep a certain sort of like emotional regularity.

    Steve: You know, the successful researcher or leader or manager who does a great transition when they’re laid off is also the person that is good at understanding what people’s needs are and proposing ways to help them accomplish it so that they collaborate and research together. It is the same set of values and life skills that you’re talking about. I mean, I think we went someplace really interesting and I like what you said, you know, these are human skills and I do think this is about research even though it’s about being a human.

    Leanne: Well, what I coach people who are looking for a job to do is use your research skills. What is the hiring manager looking for? Does that fit you? Because there are all sorts of researchers out there right now who are hurting because they can’t find work. I think the way that we find value again and find ourselves jobs again is to use those skills we have as researchers to understand what people want.

    Steve: Is there anything in today’s conversation that we didn’t get to you bring up?

    Leanne: No, that’s been great, Steve. I like the way it just sort of wandered. This has been fun.

    Steve: Thanks Leanne for being just so wide reaching in what you have to share and kind of digging into a lot of related aspects. It’s really very interesting and inspirational for me personally to have this conversation with you. Thank you.

    Leanne: Oh, you’re welcome. And thank you, Steve. Always a joy to talk with you.

    Steve: Okay donut friends, thanks a whole heap for listening to this episode! Don’t forget, you can always find Dollars to Donuts where all the podcasts are, or visit Portigal dot com slash podcast for all of the episodes with show notes and transcripts. Our theme music is by Bruce Todd.

    The post 43. Leanne Waldal returns first appeared on Portigal Consulting.
    19 April 2024, 4:15 pm
  • 1 hour 2 minutes
    42. Celeste Ridlen of Robinhood

    For this episode of Dollars to Donuts I had a wonderful conversation with Celeste Ridlen, the Head of Research at Robinhood.

    This is a fundamental leadership-y thing where no two people are going to do that same leadership role the same way. You’re never going to do them the same way as somebody else. And that’s actually a really good thing because the situation may call for exactly what you can offer. But because of that, if you’re looking to other people to decide whether or not you’re going to be suited to doing that role, it’s kind of like thinking about whether or not you should be a writer based on whether or not you can write exactly like Mary Shelley. – Celeste Ridlen

    Show Links

    Help other people find Dollars to Donuts by leaving a review on Apple Podcasts.

    Transcript

    Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization. I’m Steve Portigal.

    Did you know there’s a new, significantly updated second edition of Interviewing Users? Of course you did. I hope you’ve checked it out and are recommending it to everyone you know. Shortly after the book came out, I had a conversation with Adrian Brady-Cesana for his CX Chronicles podcast. The link to the whole episode is in the show notes, but here’s a quick excerpt where we talk about domain expertise in user research.

    Adrian Brady-Cesana: I’d love for you to share a couple examples or a couple stories of some of the things that you’ve seen working with your clients and working in your business around sort of how you’ve seen some of the companies that had really incredible teams or some of the commonalities or some of the things that you saw again and again and again with your clients that really had a solid handle on how they sort of built up their team, built out the different roles and really kind of stratified how their team was going to be taking care of their customers.

    Steve: I think there’s a lot of pressure on people doing research right now to carry it yourself all the way through. And I think this is such collaborative work. And I just, I think I’ve seen more success when there is some collaboration and that’s a big, collaboration is a big, big term.

    But one thing that, you know, your question makes me think of is complexity. Like I think as user research as a practice has grown, it’s finding its way into many more complex domains like installing and maintaining and configuring servers and network devices, not even just servers, but the whole infrastructure. I worked years ago on credit default swap trading, and you might or might not have heard that phrase, but boy, just dig and dig and dig. And it’s like, it doesn’t make any sense until you’ve really been involved.

    And, you know, so for me as a consultant, but even my clients who are on teams, they’re not necessarily domain experts. And so this really interesting challenge comes up, whether you’re a researcher or someone else in the organization who’s out talking to customers: trying to navigate that balance of, how much do I need to understand about this?

    And so for me, one thing I’ve seen be really successful, and it goes back to the collaboration thing, is when you pair up someone who’s great at research, which is, okay, I don’t know about this, I want you to explain it to me, and someone who is great at the domain, whose job isn’t to ask questions. Their job is to hear what doesn’t make sense about the technology or about the deployment or about the process. And that collaboration is really, really sharp, I think. When you’re talking to customers and users, I think sometimes we’re nervous because we want to be seen as credible, especially if it’s an actual customer, right? We ask for their time, we want to go talk to them, and we worry they’ll think, you’re going to send some idiot that doesn’t know what they’re talking about? That isn’t necessarily the reaction you’ll get, but I think it’s sometimes the reaction that we fear.

    And so it can be a really great triangle between a user or a customer who’s a practitioner of something very complex, a person from the producer or, you know, maker side of it, the company side, who knows the domain, and someone who knows how to listen and ask questions and follow up and sort of facilitate this.

    When I see researchers kind of getting immersed into a domain, they do build up some competency, but some of these things are decades of specificity and really kind of elusive stuff. So to go back to your question, I think it’s teams where there’s bandwidth for collaboration, where you can bring in people with different perspectives and different domain and process expertise, to create a great interview for the customer that you’re talking to.

    Like it’s a good experience to talk to a researcher and a domain expert, because you can watch who they make eye contact with. I’ve had people even tell me, oh, okay, you’re the question asker and you’re the person that knows the domain, you’re the engineer. People can figure that out. And nobody’s pretending to be anything that they aren’t.

    And it really, I think, can be very harmonious, but you have to create the bandwidth to kind of support that collaboration on the team, so everybody can work together to get the insights that we want to get from the people we’re building for.

    Again, that was from the CX Chronicles podcast. Now let’s get to my conversation with Celeste Ridlen. She’s the head of research at Robinhood.

    Celeste, thank you so much for being on Dollars to Donuts. It’s really great to have you here.

    Celeste Ridlen: It’s awesome to be here. Thanks for inviting me.

    Steve: Can we start with an introduction from you, build the rest of the conversation off from that?

    Celeste: Yeah. What would you like to know in my introduction? I exist.

    Steve: You exist.

    Celeste: My name is Celeste. I’m the head of research at Robinhood. I’ve been doing research for 15 years now. My background is in human factors and ergonomics. I live in San Francisco, long walks on the beach, that kind of stuff.

    Steve: How did you discover the field of human factors?

    Celeste: I was in a cognitive psych lab, like working in a cognitive psych lab at Florida State University, and I was trying to think about my next step. I liked to joke that with a psychology degree and an English degree, you basically are qualified to be a mall security guard. I was looking at grad school and I decided that I was going to talk to Dr. Kashak, who was running the lab at the time, about my interests, because I had tried on neuroscience and I didn’t want to hurt animals and place electrodes on rats’ brains, so I cast aside neuroscience, and then social psychology. I worked with Baumeister and Tice and that was interesting, but also so vague and not applicable, and much love to social psychology, but just so vague. I was in Dr. Kashak’s lab and I was asking him about what his advice would be for what I should study if I was most interested in cognitive psych. He asked me if I liked technology, if I liked industrial engineering, like making things. We had this great conversation about human factors, which is essentially cognitive ergonomics, a field that started blooming right around the time we started industrializing weapons, which is a weird historical fact that I got excited about. That’s how I got into that.

    Steve: How did you find a program for yourself? This was grad school. You did choose grad school.

    Celeste: I did. Well, no, there wasn’t any, I mean, now there’s HCI and stuff like that, but at the time there weren’t programs dedicated to that yet. I looked around, there were a few programs. One of them was at Georgia Tech. One of them, I don’t even remember where they all were, but they were approximations of what I was looking for. Some of them were called human factors, some of them weren’t. The one I ended up going with was at San Jose State, and yes, right up there in terms of name and stature with Georgia Tech. But I chose San Jose State specifically because they had an applied, terminal master’s degree program, there was an applied emphasis. Where they had relationships with tech companies in the Bay Area and NASA, and they were working directly with these companies to get students into exactly the jobs that I was excited about. That’s why I chose it. I knew I didn’t want to be an academic. I had an end goal. I didn’t want things to be vague. I wanted to see the fruits of my labor immediately. Then I, sight unseen, came out to California, have never looked back.

    Steve: Was there a first applied job or project or something that came to you through that program?

    Celeste: Yeah, a lot of people had stuff come directly from school. I happened to sit next to somebody in one of my classes. I mean, I don’t know if it was like a stats class or I don’t remember which one it was. But there was somebody who was already working at Oracle, and she let me know about a job opening. And I wasn’t even done with grad school, and they took a chance on me. It was a contract position. In retrospect, it was probably like a low lift, but it was my first sort of foray into things. And so, no, it wasn’t like a direct, it wasn’t directly because of like this, the program itself, but it was the people inside of it. Like you literally never know who you’re sitting next to. So it was lucky for me.

    Steve: What kind of things were you doing in that first Oracle job?

    Celeste: Oh, God. One time I transcribed an offsite, like a field visit that I did not attend. So I had headphones on, and I had to hand transcribe 20 plus hours of people spewing acronyms that I didn’t know or understand. So that was fun. I did a lot of like participant recruiting. This was a joy, but definitely a labor of love there. I did a lot of like synthesis and conducting studies, but I did a lot of the stuff that people either have automated now or would not consider in like an entry level research job now, which is character building, let’s call it.

    Steve: Without sort of going through your resume step by step, what’s, I don’t know, you can assess what the next marker is, like, what’s another role that came after that, that was significant for you?

    Celeste: Yeah. So I hopped from that to a full time position at Symantec about a year later. I don’t even think Symantec is a company anymore, but at the time it was like 25, 30 years old. And so there’s sort of like two camps in tech, or at least there were at the time. These old, like slow moving companies with long, long product cycles, like Oracle, Symantec, where there’s a lot to an implementation. And so you’ve got like a five year release cycle, like it’s really, really long. And then you had new companies that were very young, right, like on the edge of being what I would call startupy, like where everybody’s doing everything. And so I jumped from Symantec, which was a great job. They also took a chance on me. And I shifted into working in sort of like the newer building phase of tech, where there were almost no researchers. And we had to sort of build a process and perspective and relationships as the company’s trying to grow like crazy. So Twitter was a really, really pivotal point in my career where I got really, really excited. It was like a dream job. I always wanted to work on Twitter. Like it was, you know, this 2013 Twitter, that means something different now, for sure. But I was so excited. And it was amazing to be surrounded by people that had that much passion and energy. And I, you know, I was part of that bricklaying process, right? Like I had a boss. And I loved it so much that I got invited to do another bricklaying at Airbnb a few years later. I was at Airbnb for a very long time. We went from 20 researchers to like, I don’t know, over 100 at our peak. So that was a lot of bricks.

    Steve: Can you explain the bricklaying metaphor?

    Celeste: Yeah, what? That doesn’t make sense to you?

    Okay, so what I mean by that is you have this sort of like nascent or non-existent research team, like research function, let’s call it like a discipline. Maybe there’s one person, maybe there’s five people, but everyone’s just kind of like still being very reactive because the company doesn’t know how to work with research yet and doesn’t really know what part of the culture it fits in, where it fits, so on and so forth, like how to engage, that kind of stuff. So what I mean by bricklaying is like there’s process, right? So how do you recruit participants? What are sort of like the safety issues with your research participant agreement or your NDA? What are the safety issues with the way that you reach out to your participants? So I was like building a lot of programmatic structure on top of then hiring people, trying to identify and prioritize research questions, all the things, like all the how-to’s, all the — so I was interviewing like a million people every week. I was participating in like a crap ton of interviews for the company and then also of the users. So it was just nonstop assessing things, basically, but figuring out like do we have a crit, as an example, as a research team? Do we do like a weekly critique? What does that culture look like? Is it required? Are we forcing you to do that? Those kinds of things are the bricklaying. Like they’re all part of the bricklaying. It’s not just like hiring and setting up process, but it’s also like what kind of culture do we want to have and what’s mandatory, what’s optional, and like what needs to be grassroots versus top-down, that sort of thing. It’s fun. It’s really, really fun.

    Steve: I don’t know if this is part of the bricklaying metaphor, but what about things like how much we know or don’t know about people that we’re building for?

    Celeste: Yeah, that’s a big one. I mean, when you first start — let’s keep going with the bricklaying metaphor — when you first start, like you could put a brick anywhere, so to speak, and you would have impact, right? Because you didn’t know anything and now you know something. But that gets like more complicated, and more nuance is necessary, when you’re sort of saturating on a particular topic. There is a moment to either move on or rethink it. And so, yeah, that’s definitely a part of it.

    Steve: Right. Now you got me thinking of the Tetris metaphor, where the more bricks you put down, the more difficulty you have placing those future bricks. There’s more scrutiny and impact on that choice.

    Celeste: Yeah. Yeah. I’m very good at Tetris. I don’t know if I’m good at bricklaying, but this is the third company I’ve moved to where I felt like that was a part of what my role entailed. Because it’s not just in research, too. It’s the company at large. You’re defining process and practice as a group of people who’s maybe under a thousand to then like three thousand, five thousand people in a very fast — like the only way to address any of that is to just figure out how to scale yourself and figure out what’s important and what isn’t, because you have to make some decisions very quickly there.

    Steve: I do want to move on to Robinhood in a moment, but I have a question that I want to go back to. So what’s your title at Robinhood?

    Celeste: I am the head of research at Robinhood.

    Steve: And hearing you talk, I’m inferring that bricklaying is not just the purview of people who have leadership titles. You’re describing Twitter and Airbnb as roles where you didn’t have a leadership title, I think, but bricklaying was part of what you were doing. It was the context for your work.

    Celeste: Yeah, you and I could go on forever on a much belabored subject about the difference between management and leadership. I was a manager at Airbnb, but you’re right that the bricklaying metaphor is not specific to somebody with a leadership title or somebody who is a people manager. It’s something everybody has to build together. And it depends a lot on the chemistry of both the research team and the broader company you’re at. So it never looks the same way twice.

    Steve: Well, I’d love to hear, if it’s possible, you know, different companies, different roles, but being an individual contributor, being a people manager, being a leader, again, the companies are different. So maybe the question, the comparison, isn’t right. But would you be able to characterize either ways that one could or ways that you have been involved in bricklaying? I love the metaphor. Just these three different contexts for bricklaying, you’re coming into it with what seem like different responsibilities or titles.

    Celeste: Totally.

    Steve: What is it like? How do you compare and contrast across the three?

    Celeste: I think in all three cases, the similarities were around, no one’s going to tell you that you absolutely need to be doing that unless you’re, like, you know, the head of something, where it’s kind of objectively, everything is your job, right, to some degree. So I think in that role, it’s a little bit easier to believe that everything is within your purview. But for other roles, I came in and there are opportunities, there are problems everywhere, and you can either decide that they’re not your problem, which is an approach, or you decide that you’d like to solve it. Nobody’s going to probably yell at you for solving it. So you take it on.

    So things like rewriting interview questions, like that’s something small that I did. Creating processes and how-to docs was something I did like every five minutes at Twitter. And there was like a running joke that no one was going to know how to do anything after I quit, because there was no one to write how-to docs anymore. I’m sure they wrote them plenty.

    But I think it’s just about deciding if you have a perspective or a skill that can be lent to that particular problem or like opportunity, and then just doing it. I mean, there’s nuance there, right? You have to ask questions and make sure you’re not hurting anybody’s feelings, someone else isn’t working on it, that kind of thing.

    But a lot of times in those environments, the best thing you can do is just say, “Hey, no one has a problem. I’m going to do this. Good? Good? Yes? Anyone want to weigh in? No? Awesome. I’m going to take this on.” And then everyone is pretty thankful if they remember that you did it at all, which is, you shouldn’t be doing it for credit anyway.

    Steve: Over the course of your history, have you worked in environments that weren’t in the bricklaying mode? It seems like the first two you described were that.

    Celeste: Yes. Yeah. So both Symantec and Oracle had very well-funded and staffed research disciplines. They weren’t massive. I think at Symantec there were still only like six of us, which is not enormous for a 20,000-person company at the time. But there was already a way of doing things within the broader UX team, within the company. We had a process, we had practices, we had tooling. No one was starving for those sorts of things. And so everything was kind of plug and play. They were great jobs, but the environments themselves are different. You learn differently. You get reps for doing the work there a lot more than doing the work and the stuff around the work, if that makes sense.

    Steve: If I’m applying for a job in user research, let’s just imagine that’s the thing I’m doing. It’s like, ask your doctor if Flonase is right for you. Ask what kind of environment is right for whom? Having seen both and participated in both these, if there are only two, and I’m sure there’s nuances here.

    Celeste: There’s a lot.

    Steve: Yeah. If places where it’s more plug and play versus places where there’s a lot of problems to be solved and opportunity to solve them, what’s your advice for people into, I guess to me, I hear two parts. How do I assess what the situation is? And then how do I self-assess about what works for me?

    Celeste: I think it seems to me like there are a lot of ways to do all of those things, but I’m going to tell you a couple of things. One is how to assess or how to self-assess. But then the second or third, depending on how you just numbered those in your head, would be people that I’ve seen succeed in either of those situations, of which there are probably shades, it’s a spectrum or something like that, because they might be different too. So when I think about assessing, when you’re interviewing for a job, I think the thing that I would look for is who’s funding the research team and why do they think that a research team should exist? Those are things that you would think you get the same answer every time, but you extremely do not. Sometimes people want a research team or have a research team because it’s a box to check. This is what we do to make products. This seems like a thing past a certain point of company maturity that we would definitely want to have. So I’m not entirely sure, but I’m going to, here you go. I’m going to just check that box and it’s going to be great. And you’re going to be my researcher. So there’s stuff like that that can kind of give you a glimpse into the research maturity of the organization or whatever you want to call it.

    Another question I like to ask is around funding: do you have the tooling you feel like you need to do your job? If you needed a new tool, what would it take to get it? Is it a procurement question? Do you have a budget? Do you have to go talk to somebody else about budget? Those are other things.

    And then asking questions even about headcount. So like, how did this role open? Why does someone think it’s important? Why did someone decide to do this over hiring another engineer? Sometimes those yield interesting answers too. And you can kind of tell where in research maturity a company is based on that. What I see in terms of success for one sort of a person over another, I hate putting it like that because I think you can be a different sort of person at different places in your career too. But if you’re interested in sort of the bricklaying, let’s call it, I’ve seen people say they want to do that. And then they jump in, they realize like, oh shit, nobody writes anything down here. There is leadership lacking in cross-functional ways. And I can either take the reins or just get really upset. And you never know how you’re going to react in that situation until you find yourself in it.

    So I’ve seen people who were like, yes, I want that. I want that. I’m excited about that. And then they get there and they’re just very uncomfortable, very frustrated. And what they actually wanted was the excitement of moving quickly, but they didn’t understand all of the things that surround that, that you need to take along with the moving quickly and the excitement.

    Steve: What led you to Robinhood and the role that you have?

    Celeste: Yeah. So this is a complicated question. So Twitter, Airbnb, and Robinhood, while they are pretty disparate in terms of topic, like your customers, things like that, they’re not in the same vertical. The similarities they have besides the company phase and stage is that they’re all very, very mission driven, like very strongly mission driven. And once you’ve worked at a company like that, I should just speak for myself here because maybe this is not exciting to everyone else, but once I worked at Twitter, I was like, I can never have anything but this ever again. Because when people really believe in it, when people really believe in the work you’re doing and believe it’s for a purpose that isn’t just like contributing to the capitalist abyss, it’s more motivating and more exciting to me to get up every day and like, and focus on a mission. It’s what I return to when I’m trying to prioritize things, when I’m feeling like I don’t know what to do. It’s a nice way to hold the center.

    So on top of that, when I was leaving my last job, I interviewed, you know, there are many mission driven companies. And so like I interviewed for some heads of research roles actually, and I was even surprised to be offered some. I was leading the largest team at Airbnb when I left. It was called Hosting, which included Homes, the biggest part of the business, and Community. We had an Olympics team, and then Experiences was also a part of it. I just felt like I was collecting Pokemon at that point. Like they just kept giving me stuff.

    But so I basically decided that I didn’t want to be a head of research. This is why this is a funny story. I opted for Robinhood for the mission, and also because I was excited to lead research in like a lateral move across a few teams, reporting to the head of research at the time. It was okay with me because I had all these like ideas about what being a head of research would mean, which is why I didn’t want to be one. So like, I won’t be as close to the work anymore, or I’m going to spend all my days like writing and rewriting career frameworks, or I have to be super front and center. That’s another one that like comes up for me all the time. I’m chatty, but I’m like fairly introverted. And I don’t personally love being like the star of the show.

    So I was also kind of looking at people that I had reported to over the years and or known that like then went on to be heads of research. And I just couldn’t see myself doing what they were doing, or at least what I thought they were doing. And anyway, this is all about me joining Robinhood. But basically, through a series of twists and turns in my first few months, I found myself in the role I was avoiding, which is fun.

    But looking back, I think my biggest gap in thinking at the time was that I forgot, or I didn’t know, that leadership roles are what you make them. So this isn’t very researchy, or even like UX-y. I think this is just like a fundamental leadership-y thing, where no two people are going to do the same leadership role the same way, whether it’s the head of research, the CEO, the COO, whatever. You’re never going to do it the same way as somebody else. And that’s actually a really good thing, because the situation may call for exactly what you can offer. But because of that, if you’re looking to other people to decide whether or not you’re going to be suited to doing that role, it’s kind of like thinking about whether or not you should be a writer based on whether or not you can write exactly like, I don’t know, Mary Shelley. I love that that’s the first one I thought of. But she did it her way and she wrote Frankenstein, right? And then you’re going to do it your way and maybe not write Frankenstein. And just because you can’t write Frankenstein doesn’t immediately invalidate whatever it is you are going to write.

    So that was a really long answer for why are you doing this job? But it was like a personal growth moment where the very thing I was avoiding, I had to confront. And I learned a lot about it.

    Steve: Maybe there’s some blurring between what we think we can’t do and what we think or we know that we don’t want to do.

    Celeste: That’s beautiful. Thank you. Oh, yeah. I agree.

    Steve: Because I think you’re talking about, like, you’re looking at others, right, we all compare ourselves to people, whether we compare ourselves to Mary Shelley or someone who’s been a head of research. But just to reflect back and kind of make sure I understand, you’re kind of calling out, well, you didn’t want to do the job because you didn’t want to sort of spend your energy and time and focus doing the things you saw other people doing. But the aha is that there’s a way for you to be in a leadership role that isn’t those things.

    Celeste: That’s exactly it. The things that you’re good at are the things you should be doing as a head of research or as literally any other leadership role. And you should be surrounding yourself with people that are good at the things that you’re bad at, because together you’re going to make beautiful music or write Frankenstein, you know, whatever. Right. Whatever you want to extend there. I really like that that’s the first person I thought of.

    Steve: That’s a good improv moment. Don’t think about it. Just say it. I want to ask about “mission driven.” I think you’re saying, you know, once you sort of had a taste of that at Twitter, that became important to you. It was. And you wanted to. I guess I want to ask, is there a distinction between mission driven as just kind of a cultural quality and like the specific mission? Which of the bits of it are the ones that are calling to you so strongly?

    Celeste: No shade to the broader corporate lifestyle, or maybe much shade. I don’t know. But I think every company has like a mission and a vision and like values, and there’s lots of pomp and circumstance around those things. And I think the distinction is in the discussions that they’re integrating into every day. So at Oracle and Symantec, as much as I loved the people there and I respect and admire the work that they were doing, we weren’t talking about the broader, like, what are all of us collectively across all these products trying to do together? What is our ultimate goal beyond making money? Like what’s important to us? What is the legacy we want to leave as a company? And yeah, it exists somewhere, written down in some corporate wiki. But it just wasn’t a part of the day to day conversation.

    Whereas like at Twitter, we were obsessed with being the global town square. We were obsessed with enabling people to discuss and connect and like communicate. And it was really exciting and invigorating to be working side by side with people that were like, I don’t know how we’re going to do this, but we’re going to do it and it’s going to be amazing.

    And then at Airbnb, belong anywhere isn’t just like an advertising tagline. Like everybody is talking about the ways that we’re going to make people feel more or less like they belong based on design choices, strategic directions. Like it’s infused in everything.

    And it’s the same with Robinhood, ours is democratize finance for all. We talk about it literally every week. It is a constant in every meeting. The language emerges when we’re weighing trade-offs and thinking about it.

    And it sounds a little culty at its worst, but I like to think that at its best it’s a force for good. I think you are what you measure. And so when you’re running a lot of experimentation and you’re looking at all these metrics that maybe build up to something that you didn’t actually want to aim towards, but just made sense in the individual examples, you can return to the mission as like, okay, but are these metrics democratizing finance or are they doing this other thing over here? Have we lost sight of it? Whereas at Oracle, at Symantec, I felt like that was not necessarily the lighthouse.

    Steve: Let me throw a different metaphor in here because we’re doing so well with them.

    Celeste: We’re killing it with the metaphors.

    Steve: I wouldn’t go to a, well, I guess we’d have to turn on a time machine, but I wouldn’t go to a Grateful Dead concert for absolutely anything. But if I was going to go to one, going with a friend of mine that loves music and loves the Dead would be the way to do it, to be in that experience with someone who is into it. I’m using that as an analogy for the mission-driven thing. I don’t care if the Dead, and whatever the Dead is about, is the mission. I don’t actually care about it. It’s not my mission. Belong anywhere might or might not be my mission.

    But if you’re going to go to a concert or if you’re going to work in an environment, one where that kind of passion and commitment and attention to detail and thoroughly building out every aspect of decisions being made based on that, you’re kind of highlighting how powerful that is and how rewarding that is. I guess I’m asking, is that still true if the mission is one that you are ambivalent about, say, versus 100% bought into? That’s what I’m probing on here.

    Celeste: Yeah. First of all, I love the metaphor because you’re right. You could go to a show, and if you went to the show with your best friend and they were crazy about the Dead, you would have a completely different experience because energy is a very human thing. Energy is infectious that way. The passion, the enthusiasm, you can’t help but kind of like, “Oh man, I was going to say ‘catch a whiff.’ That’s a little too on the nose with the Grateful Dead reference.” But we’re going to do this all day.

    Yes, I actually do think it can still be true if you are fairly ambivalent about the mission. I don’t think I woke up in 2021 and was like, “You know, what really needs to happen in the world is finance needs to be democratized for everyone, for all.” I don’t think that I woke up feeling that way. I definitely don’t think that I looked at Twitter’s mission and was like, “Man, I have never been more like — it is my life’s calling to be a part of the global town square.” I think the difference is that the mission has to resonate at least a little bit. You have to be like, “Yeah, of course. I mean, yes. Do I agree with the idea of democratizing finance for all? Absolutely. I love that.” I didn’t know I wanted to do that until I started looking into it. But it doesn’t make it any less important to me, especially if I’m surrounded by people that are all agreeing and believing in the same thing. So it helps, but I don’t think — it’s not like you have to be born with that passion in mind in order for it to be infectious. I was looking for a less disgusting word, but that’s what we’re going with.

    Steve: When we’re talking today, start of spring 2024, how long have you been at Robinhood?

    Celeste: Oh boy. How long have I been at Robinhood? I’ve been at Robinhood for two years, and I’m sighing deeply because every year is getting shorter in my life. Just like Pink Floyd promised. I’ve been here for two years. I just celebrated my two-year anniversary in the beginning of December, so a lot has happened during that time. I started in December of 2021. A few months after that, the head of research left.

    My job changed every two to three months for probably over a year, which isn’t inherently weird, except that 2022 happened, which meant that the crypto markets, the stock markets, everything was going down. There was a huge wave of layoffs across tech. We also had to lay people off. All of that is super difficult as an employee, but also as a leader of a team. It’s really, really tough, especially when you spend some time thinking that this is not what you wanted to do. So when I look back at the two years, it feels like more than two years, but it also feels like I just blinked and two years passed. So there’s a lot of cognitive dissonance in that length of time.

    Steve: What’s the cognitive dissonance?

    Celeste: It feels like a lot and like a little at the same time.

    Steve: Yeah. So over those two years, what are some bricks that you’ve laid?

    Celeste: So the team predates me by a lot. It’s not like I built the team and built all these processes. It was originally put together by my predecessors. I deeply appreciate everybody who did that for what and who they left behind. Those have been gifts. But the bricklaying in this case was that Robinhood had just IPO’d over the summer and was sort of nestling into “everything changes when you IPO.”

    Airbnb IPO’d while I was there, Twitter IPO’d while I was there. Everything changes. There’s a lot more you need to do. Because your funding looks different, there’s a lot more that you need to be held accountable to. And so a lot of process changes when that happens. The research team had also been through a lot. There were some big, dramatic leadership and team changes during the entire year of 2021, and I showed up in December. So a lot of my bricklaying, I wouldn’t say it was like building a team from the ground up, but it was sort of like there was a lot of healing that had to happen as a group of people.

    When I started, I was hearing a lot about people not trusting each other, or feeling like, if so-and-so got promoted, why didn’t I? I don’t even think their work is good. There wasn’t really very much teaminess. So a lot of the bricklaying was rebuilding cohesion and trust, thinking about things differently, re-evaluating tooling, because when you’re laying people off, you’re also reassessing the budget and the tools that you need or don’t need. So a lot of really tricky stuff like that happening at the same time. And then kind of re-evaluating research’s relationship with the company, which was a pretty tall order, but it’s been fun.

    Steve: What are things that you can do to build teaminess?

    Celeste: Not sure I have a good answer to this because I’m going to give you that delicious research answer, it depends.

    Steve: Yes!

    Celeste: You’re welcome.

    Steve: You’re going to ring a bell right now.

    Celeste: Yeah. Someone somewhere is furious at me for this, but it depends on the situation. We had layoffs at Airbnb when I was still there and it was devastating and we had to sort of rebuild our emotional baseline also. And it was tough to find all the loose ends and figure out what work still needed to be done, what work had fallen off and that was okay, who we were without the people that we had lost and things like that. It’s really, it’s hard to be laid off and it’s very hard to lay off. It’s just like a no one wins situation.

    So in that, in like the Airbnb case, the teaminess came from just, actually it might be the same as Robinhood, being consistent. Everybody showing up and being exposed to each other and just talking about what was hard, what wasn’t working, do people have ideas, here’s what I’m doing, what are you doing, but being really active and pushing for contact and like pretty regular consistent contact and trying to foster moments of recognition when something was going well. There was a lot of stuff like that.

    I don’t know if I have like a silver bullet answer though, because it depends so much on like the way that things are bad or that need healing. We definitely went from people not being willing to help each other because they were worried about not getting credit, to people helping each other without thinking twice about getting credit, and those were signals to me that, like, nature was healing. We’re on the right track.

    Steve: I know you’re apologizing a little bit for the answer not being a full list of three things. Like, you didn’t say, oh, we had an offsite or we had a cake. But I really am struck by the fact that you’re talking about intentional ways of being that are maybe smallish, but that are sustained over time. That sounds much harder and much less obvious to come up with or to execute. And that seems like, well, if you want to make change, it does depend, but there’s a set of tools that you’re drawing from, and a set of principles that is, I don’t know, subtler than a fix-it kind of approach.

    Celeste: Fixing it, especially with something as fragile as like people’s chemistry and sentiments feels like a fool’s errand. Like coming in and being like, let me show you how to do things. I’m going to fix these feelings. That’s not really, I don’t think you’re going to get anywhere if you approach it that way. Someone has probably done that, but I don’t think that’s possible in the toolkit that I have. Maybe Mary Shelley’s done it.

    Steve: She was a great management and leadership consultant in her coaching business.

    Celeste: Yeah, good call. What comes to mind is that there were two things that were similar about both circumstances. One was sort of the community that I’m talking about, which like you can’t inauthentically build community. You have to push it through connection. You have to get people to connect with each other. And sometimes it’s awkward and it’s not mandatory fun. That doesn’t work. But if you can give people real reasons to show up and be there for each other and be honest with each other, you’re going to, it would be hard for that not to move in a positive direction. I think human beings just crave connection and community. Even the introverted ones, it turns out.

But the second thing is, this is a businessy answer, but just hear me out: transparency. Here’s what I know. Here’s what I don’t know. Here’s what I’m doing. Things like that matter, especially during layoffs or moments where the team has really lost a lot of trust in each other, the situation around them, the circumstances. They know that you’re doing everything you can to contribute to a shared understanding. Again, it’s hard not to move in a positive direction if people feel like you’re being as honest as you can with them. And that opens the opportunity for, “I’m having a hard time with this, and I didn’t want to say anything until I felt like you were also showing up and spilling your guts about what’s happening.” So yeah.

    Steve: You also mentioned re-evaluating researchers’ relationship with the company. What is that about?

Celeste: The company loves research. I don’t think the company will ever stop loving research. Some of that, again, I love my predecessors dearly and their approach to it; I got really lucky. But research was always going to have a place because our co-founders started doing research themselves. When they started the business, they focused a lot on listening to people, on observing people do things, on very researchy things. So the research team was always going to be a core function at Robinhood. I didn’t invent that. I’m not going to create that. But when I started, the company had grown a ton very quickly, and every kind of PM, GM, executive, and marketer up and down the chain was, and still is, asking about research. We need to do it on every last thing.

    We have an unholy amount of opportunities to be user-centered in what we’re doing. And the tone for that is consistently being set at the top, which is delightful. And I don’t feel like I’ll ever see that quite in the same way in my career. So it just feels like I’ve struck gold. They’re always asking about our customers and their perspective. They know that our co-founders are doing the same. Everybody’s asking about this.

    But when no one ever questions if research should be involved, and if everyone is insisting that research study everything all the time, you’re faced with a different set of problems. Your relationship to the organization is different because researchers, as I’m sure you’ve seen this too, we’re so happy to be included and consulted because we’re so unused to that that our relationship ends up being, yes, I’m going to say yes to everything because I’m so excited to be in this position and be consulted and be listened to.

And so that’s going to take anybody who starts at Robinhood by complete surprise. Everyone’s always like, “You said we had a seat at the table, but holy smokes, we really have a seat at the table.” If you’re not prioritizing, not saying no, you end up spreading yourself really thin. You end up studying things you really don’t need to study because the leverage isn’t there. It’s not some sort of force multiplier all the time. Not everything needs research.

Actually there’s a general manager that I work with who loves to use this metaphor. Speaking of metaphors, he likes to say, “If I’m opening up an ice cream shop, do I really need to do research on whether I should offer vanilla? I just know that vanilla should be on the menu. So do we really need to do research on that, and why?” My response to that, of course, is: on the vanilla answer, sure. But do we know that where you’re opening the ice cream shop is actually a place where people are interested in vanilla? Because you’re assuming a really narrow group, maybe it’s Americans only, but whatever. There’s a group of people that like vanilla generally, universally, but very generally. Is that target market where you’re opening up your shop? On top of that, is it a fancy vanilla? Are we using elaborate beans from some rare island or whatever? Or is it just straight-up vanilla, no frills? What is resonating? What’s needed based on the context of the situation?

    But also I kind of agree with him that maybe we don’t need to do a study about vanilla at all. Maybe we need to understand everything around the vanilla. And that to me is I think the relationship with research that is ongoing and needs to change. If you can’t, as a product manager or some other leader, have a perspective on something without research, I feel a little uncomfortable with that. You should have a point of view. I want information to inform it. But if I don’t, because it’s a fairly inconsequential thing, I think you should still be able to make that choice anyway so that I can work on the stuff, my team can work on the stuff that’s the most important based on the things we’re good at. Not every question is going to be able to be answered by research. So that’s what I mean by the relationship. And it’s nobody’s fault that this is the state of affairs. It’s just what happens when the pendulum swings really hard in the other direction.

Steve: Right. You talked a little bit about saying no, but in the vanilla example, it’s almost like, hey, here’s seven more questions that are harder to parse out, riskier if your assumptions are wrong, and that create more context. The vanilla question begs a bunch of larger questions. And so that’s not saying no, though. That’s like, no, we’re not going to do the vanilla thing, but hey, how about…

Celeste: There you go. That’s what I mean. It’s not that I think we should be disengaging from every part of the conversation. It’s just that we need to be asking the right questions. And sometimes the questions that we’re being asked, that we say yes to, are still vanilla ice cream questions, just the baseline yes or no vanilla without the nuance, the subtext, all the stuff around the thing. And again, it’s just because research is such a part of the product building and marketing culture at Robinhood that people come to you with very strong requests about doing vanilla ice cream research. And we just have to keep making sure that we’re working on the things that are the most important from a business, UX, and timing perspective, to make sure that we’re not just answering lower leverage questions.

Steve: What’s the approach to changing that relationship in the way that you’re articulating?

Celeste: I spent a lot of time with product leaders and GMs and folks like that, talking about what the most important decisions they need to make and questions they have are, and then kind of going over… I’ve been made fun of before because I’m like, that’s an experimental question. Straight up, just run an experiment. It’s faster and more reliable. You’ll have more confidence if you run an experiment. People laugh at me because they’re like, wait, don’t you collect data a different way? Yeah. Just because I’m a hammer doesn’t mean everything looks like a nail to me. So I spend a lot of time doing that, because it does need to come from the top. But then also making sure that people have the tools to be able to feel like they can say no, because a lot of researchers feel like it’s going to damage the relationship between them and their product teams if they’re not able to do everything that’s being asked of them, which is disheartening. So we work a lot on that.

    We talk a lot about what a higher leverage question looks like or how to meet the short-term moment of what the team needs while making it into a longer-term study also. So I don’t mean in length of time, but I mean, I’m sure you’ve heard this a bunch, Steve. You know how there’s this debate within research about whether strategic or tactical is a better use of our time? I find that interesting because I don’t think they are separate. I think that you can make sort of like what people would refer to as very highly tactical research like, “Do people understand this? Whoa!” You can make it strategic by asking questions inside of it and putting together a broader narrative. If you’re telling me that like, “I’m too senior to do usability testing,” or whatever it is, I’ve heard that multiple times, which is unholy. But it tells me more about how you’re thinking about structuring your research in a way that you’re thinking about insights and what value you bring to the team. It tells me a lot more about that than it does about your seniority or anything else. And so it’s not about like doing less usability research, but it’s about doing the research that really needs to be done. And so we talk about how to prioritize, we talk a lot about like this over that, and we’re just pretty open about what’s important and why. Back to your point about transparency, I think. I’ve been wrong before.

    Steve: Yeah? Not in this conversation.

    Celeste: Probably several times in this conversation, but never about Mary Shelley. You’re welcome for that.

Steve: I’m thinking about this relationship between researchers on your team and any particular product team and so on, where, yeah, what you said, there might be a near-term question, but there’s a way to do that research that also… Either you refactor that question into a higher-value question, or, and you didn’t say this, but I am going to say yes-and, you do kind of a yes-and of the short-term implication and the longer-term implication. But this whole exploration of what research is being asked for and what research provides makes me think about questions that come up, like, do we give recommendations? What are the outputs of research? And I know it depends. Of course it depends. But does this area provoke anything for you, a point of view, or something you’re thinking about?

    Celeste: On the topic of recommendations, I’ve seen at some companies the idea that if people aren’t taking your recommendations, that you have not had impact, which I find charming. Like I personally do not want to work somewhere. I’d be really concerned if someone was taking 100% of my team’s advice all the time. We’re looking through a really specific lens. There’s all these other inputs to consider. It’s really great when the research is informing a decision, but sometimes the decision is not going to go with what the research is recommending. And I don’t think that’s a failure on the researcher, as long as everybody’s aware of the trade-offs and everyone understands the insights.

    And by insights, what I mean is like what the data itself translates to for us as a product, as a business, whatever the case may be. It’s okay to have wrong recommendations, but it would be weird if you were taking my advice all the time because it would mean that you’re really only considering one input or weighing it heavier than others. So that feels weird, but I do think, to your point about it being a bit of a controversial topic, I do think we should have recommendations, like be giving recommendations, not because we’re all geniuses, although maybe that’s true. But because we are the most informed about what the data means, what we can say and what we can’t say based on what we did, thinking about things like intellectual honesty and rigor and all of that, I don’t know why we would ever think that we were not suited to making some sort of recommendation.

    Because other people are going to take the information that you give and then just make their own recommendation, but that’s filtered through an entirely different lens with way less of the broader context that you’ve collected. It’s okay if your recommendation is bad, and you should probably shape it in a way that makes it shelf-stable, right? If you say, “Move the button to the right,” that’s not shelf-stable, because at some point someone will probably look at your research years down the line and go, “What in the hell were they talking about when they made that recommendation?” But people preferring design A over design B is just sharing data. People preferring design A over design B because the information they were looking for was a lot easier to find, that’s more shelf-stable, right? You understand why, you understand the context of the insight itself, and even if I don’t do exactly what you say, maybe I end up going with design B anyway, maybe I can make design B better based on the way that you frame that. So there’s a lot in there.

    Steve: I want to pick out what a recommendation is, because people prefer design A over design B, that’s data. People prefer design A over design B because they can find the information. Okay, that’s an insight. So then the recommendation would be, go with design A.

    Celeste: Right. Yeah, but you have to have that other piece in there for it to make any sense at all.

    Steve: We should go with design A because it helps people to find the information they’re looking for.

    Celeste: Words are hard, Steve.

    Steve: I don’t know, just as a counter-argument a little bit, and I think you’re on board with this, the risk of being wrong there is you only know what you know about that. So design B makes us more money, design B we can implement faster, design B is compliant with something, design B is consistent with what we’re doing in three other platforms.

    Celeste: Absolutely.

    Steve: So I think that’s an example of them not going with design B and that being okay because you’ve given them the information that you have to make that recommendation.

    Celeste: Yes. So I would argue that if you’re not incorporating things like which design would make us more money or if you’re not incorporating those pieces of information into your process, you’re leaving value on the table. You should be helping the person who needs to make the decision make it with all that context in there. Anything you can help them with. You have an amazing sense of synthesis and distillation and not everybody’s fantastic at that. So if you can pull that in, you probably should do that. But yes, you’re right that even if your recommendation is wrong, it’s not always that you’re completely incorrect. It’s like not right now because we would prefer to make more money with design B or whatever it is. But you’ve planted this seed and I understand that what people need is not what this is providing and so we can get successive approximations closer to it.

    Steve: Right. I think the recommendation discussion sometimes leaves me cold because it loses all the context that you’re providing. It’s the recommendation and the insight in the larger context of the set of things that might be factors for this team. And I think you’re saying you should try to know all of them.

    Celeste: Why not?

    Steve: I’ll just add there’s always going to be something that you won’t know.

    Celeste: Of course.

Steve: I don’t know, it reminds me of, like, trying to write like Mary Shelley. If you do creative writing and you do workshops, you get feedback that says, here’s what the problem is and here’s what the solution is. And I think it can be really helpful when you’re the writer to choose not to do that, to ignore the advice in an informed way: well, that’s not what my objective is, that’s not my strategy, we need short-term wins here, whatever. And I think we’re agreeing strongly here. It’s just that some of the language gets oversimplified when it gets, you know, boiled down to a social media post.

    Celeste: Yeah, I think this particular topic is one that is not great for like a LinkedIn rant. And yet, it’s all we see.

    Steve: But great for a podcast episode, right? Because we can dig in.

    Celeste: Sure.

Steve: You brought up another one that I take umbrage at, which is, you know, we’re not having impact if it doesn’t get taken up. And I liked what you said. I’ve also heard people say, my research doesn’t have value if no one takes action. It’s even more binary than that, right? Impact is sort of a higher level than value: my work is worthless if someone doesn’t do the thing that I told them to do. And I feel so sad, because that’s researchers saying that, as if we don’t have enough people telling us we’re without value. We’ve sort of, as a profession, taken on some of that, willingly devaluing ourselves proactively: I’m going to devalue it for you. It’s depressing. And that makes me sad.

Celeste: Yeah, I agree. And, not to knock anyone, I’m sure that the folks saying this are perfectly smart people. But to me, when I hear that kind of stuff, it feels like it lacks curiosity. You’re saying that unless there’s something I can see in front of me, or unless there’s a direct action being taken, I have failed. When I think, if you had a little more curiosity about it, you could notice little hints, like, well, but these people are changing their language. They’re using the language of the people that we’re studying instead of whatever business bullshit we’ve come up with. And that’s impact.

I think there’s also, we’ve changed the roadmap, where we’ve just decided not to go this route and saved three months of engineering. So it’s almost that inaction is action. There’s lots of stuff like that. And I think if you are open to seeing it, it makes itself known to you. But if you are not curious about how your research is sort of creeping through an organization, it’s going to end up flying right over your head. And then you assume you have no value, or whatever it is you said. It’s depressing.

Steve: You know, the same way that you talked about bringing teaminess, it seems very analogous to the way you’re talking about both creating and sort of realizing value: that it’s smaller signals over a longer period of time. I’m not sure if the constancy thing applies here or not.

    Celeste: Maybe.

Steve: But rather than a binary outcome kind of thing, yeah, taking it slower and looking for those more subtle signals.

Celeste: Yeah, you’re reminding me, I’ve had this conversation a few times after becoming a manager, where people are like, we just shouldn’t do research in this area because they’re not valuing it the way they should, where often the subtext is they’re not taking my advice, or they’re doing things in a way that I fundamentally disagree with. You probably know this better than anybody, but this is not a field for the faint of heart. This is a full-on eternal marathon. It took me three years at Airbnb and many different approaches to convince people that non-drip pricing was the right call. It cost us a lot of money and we had to find ways to offset that. And every time I got knocked down about it, I could have chosen to never bring it up again.

    But it’s chipping away. It’s finding advocates. It’s finding other people that agree or believe it. If I can convince one person each time of something that I thought that I believe in, I’m doing great. That’s impact. And so you’re right that there’s like little subtleties, but it’s also just a long game. Like all of this is a long game. You’re lucky if you have a short term win. You’re super lucky.

    Steve: Anything else to add today?

    Celeste: No, I mean, thanks for inviting me to do this. It’s very flattering to talk about my thoughts and feelings on things we mutually like for however long this just was. It was very nice. Thank you.

    Steve: Yeah, thank you. You shared a lot of interesting perspectives and good history. So yeah, thanks a lot for taking the time.

    Celeste: This was awesome.

Steve: Over and out, good buddies. That’s all for this episode. Thanks for listening. Recommend Dollars to Donuts to your peers. You can find this podcast in all of the usual places. A review on Apple Podcasts helps others find it. Go to portigal.com/podcast to find all the episodes, including show notes and transcripts. Our theme music is by Bruce Todd.

    Celeste: Did you say a favorite puppet?

    Steve: Favorite puppet.

    Celeste: What came up for you just now?

    Steve: You know so many things about so many things and I was just trying to open up the possibility space.

    Celeste: Favorite puppet. I’m going to be thinking about that. What is my favorite puppet? Now I know exactly. It’s the, I can’t remember the other guy’s name, but it’s the two angry balcony dudes in the Muppets. One of them is named Astoria or something like that, or Waldorf, and the other is named after another hotel and it’s just not coming to me. But those are.

Steve: His name is Statler. Statler. I just Muppetsplained you here.

    Celeste: You win. You win the Muppets.

    Steve: Yeah.

    Celeste: I also just like that there’s no, nothing is sacred to them. They just go after whatever, whatever is in front of them. And they do a lot of like lame dad joke moments too.

Steve: That’s excellent. I’m not going to get any actual puppet content.

Celeste: Puppet content.

    Steve: So thank you for, thank you for taking that literally. I didn’t know what you were going to share, but…

    The post 42. Celeste Ridlen of Robinhood first appeared on Portigal Consulting.
    12 April 2024, 5:38 pm
  • 1 hour 2 minutes
    41. Carol Rossi returns

    In this episode of Dollars to Donuts Carol Rossi returns to update us on the last 9 years. She’s now a consultant who focuses on user research leadership.

    I’m happy making the contributions that I’m making, even though it’s hard to directly measure impact. I’m hearing from people that they’re finding value from the work that we’re doing together. I want to leave people with this idea that the work that we’ve done together is valuable to them, whether it’s tomorrow, or two years from now they see value in it in some way that they couldn’t have anticipated. I’m trying to be as clear as I can about focusing in the areas where I think I can make the best contribution and have the most impact. And keep reexamining, how do I feel about the work that I’m doing? And what am I getting back from people? – Carol Rossi

    Show Links

    Help other people find Dollars to Donuts by leaving a review on Apple Podcasts.

    Transcript

    Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization. I’m Steve Portigal. In this episode, I catch up with Carol Rossi, nine years after she was first on Dollars to Donuts.

    There’s a bigger and better new edition of my classic book, Interviewing Users. As part of the launch of the book, I spoke with Russ Unger for his Tent Talk speaker series. Here’s a little clip.

    Russ Unger: What’s your approach to ensuring that the feedback gathered from user interviews is effectively communicated and incorporated into the design process?

Steve: The first part of that, I think, is that you have to do something. You have to make sense of what you gather. Some of this kind of goes to the maturity of any individual practice. I think the less experienced folks are, the more they want to just take what they remember about what was said and type it up. And that’s stenography maybe, or collation, as far as you get. You put these pieces together. And then you’re just taking requests or gathering complaints. You might as well use a survey for that. I think it’s the iceberg model, right? Some of it is above the surface, but a lot of it is below the surface. Below the surface means going back to what was said and looking at it and making inferences. What wasn’t said? How was it said? What was said at the beginning and what was said at the end? And that’s just within one interview. What did person A say? What did person B say?

    And there’s a whole new chapter about this. It’s the analysis and synthesis process. And some folks say that the ratio should be two to one. For every hour of the feedback that you gather, you should spend two hours analyzing and synthesizing. And I think in a less evolved practice, it’s the inverse. You might spend half an hour for every hour or even less. The caveat here is not every research question merits this. If we are looking for, I don’t know, choice preference between something and something else, we might be really clear about what that is. We come back and say, do this.

    But for anything where we want to understand why or understand opportunities or understand motivation, a new space you want to go into, characterize a customer that we haven’t worked with before, it really is worthwhile to go and do this analysis and synthesis.

How do we have impact? We have to have something impactful to say. I just want to say that. Some other factors that I think can make or break it: working collaboratively, before you do the research, with stakeholders, the folks that you want to inform and influence and who will take action. And so having an understanding of what the business challenges or business goals are, like what are we trying to do as a company? And then formulating really good research questions: what are we going to learn in order to inform that? And then choosing methods and approaches that can support that, and not doing that in a vacuum. This has the effect of switching your role from being reactive to proactive.

    I think it’s hard to have an impact with reactive work. Those requests that come are often late. They’re often based on a shallow assumption about what kind of value research can provide. And so you are going to give a thumbs up, thumbs down in some direction. So your sort of role as a provider of these kinds of insights is diminished. If you can be proactive, which means maybe understanding a roadmap or what decisions are being made or who else is going to do what and proposing research on your own roadmap that is intentional and is ahead of time, you leave space, of course, for things that come up, fire drills and so on.

    But trying to work in a proactive, collaborative way, aligning on goals and then putting the effort in to make sense changes the whole conversation about what you’ve learned. You get to that point of sharing with somebody.

    That’s part of a larger Tent Talk. You can check out the whole show and definitely buy your postal carrier and barista their very own copy of the second edition of Interviewing Users. If you want to help me out, write a very short review of Interviewing Users on Amazon.

    Over the last couple of years, I’ve been partnering with Inzovu to run training workshops about storytelling. Storytelling is an essential human skill that powers how teams work together with each other and with their colleagues. I’ll put a link in the show notes with more info about what I’ve been up to with Inzovu. And if storytelling is something you’d like to build up in your organization, reach out to Inzovu or to me.

Okay, let’s go to my conversation with Carol. She’s a consultant with a focus on user research leadership. Carol, welcome back to Dollars to Donuts, nine years after we last talked. It’s great to talk to you again.

    Carol Rossi: Yeah, thanks, Steve. I can’t believe it’s been nine years.

    Steve: Time does fly. Let’s talk about those nine years. You know, what’s been the shift in your evolution in your professional world since then?

Carol: When we last talked on the show, I was at Edmunds and I was leading the UX research team there. I had been there at that point, I guess, four years. I had started the team there and then ended up staying at Edmunds until 2017. And then I took a moment, because I’d been there for quite a long time, to ask myself what I wanted to do next. I call it my gap year. So I did some contract work as well as consulting, helping people think about how to set up a team.

And then in 2018, I went to NerdWallet, and that involved a move. I was in LA for the bulk of my career; in 2018, I moved to San Francisco for the job at NerdWallet. And that was an established team that I led for about four years. I mean, we can go into detail about any of this stuff, but basically I left NerdWallet in 2022 and started a consultancy where I’m now focused on helping companies and leaders know how to get the most impact from research.

    Steve: Can we talk about NerdWallet a little bit and then talk about your consulting work now?

Carol: Yeah, sure.

    Steve: So it was an established team. Is that right?

Carol: Yeah, it was. There were three people on the team, and there was actually an open headcount when I joined. We ended up doubling the size of that team. So we still remained a relatively small team, but we did get some additional people. I think some of the work that I’m really proud of there is that we went from having these researchers doing very siloed work, where even though they were all researchers, they were hardly working with each other, and developed that team to the point where we had a lot more strategic impact. We started a voice of customer program. Two of the people on the team became managers during the time that I was there, so they saw a fair amount of professional growth. And when I left, there was this voice of customer program established, as well as a program to train designers and PMs and content managers to do some of their own research. On the market research side, they were doing some brand work, and we were doing some explorations about how that played out in product. So there were more of these horizontal activities we were doing, and also empowering some people to collect their own insights, as well as deepening the impact of our team.

    Steve: When you talk about coming in and the researchers that were there were siloed, my mind starts to go to that embedded word and what that means. But I think you’re talking about siloed in a grander scheme of things. But I don’t know, what does siloed look like then?

    Carol: I think it’s a really good distinction. The difference between siloed and embedded to me is that embedded can be and is a very valuable way to participate in a product development team.

So we ended up with this sort of hybrid model, I would call it. Because at the time that I left, the team was ultimately reporting to me, but they were dedicated to specific focus areas. So we had one person working on the logged-in experience, and that involved maybe three pods. We were calling them pods, but squads, product trios, whatever language we use to talk about the combination of the PM, the designer, the content strategist, and some number of engineers. So we’d have one researcher per, let’s say, three of those pods, but they were all within a focus area.

So one was dedicated to the logged-in experience. We had, for example, a couple people working on what we call the guest experience, or shopping. I should say NerdWallet is a company that provides advice and products to consumers who might be looking for financial products: a credit card or a mortgage or a personal loan or whatever. So you can either go and read some articles and then get linked to some potential credit cards for you based on what you’re interested in and your credit score and those kinds of things, or you can download the app, log in, and get tailored advice based on your specific situation. Those, at the time, were separate areas of the company in terms of the way the development was divided up.

    So I think embedded to me is there’s a very healthy relationship with those pods where the researcher is either dedicated to one or maybe crosses over a couple of those areas, of those pods. But siloed to me is people are working on something so exclusively that maybe there isn’t a lot of conversation across. And I think what you lose in that kind of model is opportunity to take advantage of research that might be going on in an adjacent area or even a very different area but has relevance to what you’re doing.

    And so you can have a lot more efficiency across the research function if you’re not re-doing work, you know. Or people are learning techniques from each other, you know. Or people are partnering so that there’s some broader impact across these different focus areas. So there might be — because to the consumer, to the ultimate user, the customer, they’re not seeing it, right, as these sort of separate areas. They’re seeing it as one experience. And sometimes in order to do product development, you have to divide things up.

So how do we keep the flow and the things that need to be similar across the experience and have it make sense by looking for those areas of, you know, similarity or continuity or whatever the word is that you want to use there. Some of the things that we did worked really well. So first of all, I should just be really clear: because it was a manageable team, I mean, a small enough team, we could do things like have team time every week where researchers felt like they had a dedicated, you know, I think it was an hour or something, but a dedicated time where they could talk about some of the stuff they were doing, present problems to each other, learn from each other, like have time to be able to say, I’m doing this thing, I think there might be some relationship to what you did last year or what so-and-so did who’s not even here anymore, and what can we talk about there.

    So I think there’s — with a small enough team, you can definitely have people, you know, embedded or partially embedded within specific areas so they’re having maximum impact in those areas, but still conversation across. So I think that’s one thing that we did. Another thing we did was have kind of a loose repository. We weren’t using a really fancy tool. We just literally had, you know, a wiki where all of the research that was done was available. So people could go in and see what had been done and see if there was something relevant to them. And that could be like product managers, designers, anybody could go in and look at that and see. And then they’d usually come back and ask us questions. Hey, I saw this thing, you know, I wonder how that can be relevant to our team. So I think there are a few things that you can do.

    Steve: You mentioned that you put in programs to teach other folks who are not career researchers to do research. What did that look like? How did that work?

    Carol: I think the way that I’ve seen that work well is to create, when I’ve created it, a three-part workshop series. And so we start with these three workshops and then we do ongoing coaching. So it’s not just a matter of taking a, you know, a training session. And the first workshop is really setting up the research for success. And so that’s really about planning a study. So there we talk about starting with the business objective. You know, people will often start with a research question, like, we need to know X. Okay, well, why do you need to know X? There’s some business reason why you need to know it. So what’s the thing you need to know? Why do you need to know it? What decisions will be made as a result of that? And then what’s the best way to get that answer? And obviously, in what timeframe do you need to know it, and those things as well.

    But starting with that framework to give people an appreciation for the fact that we don’t just run a study because we have a question. We kind of put context around it. Even if it’s lean, and I’m using the language of run a study, but the language that some people are using is having conversations with customers or collecting insights, whatever language people are using. It’s the same thinking. And in that first workshop, we talk a lot about reducing bias, making sure we’re not asking leading questions or, you know, the way that we’re writing a task or a prompt that we’re going to put up on an unmoderated tool for a participant to engage with, whatever. We talk a lot about how to do that in a way that those are going to be effective. And by the end of the first workshop, everybody has a lightweight research plan. I give a template. So there’s a template that has all those elements in it. And there are a lot of tips and tools and sample questions and sample tasks. So it’s pretty plug and play, but the foundational understanding is there in terms of, you know, not introducing bias and some of those other elements.

    The second workshop is literally run a study. So when I was at Edmunds and we were doing in-person research, we would recruit a bunch of participants to come in and we’d have designers, PMs, engineers running their own interviews, and, you know, we’d sit and give feedback often. Now what we do is all unmoderated. These workshops are all online now, remote. So, you know, it’s an unmoderated tool and they set up their study in the tool, and then we, you know, wait for the results to come in, the videos or whatever.

    And then the third workshop is, in researcher language, synthesis. And that’s the, like, how do you go from all this data that you just got to actionable insights? So we talk about the data. We talk about findings that come from there. We talk about insights that are really most important. And then we talk about prioritizing those insights according to the business objective, back to the business objective, back to the decisions that need to be made. What are the most important of all of those insights? ’Cause you might get a lot of stuff, you know, out of even a lean study. What are the things you need to take action on? And then we talk about taking action. And again, there’s a template for summarizing the findings of their study, with a table that shows, like, what was the insight? Okay, it was high priority, so we’re going to take action. We’re going to do this thing. Who is going to do this thing? It’s assigned to a team. Who’s the point person? Maybe it’s the PM on the team, maybe it’s the designer. By what date is this thing going to be done? So now everybody on the team, beyond the person that ran the study, has agreed. They go back to the team, have the conversation, everybody has agreed: here’s what we’re going to do as a result. And then that table goes into their summary. And then there’s a way to go back. If the person that’s running the study is not one of the people on that team (in this case, they probably are, because they’re the designer or the PM or whatever), you can go back and see what was actually done. When was it done? What impact was gained by that study? And you can then add the impact to your impact tracker.
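    The insight-to-action table Carol describes (insight, priority, action, owner, date, eventual impact) could be sketched as a simple data structure. This is just an illustration; the field names are hypothetical, not taken from her actual template.

```python
from dataclasses import dataclass

# A minimal sketch of the insight-to-action summary table described above.
# Field names are illustrative, not from any real template.
@dataclass
class ActionItem:
    insight: str        # what the study found
    priority: str       # e.g. "high", "medium", "low"
    action: str         # what the team agreed to do
    owner: str          # point person (maybe the PM, maybe the designer)
    due_date: str       # by what date this will be done
    impact: str = ""    # filled in later, feeding the impact tracker

# Example row in the summary table
item = ActionItem(
    insight="Participants missed the save button on mobile",
    priority="high",
    action="Move the save button above the fold",
    owner="PM",
    due_date="2024-06-01",
)
```

    Going back later to record what was actually done and what impact was gained would simply mean updating the `impact` field on each row.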

    Then there’s the coaching that happens after the training. And that’s really vital to help people seal in the knowledge from the training and get feedback as they go along and ask questions. So sometimes the designer will go to the researcher who led the training and ask, will you take a look at my plan ’cause I’m going off the template a bit and I wanna make sure this makes sense. Or they have a question about synthesis because they got something that they didn’t anticipate and they wanna talk through how to handle it. Or they need help figuring out how to message something to somebody that wasn’t on the team but needs to get these insights. So there are things that come up in real life, and sometimes it’s feedback on something that they’re doing. Like when I was at Edmunds and we were doing live interviews, we’d actually have a conversation after each set of interviews with the people that were running them: how did you think that went? And if we saw something that maybe they could benefit from, we would share that with them. So I think that’s a really important part of it and something I incorporate into the workshops that I do: it’s not just the training, it’s the follow-up coaching as well.

    Steve: I think there’s a lot of hand-wringing off and on over the years about the risks and the consequences of, what did you call them? Non-career researchers, that’s a great term.

    Carol: People who do research, I think is what people are saying now.

    Steve: You know, we talk about the consequences of these kinds of programs that allow non-career researchers or people who do research. If we empower them as we’re kind of sort of the gatekeepers of the skills and the knowledge to do research, which may not even be an accurate framing anyway, ’cause people are doing research anyway.

    Carol: Yeah.

    Steve: here’s sometimes some hand-wringing about unintended consequences or intended consequences. I don’t know, with these programs of these different organizations, were there longer term kinds of changes that you noticed?

    Carol: Yeah, I think it’s a good question. And I’ll just say, I don’t have that argument anymore with people. I have stopped trying to defend how this can work and how I’ve seen it work well, because the fact is, I’m just really realistic. First of all, I’ve seen it work well in this way that we talked about where there’s this sort of training set and then this sort of coaching activity, and there’s a conversation. It’s an ongoing conversation. And so what I’ve seen work well, one of the things that I’ve seen come out of that that’s been really beneficial is that people who have gone through this program tend to have a better sense, when we’ve been in-house, tend to have a better sense of how to work with research and have a better appreciation for the research that the researchers are doing, the career researchers are doing. And that partnership is richer. I have seen it go awry. I’ve seen people go through a few workshops, refuse the coaching, and then do things like put an app in front of consumers and say, “Do you like it?” So it’s not without risk. I totally get that.

    At the same time, I’ve stopped having that discussion with the researchers that are worried about the field being diluted, or I’ve stopped using the word democratize, ’cause we’re not democratizing. We’re helping people do stuff that, frankly, they’re already doing. So why wouldn’t we help them do it better? So I think what, and I see it now in the consultancy, I’m really focused, if I say that I’m focused on helping leaders in companies that maybe don’t have a research leader, or maybe they’ve got one or two researchers, or no researchers, and they’ve got all of these other people out having conversations with customers, why wouldn’t I want to help them do that in a way that it’s gonna be more effective, where they’ll get good data? Because we all know that if you just go out and put an app in front of somebody and say, “Do you like it?” You’re not gonna get, it used to be garbage in, garbage out, right? That language still applies decades later.

    So yes, there are risks. I know what the risks are. I think I named one of them anyway. People just go, “Well, why can’t I do persona research?” Or whatever, probably not the best example, but just helping them realize there are things that you need to know, and I get that you need to know those things, and you’re probably not gonna get what you’re looking for with this method. And so having those conversations, it doesn’t mean that once I leave, they’re not gonna try to do that anyway. I can’t control that. Even if I’m in the company, I can’t control that.

    So I think the risk of people who do research or non-career researchers doing this just without any guidance is greater than the risk of them thinking they can do something that they really need a career researcher for. And I think it’s not, this is not unrelated to, I mean, it’s a bit of a tangent, but it’s not unrelated to the thing that we see where companies think they want research and they hire someone. I’m seeing, I was seeing more of this like before the big sort of layoffs happened starting at the end of 2022, I guess. I was seeing more first researcher roles that were a player coach, kind of lead manager, which I think is great. I think that’s what I would advise clients to do if you’re gonna get one person, make sure they’re at that level.

    But I do still see companies hiring more junior people. And I know what they’re thinking. They’re thinking, we need someone to do some research. So they’ll get someone who’s very smart and very well-trained in their research chops, but they may be at a senior researcher level or maybe more junior than that. And then they’re overwhelmed. They don’t have a sense of the landscape or how to manage in that kind of an environment. They aren’t getting mentorship in their research work. And then there can be, at the company, kind of a, well, that didn’t really work out, so we don’t need research. Instead of research being seen as a concept or a practice, it gets associated with a person. And then they go, we don’t need any researchers, we just need to do this ourselves. And so I’ve seen that a bit.

    And I’ve seen, I mean, I’ve also been, some of the people that come to me for coaching are people in that situation because they’re researchers that are not getting mentorship and they’ve kind of been thrown into this situation where they just don’t have the experience to be able to manage all the pieces that go with it because it’s not just about running studies. And I think I totally get the excitement about being the first researcher, you know, and when someone wants you to play that role. And I mean, it’s, you know, there’s a lot of trust that goes into that. I also know people that are sort of senior researcher level, I’m just throwing these terms out. I mean, it’s all, you know, it just depends on the person, but who would say, and you know, they’re in a career search and we’re talking about their career search and they’re like, I don’t want to be the first person ’cause I know what’s involved in that. So, you know, I think it’s like, yeah, I get why someone would take that job, even if they maybe have like a couple of years of experience ’cause it’s exciting. And I also hear people are probably qualified, you know, who have been working for six or seven years. And again, those numbers are just, who knows, you know, it just depends on the person. And they’re like, I don’t want to do that ’cause I know how hard it is.

    Steve: We sort of shifted in this conversation a little bit to talking about your consultancy. What did you start and why?

    Carol: Towards the end of my time at NerdWallet, I had been getting calls from coworkers asking for help to set up a research program. Like, how do I get started if I want to set up research? And, you know, I was just having these conversations and realizing that I was really excited about this topic, and that the beginning point is really exciting to me, right? So when I left NerdWallet, I started looking at open roles at the time. And they were this, like I was saying, player/coach kind of role, right? And so it’s like you’re doing some of the bigger research while you’re setting up operations, while you’re setting up a roadmap, while you’re setting up, you know, all the infrastructure and everything. And I had already done that. I had done it a couple times. So I realized I wasn’t excited about doing that again.

    And what I was excited about was the leadership components of that. And so the coaching or advising, and we can talk about what I think the differences are there, but, you know, the sort of training, helping people become more self-sufficient. Helping leaders feel like they’re stronger at supporting a research practice, whether they have researchers or not. Again, like we were saying earlier, helping designers, PMs, you know, et cetera, feel confident that they can collect insights. If they’re going to do it anyway, we may as well help them do it well. So those are the pieces that I realized I was more interested in. And also just having conversations with people about the importance of operations and thinking about research ops from the beginning or the middle, wherever you are, and how that can be such a force multiplier, you know, such a way to move forward more quickly by spending some time on infrastructure, tools, templates. Like having some kind of process, having some way for people to capture the insights that they’re collecting and share it, however that looks. Things that are going to help you do things better and faster later.

    So those were the pieces that I was really interested in. And I decided to just go out on my own. I have, you know, I was out on my own for a while through, let’s see, like through most of the 2000s, that looked more like contract research work at that point. And I was doing that in parallel with other work that I was doing that was not tech. But at this time I was like, I’m going to go all in on this consulting model and see what happens. And that was like towards the end of 2022.

    Steve: Since you teased us with coaching versus advising, I’m going to ask you to take the bait. What do you think the difference is?

    Carol: I mean, I think, and this isn’t like, you know, genius. I think this is the way that a lot of people distinguish those. But to me, let me start with advising. Advising is more like I’m working with the head of design or I’m working with somebody, you know, head of product or someone in that team that’s in a leadership role, to help them see for themselves how research can have more impact, again, whether they have researchers or not. And so advising, I think, has much more of a, like, we’re in a conversation and I’m giving them ideas or tips.

    Coaching is more of a, I’m working with somebody on something specific. I don’t do the big sort of life coaching or big-picture career coaching, the “should I even do this anymore” kind of thing, because I’m not trained as a coach where I would do life coaching. It’s more like, you know, somebody is in an ops role and wants to shift to a research role and they have all the training to do that, but people aren’t seeing them as a researcher. What do they need to do with their portfolio, their resume? How do they need to talk about the work? Somebody gets laid off, you know, it’s a surprise, and they’re trying to prepare for their next role. Somebody is, like I said, in a role where they’re the only researcher and they’re not getting the mentorship. They got feedback on a specific thing and they don’t really know how to work on it, and their manager isn’t really helping them figure it out. It’s a very specific engagement around a topic where we can say, here’s the end goal and here are the steps that you can go through to get to that end goal, and what are the milestones that we can look at along the way, even if it’s just like four weeks or six weeks.

    It’s a very specific set of things that we’re doing to get somebody to a particular place. Whereas advising is also there’s a set sort of, you know, sort of a set like arrangement, a number of sessions or whatever. But it’s more like me tossing out advice or ideas, maybe more than I would in a coaching model.

    Steve: I’m going to use a word you haven’t used, but when you talk about coaching, I think a little about facilitation. Whereas in the advising, you have a best practice or an idea or suggestion. In the coaching, you’re kind of working along the path to get this person to articulate specific goals, that kind of thing.

    Carol: It’s kind of like they are going to do the work to get to a certain place. And I am helping facilitate that. And it’s the way that I work with people in coaching, it’s like there’s actually a worksheet that we use. And the worksheet kind of starts with like, what, again, I sort of should distinguish I’m not doing this sort of big picture, like what is my life about, but I do start with like, what’s your mission statement as a researcher? And what is your broader goal over the next few years? And then what are you trying to get to in the next few months, whatever that timeframe is? And that’s a worksheet where it’s like, literally, what steps are you going to take to get there? What, you know, how are you going to know that you’ve achieved that step? So what milestones are we looking for? What does success look like? When are we going to say you’re done with that step and, you know, maybe addressing a different step?

    And so it’s not super linear like that, but it really is, like, you know, a template. And I found that that worked really well. I actually developed the template when I was at NerdWallet, because I found it worked really well for the team to help them think through either the broader, like, I want to get to be a manager, how do I do that kind of thing, or the very specific: they got feedback on a performance review about something and over the next few months they want to work on it. And so it’s a really simple template and approach, but that’s how I keep the coaching engagements to a particular goal that people are going for.

    Steve: So coaching engagements, advising engagements, what are the other ways in which you’re working for whomever?

    Carol: So I do workshops, and I have one workshop that’s really targeted to researchers or, I mean, it could be anybody, but mostly the people who come are like lead researchers or managers or senior researchers or designers. It could be PMs as well. That’s Prioritizing Research for Impact. And you know, there’s a lot of conversation about impact. It’s really the thing that we have had to make sure that we’re measuring, right? And what is impact? We can talk about that in a minute, but the workshop is about how to think through how you’re going to get to impact. It’s not just run the studies that you want. It’s not just run the studies that somebody’s telling you they want. It’s like, what’s the business objective that we’re trying to achieve? What decisions are going to be made if we have this information from a particular study? What do we already know about this? And then we go through this framework based on clarity, risk, and cost. So what do we already know? That’s clarity. What do we still need to know? What’s the risk of going forward without more research, any research? And what’s the cost of doing research? What’s the cost of developing this?

    And there’s a worksheet. It’s really a spreadsheet that we toss all of this information into and have the conversation about each of these possible research projects. And then at the end, you can see what’s high priority, what’s medium, what’s low priority. And then we also talk about who you involve in this prioritization process. How do you involve partners? And then when we get to the end, who’s the ultimate decision maker for research? Sometimes people come into the workshop and they’re like, well, the person who’s making the ultimate decisions isn’t the person who should be, really. So that’s a conversation to have. And then after the decisions have been made, what are some best practices to convey prioritization decisions? Transparency, you know. Share the work, show people how you got to that decision. Hopefully they were either involved in the conversation up front, or someone on their team was who has helped them understand the process. And so nobody is super surprised at the end, ideally.

    And then sharing out the results: like, literally share the worksheet with everybody that needs to have it so they can see what decisions were made, which projects were prioritized against what other projects. And then for each of them, you know, if it’s low priority and you’re not going to move forward, how do you communicate that? If it’s high priority, how do you communicate that? And then we end up with a lot of things that are sort of medium, like we need to do something, but we don’t need to do a fresh study. And so maybe that’s a researcher going to go sift through what we already know, and that will save the team time by not doing fresh research, because we already know a lot about it. So we have high clarity, but it is high risk to move forward without doing anything else, you know, and the cost to do this research, meaning go through this stuff, is pretty low relative to the cost of going through development and getting it wrong, which is pretty high. So pulling those levers. In the workshop, we go through this for like three research projects so people can actually do it. By the end of the workshop, they’ve prioritized three projects, and then they can take that back to their organization and use that tool, the worksheet.
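    One hypothetical way to turn the clarity/risk/cost conversation into a rough high/medium/low bucket is a simple weighted score. This is a sketch for illustration only, not Carol’s actual worksheet; the weights and thresholds are assumptions.

```python
def prioritize(clarity: int, risk: int, cost: int) -> str:
    """Bucket a proposed research project by a rough score.

    clarity: how much we already know (1 = little, 5 = a lot)
    risk:    risk of moving forward without research (1 = low, 5 = high)
    cost:    cost of doing the research (1 = low, 5 = high)
    """
    # High risk of proceeding blind pushes priority up; high existing
    # clarity and high research cost push it down. Weights are arbitrary.
    score = risk * 2 - clarity - cost
    if score >= 5:
        return "high"
    if score >= 0:
        return "medium"
    return "low"

# Little clarity, high risk, modest cost: do fresh research first.
print(prioritize(clarity=1, risk=5, cost=2))  # high
# High clarity but still high risk, cheap to act on: revisit what we
# already know rather than run a fresh study (the "medium" case above).
print(prioritize(clarity=5, risk=5, cost=1))  # medium
# Well understood, low risk, expensive study: skip it.
print(prioritize(clarity=5, risk=1, cost=5))  # low
```

    In practice the worksheet described in the interview is a spreadsheet conversation, not a formula; the point of the sketch is only that the three levers pull in different directions.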

    Yeah, it’s on Maven, which is actually a really good platform for all kinds of workshops and leadership. There are workshops on AI now, there’s all kinds of stuff in there. So that’s the one that’s about prioritizing research for impact. I also have one that I literally call See Maximum Impact from Customer Conversations. And that’s creating a game plan for, you can call it your research program, you can call it your, you know, customer insights practice, however you describe the thing you’re trying to do by having customer conversations. That one’s really tailored to leadership.

    So the people that come are usually like head of product, head of design, product ops, you know, UX leaders, whatever, it’s leadership role. It could be somebody who’s starting a research team who’s a researcher, and they haven’t done this before the kind of player coach person we’re talking about. But the idea is at the end of that workshop, we have a game plan. We do a gap analysis, what’s the current state of research, I’m just going to call it research shorthand, you know, what’s the ultimate desired state. And then let’s make a three month, very specific three month plan to get there. And we look at infrastructure, meaning tools, processes, training, whatever’s going on there, the operational pieces, we look at staff, that could mean you have a researcher, it could mean people doing research, it could mean there’s some operations person on another team that’s helping you recruit, could be anything. And then we organically in the conversation, we start to talk about the research roadmap, because people will come in and they’ll go, well, the most important thing we need to know is X. And so it’s not a workshop to lay out your whole research roadmap.

    But those pieces come in, the thing we ultimately need to know is this. Right now we need to know this other piece. So yeah, that’s also on Maven. I’ve run it internally within the company for, you know, a handful of leaders. And I’ve also started running it on Maven. The third workshop that I have right now is this training, you know, designers, PMs, content strategists, whoever, to do their own research. And that’s the thing that we talked about earlier, three parts, planning a study, executing a study, synthesizing to get to actionable insights, and then some coaching. And that one I’ve been running within companies, and I’m going to put it on Maven soon. It’s in the process of moving. I can still run it within a company, but it’s in the process of also becoming available on Maven.

    Steve: The one for leaders, the title has customer conversations, not research in it.

    Carol: What I’m finding, and I’m not the only one, I’ve been in conversation with a lot of people that are finding this. I mean, in this conversation, we’re talking about research, I’m using that language. But you know, my target audience really is like head of product, head of design. And so there can be, and I think in the last year and a half, become even more of a challenge with the word research in that audience that sometimes people think it means it’s going to be big, it’s going to be expensive, it’s going to take a lot of time. And yeah, sometimes it might be big, expensive and take a lot of time if what you need to know is foundationally something really important to your business, right? That you don’t know that’s going to, you know, make it or break it kind of thing, right?

    But I think a lot of what people need is not necessarily that. And I don’t want those leaders to think that having conversations with customers needs to be big, expensive, and take a lot of time. Of course, they’re doing research, you know. But if you look on my website right now, the word “research” does not appear until you scroll below the fold. And so I’m experimenting with ways to talk about the offerings that go beyond the word research, because, unfortunately, I used to be much more of a purist, like many, many years ago, earlier in my career. Well, people need to know that research can be lean. Yeah, people are going to figure out that research can be lean because we’re going to do it. I don’t need to be preaching about it and I don’t need to be stuck on using that language. I think one of the things that’s held us up in the past as a field is that we’ve been too attached to language, process, ideas that aren’t necessarily current anymore. And so I’m like, call it whatever you want, you know. Like, we’re going to do this thing and I think it’s going to help your business, and I’m not attached to the word.

    Steve: When you started talking about developing this business for yourself, you kind of hinged on what was exciting to you. And I’m wondering, you know, now that you’re up and going, do you find it a different experience when you are doing this, say, through Maven and it’s for the public, for lack of a better term, versus working with an organization and kind of going into that organization? Are there any differences for you between those different kinds of venues?

    Carol: You know, I have this really deep background in teaching. And so for me, leading workshops is really fun and it’s really exciting, and working within the organization can also be fun and exciting. They are just different, and I enjoy both. Yeah, I mean, I like bringing people together from different organizations and seeing the kinds of experiences they bring into the workshop, and they get a lot of benefit out of that conversation. I mean, this is the feedback that I get: not only was it valuable to get the, you know, the material and the worksheets and whatever insights I’m bringing and facilitation, but the experience that other people are bringing in, if we do this publicly, is really valuable. And frankly, sometimes I have to really rein it in because they can just start going on and trying to solve each other’s stuff, you know, help each other solve things, which is great, and I love it when, at the end, you know, people say, let’s connect on LinkedIn and keep the conversation going. I’m actually about to set up a way for people to keep the conversation going across cohorts. So that’s something that I’m going to be doing later this year as well, because there’s so much benefit that people find from checking in, you know? So yeah, it’s different, and they’re both interesting to me for very different reasons.

    Steve: You had offered to give a little more definition about what impact meant. So I want to loop back to that.

    Carol: I’ve been looking at and following what other research leaders are saying about this too. And I think that one thing that we seem to all agree on is that impact goes beyond what I call product impact. So, you know, pretty obvious that impact means we do some research, we come up with some insights, you know, we take the most important of those and we do something to change the existing product or we move into an area that’s new and we see some kind of impact that we can measure in terms of, you know, lift in engagement or revenue or customer satisfaction or whatever the thing is that we’re measuring, right, from a business perspective. That’s one kind of impact, but there are other types. And so I think there are three things.

    One, product impact, like I just described. Two, organizational impact. And that's stuff like what we were talking about earlier, seeing teams understand better how to work with career researchers by going through the process of learning how to do some research for themselves. I would call that organizational impact. And three, operational impact, which is stuff like efficiency. And this, again, relates back to something I said earlier, but we try to prioritize the most important and most impactful research. We look at something where we already have a lot of information. We have a lot of clarity about this problem, but maybe this team doesn't know it.

    So for example, a real example: there was a new team spun up around a very important initiative. The product manager, the designer, and the content strategist were all new, but there was a researcher who had been doing research in that area. The trio thought they needed to do a six-week sprint to uncover where they needed to go with this very important thing. And the researcher knew that there was a lot of information already. The researcher spent something like half a day going through all the information they had, sat down with this trio, and shared the information with them in an hour. The researcher spent like four hours; we can calculate the cost of that time. It saved this trio the first three weeks of the sprint; we can calculate the cost of the time they would have spent, do the math, and say we spent X dollars and we saved X dollars here. And they were able to go straight to concept testing because all this foundational work had already been done. So that's an example of operational efficiency. There are some people talking about this, but I don't know that we've spent as much time on those calculations as a field as I think we could.

    Steve: Are there impacts that are not measurable or not easily measurable but still kind of make your list?

    Carol: I’m sure there are. I think I’ve pulled my list down to three. I mean, if you look at some of the things that people have been writing about, there are like these much more detailed models. I think it goes back to what are you going to do with this impact? Like if we want to be able to go back to the leadership team, or we want to be able to put, you know, at the end of a quarter on an OKR spreadsheet, what our impact was, we need to make it digestible by other teams and leaders. And I feel, having studied this for a while, that everything kind of rolls up to one of those three areas. I haven’t found something that doesn’t roll up to those three areas, let me put it that way. And I think that if we keep it simple like that, we’re much more likely to be able to say, you know, we saved X dollars by not doing a bunch of extra research on this project. And that’s something that we can talk about very clearly. Related to this is that it can be hard to measure impact, period. And we know that. So if you’re a researcher who’s a shared resource across multiple teams, you work on one thing, you go off to work with another team, how are you going to know what team A did a month later, unless someone comes back and tells you? You may have to go back and ask, hey, what happened from that study? So you know how to describe the impact that you’re having.

    But we need to be making the effort to try and find out. I mean, it’s hard. As a consultant, it’s hard for me to know what the ultimate impact is of these workshops and the coaching and the advising unless people tell me. And I also know quite well from my teaching experience, sometimes people learn a thing and then it’s not until, you know, a while later that it actually kicks in for them. So I think that when we’re talking about training that doesn’t have a direct sort of relationship to work that’s happening right now, yeah, it’s hard for me to even know what impact I’m having. But I think it’s really, really important for us to continually try to learn as much as we can about that.

    Obviously, none of us knows the future, and we can’t talk about the future unless we talk about how we got where we are and where we are now, right? So I think I actually want to back up to a bit of the difference between nine years ago and now, because I think it’s relevant to this.

    So when you first invited me to have this redo conversation, one of the prompts was what’s changed, and my first thought was everything. And then I went back and listened to the original conversation from nine years ago, and I realized, oh, more than everything has changed. And nine years is a long time, so we would expect that there would be shifts. But aside from the obvious, like the pandemic, remote work, that kind of stuff, just listening back and thinking about the way I talked about the work then, the way we all were talking about the work then, and the way we talk about the work now: we’ve been talking about impact. I haven’t used the words qualitative research or design thinking. And the last conversation was all about that, because that’s where we were at that point in the industry. And so that was what was making the work successful then. And if we look even further back, the internet, the history of the internet, right? I was at GeoCities in 1998. We were making it up as we went along. And I remember reading the IPO paperwork and it said, we have no idea how we’re going to make money from this thing. And that was normal. And then we had the boom and we had the bust.

    And then, you know, through the 2000s everybody was talking about design thinking, into maybe the late 2010s. Now it’s all about impact. So the way that we characterize the work has really shifted. I think for me also, when I think future, being at this point in my career, I start asking myself, what is my legacy? Which sounds really fancy. It’s not like I think I’m capital-L Legacy, like I’m a celebrity or something, but I think we all kind of go, I’ve been doing this for a long time. What am I going to leave this field? What am I contributing, and what impact do I want to have now as I go along? And then what am I going to be leaving whenever I decide to stop this? So I kind of look at all of that and I go, where are we now? What does the future look like? Obviously AI. I mean, we don’t need to say much more about that. We need to figure out how we use AI tools, and that’s changing every single day. How do we use those tools to help the work that we’re doing now?

    I mean, when people ask me, what do I need to do? I actually had a call like this yesterday; a person got laid off. What do I need to be thinking about and what do I need to do to position myself for my next role? It’s like, you need to be studying AI tools. And if you haven’t already done that, jump in. Right. So that’s one kind of really obvious thing. I think another thing that we’re seeing now that’s not going to go away, that’s going to be in the future, is this idea of people who do research, right? Non-career researchers collecting some of their own insights. We can’t stick our heads in the sand and say, make it go away. It’s not going away. It’s here. It’s been here for a while and we need to figure out how to jump on that. We need to be mixed methods researchers.

    You know, it’s funny because when I started, I came out of human factors school and that was very quantitatively focused. And then when I started working, I just started in an era when the work was very qualitatively focused. And now we’re shifting back towards generalists. So I think everybody needs to be some kind of mixed methods researcher. And I think most people are going to end up being sort of T-shaped, like you’re very strong in some areas more than others, but I don’t think we can go out anymore and say, I only do ethnographic, deep qualitative research and I don’t know anything about writing a survey. I just don’t know that that’s going to be possible moving forward.

    And another area that I think is really important, for people who haven’t already been doing this, because some of us have been doing this for a while, is triangulating insights across different sources. So knowing how to dive a bit into analytics data, understanding something about behavioral science if you don’t already, making friends with the people who run customer support so you know what they’re hearing. Do you have a market research function? All of these other insights functions that I personally think, and have thought, and have seen work really well, where we’re totally working together in a very collaborative way. At a minimum, knowing what they’re doing, if you’re not in an environment where the culture is that collaborative, but having some way to look at things across multiple types of insights functions.

    So this is a bit of a personal aside, but it’s very relevant to this question. I went public in January with the fact that I had lung cancer late last year, and I decided to go public with it because I thought it might be valuable to people. In terms of personally what that did for me, it sort of sent me through a full examination of my whole life and career: oh, my God, do I want to keep doing this? And what I realized is, I’m happy making the contributions that I’m making, even though it’s hard to directly measure impact. I’m hearing from people that they’re finding value from the work that we’re doing together. And so that’s what I want to leave the world with. I want to leave people with this idea that the work that we’ve done together is valuable to them, whether it’s tomorrow, they’re taking the prioritization worksheet back to their company, or we have this coaching conversation and two years from now, they see value in it in some way that they couldn’t have anticipated. So I think that’s really vague and broad. But, you know, I’m trying to be as clear as I can about focusing in the areas where I think I can make the best contribution and have the most impact. And what zaps my energy versus what gives me energy, like you were talking about earlier: I really like teaching these public workshops, as well as doing the work internally. So I’m going to keep doing the public workshops. Yeah, just keep reexamining how I feel about the work that I’m doing, and what am I getting back from people?

    Steve: I think you’re saying that talking about or going public with your medical situation prompted people to reach out to you in a way that highlighted the importance to you of the impact of the work that you’re doing. Is that correct?

    Carol: Yeah, it was one of the things that did that. And also just literally in terms of impact of that article. Many people have told me, oh, I went and got a checkup, because I realized I hadn’t been taking care of my health. Oh, I smoked many years ago, I should go check that out. Or, I hugged my child more closely, I called my parents. The human elements of it, as well as the physical health elements, were really rewarding. And I don’t know what I expected, but for some reason I didn’t necessarily expect all of that.

    Steve: Well, yeah, you have no template, no prior, for what the response to that is going to be.

    Carol: No template. I mean, just to throw this out there, as another example. I didn’t write this in the article, but I didn’t even know how to tell the clients that I was working with. And that’s where I said, I’m going on sabbatical. And then people thought I was taking a fancy vacation. And then I said, well, I’m taking a medical leave. And then they worried a lot and started Slacking me: Are you okay? What’s going on? How are you? What do you even say when you need to take two months off, or whatever it was, if you don’t want to disclose? Because I wasn’t ready to disclose that. So I don’t even have a template for that is what I’m saying.

    Steve: Yeah, now you’ve had that experience, so you’ve learned from that experience.

    Carol: Hopefully helped other people.

    Steve: We’ve been talking in and around impact at various levels, and yet with this article, the examples you just gave from your writing, maybe it’s outcomes, not impact. I don’t know. I don’t want to jargonize it, but these are the kinds of things that happened as a result of your action that you found meaningful, and that people reported back that they found meaningful. I don’t want to take a personal experience and try to force-map it into something professional, but I guess I’m just seeing echoes throughout our conversation.

    Someone saying I hugged my kid is very interesting. That was the action they took. That was something they shared with you, and that was something that had meaning for you as a result of it. When we started off talking about founding your consultancy and determining what you wanted to do for that, what kind of offerings you had, I was just struck by the fact that you used the filter of what excited you.

    Now we’ve been talking about changes and even looking ahead, present moment to “future.” I guess just maybe try to tie those things together. Are there things about the near future, the distant future, whatever time horizon we have for future, are there things about that with the work that you’re doing that excite you?

    Carol: The thing that I’m excited about for this year is to actually do more of the public workshops. And I think I mentioned I’m going to roll out the lean research for designers and PMs workshop to be public. I’ve got some other ideas that I’m working on. Some of the pain points that I hear from customers are around finding the right people, finding the right participants for research, which is a lot easier in B2C than it is in B2B. But there are some things that we can talk about, so that’s going to be a workshop. And having more conversation around knowing when do you do this yourself and when do you hire a career researcher? What are the operations that you need to put in place to have your conversations with customers be effective? There are topics like that that I’m exploring for either short workshops or longer ones, because those are things that I’m hearing about. And I like that public forum. So I’m excited to be rolling those out later this year.

    Steve: Carol, it’s really great to have this chance nine years later to talk about what’s changed more than everything, and the work that you have done and are continuing to do.

    Carol: Yeah, thanks so much for including me.

    Steve: Thank you for taking the time. It’s great to chat with you.

    Carol: It’s been really fun.

    Steve: That’s it for today. I really appreciate you listening. Find Dollars to Donuts where podcasts are podcasted, or visit portigal.com/podcast for all of the episodes, complete with show notes and transcripts. Our theme music is by Bruce Todd.

    The post 41. Carol Rossi returns first appeared on Portigal Consulting.
    25 March 2024, 2:19 pm
  • 58 minutes 11 seconds
    40. Gregg Bernstein returns

    In this episode of Dollars to Donuts I welcome Gregg Bernstein back for a follow-up episode. He’s now Director of User Research at Hearst Magazines.

    The thing that I always come back to is that there is no one way to do research. And I also think there’s no one way to do research leadership. So often when I post a video or write something, it’s a knee-jerk reaction to something somebody else might have said that I feel like is going to discourage folks or paint this industry in a negative light. I don’t want to sound like a Pollyanna, but I love this field. I think it’s invaluable. I think more companies should have a research function. And so anything that I write is usually meant to show that there’s opportunity, there is value in this work. – Gregg Bernstein

    Show Links

    Help other people find Dollars to Donuts by leaving a review on Apple Podcasts.

    Transcript

    Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization.

    Today, I’m chatting with Gregg Bernstein, nine years after he first appeared on episode one of Dollars to Donuts. For context, here’s a tiny clip from that episode.

    Gregg Bernstein: And I’m a little disappointed that you didn’t start this interview off by saying this is two Jews talking about customer research.

    Steve: But before that, did you know that there’s a new edition of my classic book, Interviewing Users? The modern-day book tour seems to be, in fact, the podcast tour. And so recently I chatted with Mike Green for his Understanding Users podcast. Here’s part of my conversation with Mike.

    Mike Green: And you mentioned the world of work and how it’s changed. And the one thing that we haven’t touched on so far is obviously the pandemic and the COVID years, if I can call them that. I’m interested to get your sense of how that impacted user research as a discipline. So speaking for myself, obviously the work has continued and it’s continued at pace. But I can’t remember the last time I sat in somebody’s office or place of work or even their home and actually interviewed them face to face. Which, you know, in some ways speeds up research. You can get more done remotely. People are perhaps more relaxed if they’re sitting in their own homes on Zoom. But there’s a loss, I think. As a researcher, I find not being in the context of the individuals, surrounded by what’s on their walls and what’s around them and the kind of movement of the environment, it’s harder in some ways to get the insights. But I’m interested to know kind of what’s your perspective on how the pandemic changed, for good or ill, kind of what we do.

    Steve: I mean, 100 percent to everything that you just said about loss. I mean, that’s the word that I use. I mean, I don’t know that it’s permanent. I think the world of work is continuing to change as we’re sitting here on this day. It’s the beginning of the year where we’re talking. I haven’t seen 800 RTO articles, return to office articles, but it seems like, you know, there’s a constant discussion about that. And it’s interesting because like for sure the pandemic changed work. But it also triggered lots of bigger and more uncomfortable sort of discussions about power, like bosses and property owners that, you know, have a stake in how work takes place and where it takes place, and worker power kind of pushing back on that. And depending on where you live and what industry you’re in, you’re going to see that more or less. So what I’m saying is that remote research is being affected by these much larger shifts that I don’t have any sort of brilliance on, but that I think the work continues to be in the middle of.

    So I have not sat with someone in their place of doing whatever it is that they’re doing and interviewed them. And I just said at the beginning of our conversation, embrace how other people see the world. Well, that’s the way to do it, right? You let go of your thing and go to their thing. And it is harder. And for me, they’re clients, but for other people, it’s their colleagues. It’s harder for us as researchers to facilitate that “oh” kind of reaction that we’re going for. We want people to know when their assumptions are wrong. And you can get these really jaw-on-the-floor moments that we work to facilitate. We work to create those, you know, uncover those narratives and have our teammates let go of their biases and their assumptions and their aspirations. And that’s hard to do without taking people out. It was not only what we got to do, which meant that we could connect with people. We could see stuff that we didn’t know we wanted to ask about. We could be uncomfortable. We could be forced as researchers. And then we could create, I think, effective experiences for other people to also make the work transformative. And that’s a big fancy word, but we’re all changed by doing this.

    Yeah, I really miss doing that. You know, I have peers that are like, oh, someone today was talking about some overseas trip they were doing to do field work. I don’t even have to go to some exotic environment, different than my own. I just would like to sit in an office or, you know, walk around a firehouse or something like that. So I think these things are going to continue to change. But there’s two fronts is what I’m trying to say here, right? What do we experience in the field, but also what do we experience with our collaboration and facilitation of the people we work with? And I think this also happens after the field work. If everything that we do takes place in a remote workspace, and more often asynchronously, we’re also having fewer of those moments.

    I mean, I can think of just times where I’ve had like clients and colleagues and we’re off site. We’re spending several days in a room going through this stuff and trying to make sense of it and just having like life changing insights come up. And that is so grandiose, my language. I mean, when someone comes up with something that riffs off of something that someone else says, and you can just sort of like feel a bunch of ideas come into alignment. Like it’s a really powerful intellectual, creative moment. And I haven’t had that for a while since I’ve been working where my participants are in a Zoom room and my colleagues are, you know, before and after the research. And so I don’t know, personally, I’ve struggled with the work feeling a little more transactional. And I think that is sort of coincident with other pressures on the work of research.

    So I don’t know, I’m throwing everything into like a big, hairy, ugly ball of smooshed stuff together. And I think when you bring up like remote and pandemic, it like, oh yeah, there’s all these things that are kind of connected to that. And I don’t know how to tease them apart in a sensible way. I think I’m, you know, I’m being buffeted by those forces, I guess the way everybody else is. But yeah, I miss it. I think that’s my bottom line is exactly what you said. Like there’s a loss there. And I hope we can evolve to a point where it is a necessary part of what the researchers do, what the team does, and to kind of have those experiences, which are so inspirational.

    Again, that was me on Mike Green’s Understanding Users podcast. Check out the whole episode, and of course, pick up a copy or two of the second edition of Interviewing Users. To learn about my consulting work and the training that I offer to companies, visit portigal.com/services.

    Now, let’s go to my recent conversation with Gregg. He’s the Director of User Research at Hearst Magazines.

    Gregg: This is Gregg Bernstein, and you’re listening to Dollars to Donuts.

    Steve: Could we get a two Jews talk about research? You think you would do that?

    Gregg: Again, me from nine years ago, not the sharpest tool in the shed when it came to naming things. But sure, you’re listening to two middle-aged Jews talk about research. Everyone’s favorite podcast. [laughter]

    Steve: All right, well, what a way to begin. Thank you for that. When we talked nine years ago, you were working at Mailchimp, and I think you were maybe the second or third person I interviewed for this podcast, but you were the first episode that was published. So it’s really cool to have you back and talk about what’s changed for you, what kind of things you’ve learned. So thank you. Do you want to talk maybe about some of the different places that you’ve worked and compare and contrast what work was like and what you’ve seen in that intervening time?

    Gregg: Yeah, first of all, Steve, it’s a pleasure to be back on your podcast.

    Steve: Great. Thank you.

    Gregg: So when you and I first spoke nine years ago, I was the research manager at Mailchimp. And it was my first time as a research manager.

    And I don’t think I realized at the time just how unique the Mailchimp situation was for a researcher. And what I mean by that is, I had an almost unlimited budget to hire videographers to film our customers and make short films. We would create these artifacts of personas that we would hang up around our office, so everybody would learn. We had a CEO who was a designer before he was CEO, who understood the value of designing for people and knowing who those people are. So he supported research. He wanted us to make the best designs, which meant knowing our customers. And I was spoiled rotten. And I realized in subsequent jobs that that was not how most research roles are.

    And when I left Mailchimp, I joined Vox Media. And we went from being really precious about the deliverable of the research at Mailchimp to being scrappier at Vox. And by scrappier I don’t mean careless; we were careful about how we did research in both organizations. We were thorough. We made sure we spoke to the right people. We asked good questions. We did solid research. But the difference was, at Vox, I spent much less time on a project. If I spent a month on a project at Mailchimp, at Vox I would spend a week on it, because we had a very long list of projects that needed research. We had a pretty quick-paced cadence of work, not aggressive, but quick. And so I learned very quickly that I had to work faster. I didn’t have to spend as much time creating these amazing artifacts as long as I was answering the fundamental questions and putting them in Slack or even a very poorly formatted Google Doc. As long as people were learning from the research, that was great. That was the gold standard. Did we learn from this? Did we make good decisions from it? If yes, move on. So that was a huge change in how I thought about research.

    And it also made me, I think, a better research manager or leader, because I realized budget is not commensurate with quality. You can do amazing research fast, scrappy, on a budget. You don’t need those unlimited resources. And that’s not to say I wouldn’t love a bucket of money to be at my disposal; if I had to choose, I would take the high budget all the time. But I had to quickly learn how to get by with less, less time, less money. And you know what? It was a great experience, a great learning opportunity, and something that I feel made me a better researcher.

    Steve: What are the circumstances in which putting that effort into the deliverable is, I don’t know, necessary or appropriate? I think you’re listing the times when it’s not. There’s a big demand and people are willing to kind of consume it in the form that it comes and act on it.

    Gregg: Yeah, that’s a great distinction you’re making. At Mailchimp, I think it was necessary to put so much effort into the presentation of materials because it was a young company that was growing fast.

    And so, yes, we wanted people to learn from the research. But we also wanted people to understand who our users are. So if you’re an engineering manager, if you work in accounting, you still need to know who we’re serving every day. Like, what is the reason we’re coming to work? And I think that knowledge was maybe not distributed evenly. And so putting films together, creating posters, and making everything so public and investing in it sent a signal: you need to know who we are working for. Your job depends on it, directly or indirectly. And I think for that time in the company’s history, it was absolutely the right call. And at Vox when I joined, moving fast and just banging out study after study, saying, okay, now we know this, let’s build a thing and move on. That was the right approach for where Vox was when I was working there.

    Steve: So that’s a little about Vox and kind of the change in culture and already a big impact in your approach that was suited to how you all worked and the people you needed to have impact. What was the next sort of major role where you, maybe your practice evolved yet again?

    Gregg: I think I’m going to stay in Vox because I feel like my time there — I spent four years there. And my first two years I was working on a team that was creating tools for all of our writers and editors. It was a content management system. And that was very similar to the work I was doing at Mailchimp, which was software for creating content and publishing it. At Mailchimp it was newsletters. At Vox it was content, news content. Or food content if it was for Eater. Or tech content for The Verge, to name a few of the brands we worked on.

    But at the heart of it, it was how do I understand the editorial process? And how can we make a better set of tools for publishing content, whether it’s an article or a map or a video or a podcast? And that was the first time I had worked on an internal team. So recruiting was no longer difficult. I could just get in Slack and talk to anybody in the company and say, hey, I’d like to talk to you about how you write articles. There was very little difficulty in finding participants and setting that up. That was the first two years of my time there.

    The second two years, my role — the mandate for my role changed from understanding how we create content to how do people discover and consume content? And, again, this was in a remote-based organization that was a little scrappier. So I really had to think about how do we build out a process of getting feedback from our hundreds of millions of visitors to our various websites? How do I work with not just my product organization but our editorial organization to understand what information would be valuable to them? How do I get support from executives to do this research to make sure that once it’s done, they will have an appetite for it and learn from it?

    And so it was the first time I had to create, I guess, demand and awareness and opportunities where none existed. Because it already existed within my product organization to build the content management system. But as far as, like, doing research that would support discovery and consumption, it was research that ended up supporting marketing and sales. Because we could — if we knew more about our audiences and what they valued and what they came for, we could put ads on our pages that kind of aligned with who was coming to our sites.

    And that’s not to say we didn’t have demographic data, but we didn’t really have an understanding of why is somebody coming to The Verge? What is the next action that they’re going to take after they come to The Verge? And how can we make a better experience for them? So if they’re researching headphones and they’re looking for product reviews, if we know that that’s what they’re coming for and we know that they spend a certain amount of money after the fact, we can sell ads against that. And we have a better understanding: okay, people trust us for our product reviews, so we should probably think about a better product review experience. None of this really was, I guess, designed. We didn’t have a designed research process. And so for the first time, I was having to chart a new path, with the support of my manager and my colleagues, but I kind of had to figure out how to make this happen within a large organization and figure out which people I needed to talk to, who I needed to get support from, who I needed buy-in from.

    And it was a fantastic learning experience because there was friction. Not a lot of friction, but like I had to convince some people of why we were doing this. I had to figure out if somebody was resistant, how can I get them to support this? And then how can I ask questions that will lead to insights that don’t just benefit me, but other parts of the organization? And so I feel like that was the moment when I really understood how to — I don’t want to say lead a research function, but how to get support and buy-in for research activities where maybe that didn’t exist before. And that’s what set me up for future research leadership opportunities. I feel like that’s when the training wheels came off and I understood the bigger job of being a research leader.

    Steve: A lack of support sometimes comes out as people being blocked from doing research. And so I think you're talking about how you had your own manager, your own team that's doing your research, but you're trying to make connections and help people see and engage so that the work that you do is valuable and they're gonna act on it. Am I getting it right?

    Gregg: Yeah, you’re getting it totally right. So one example is on the website Eater, which is about restaurants and food culture, there is something called a map — well, it is a map, but there’s a product name for it, which is escaping me now. But you might have like the 20 hottest restaurants in New York City. Or, you know, the 10 best restaurants that you should go to in Minnesota or Minneapolis, to be more specific. And a project might be, okay, let’s make the process of building maps better. But at the same time, let’s look externally to how do people actually use these maps to understand how can we improve the user experience. So we’re trying to make a better editorial experience as well as a better user experience.

    So part of that is understanding, well, why does somebody use one of these maps in the first place? What is their goal? And to do that, we might need to get support from the editorial staff at Eater, which means working with their editor-in-chief or, you know, one of the editorial directors and saying, hey, we want to put a banner on Eater that says, help us improve Eater for everyone. We need to get their buy-in so they're not going to their website and wondering why there's a banner at the top of the page. We're taking people away from the articles that we're publishing and pushing them to a survey or a screener to participate in an interview or usability study, so we need to get their support.

    But to do that, we also need to offer some sort of carrot. Like we want to talk to people about their Eater maps experience. But while we’re talking to them, is there anything that you’re curious about? If you had an Eater reader sitting next to you, what would be on your mind? So I’m trying to throw in questions that will help my editorial colleagues, but I’m also focusing on what I need to know to improve the map experience for my team. So that’s where I need to get their buy-in and their support. And it means clearly explaining what we’re trying to do, but also saying this is also an opportunity for you to learn about your audience. And this way I’ve got their buy-in. There’s no surprises when they see some sort of banner or call to action to participate in research. And they know that they’re going to learn something. And at the same time, our product organization is going to know something and learn something. Did that make sense?

    Steve: That’s a great clarification. Do you have any examples of overcoming hesitancy or uncertainty in those folks that you were needing their support from?

    Gregg: The hesitancy is usually just around, let me understand what this is going to look like. So showing an example of this is what a banner might look like on your website for a mobile user or a desktop user. Because there's this fear that maybe there's going to be a screen takeover that says, don't read this article, click here and take a survey, and we're not trying to create a bad user experience. So it's, I guess, demystifying the research process and showing, hey, this is what we're going for. This is the goal of the study. This is what it's actually going to look like on your website. And we'll work with you on the language we use, you know, help us improve Eater or make Maps better for everyone.

    Thinking further, like sometimes we would do these big studies of our audiences where we would need the editor-in-chief of one of our sites to write a call to action. Like, hi, I’m Nilay Patel, I’m the editor-in-chief of The Verge. We’re doing an annual survey to help us improve our site, not just the coverage that we’re writing, but also the user experience of visiting our website. And we need your help. So if you read our content or listen to our podcasts, help us out. So it’s always a matter of over-explaining and saying this is exactly what we’re going for. This is an opportunity for you to learn as well. Let’s work together. And this way we’re all going to learn something. And worst case, we get a bunch of responses that maybe they’re not exactly what we wanted to hear, but we’re still going to learn from real humans who read our content or listen to our content or watch our content. And we’ll be able to learn from it.

    Steve: When you started off describing these last two years, the training wheels came off, and I didn't really pick up on it at the time. I went back to the earlier stuff, but you ended with saying something to the effect that this was really where you learned about design research leadership. Does that take us into the next role?

    Gregg: I think it does because I joined Condé Nast as their research lead. Condé Nast is another publishing company. And the job I interviewed for was to be research lead for just their subscription brands, which are brands like The New Yorker, Bon Appetit, Wired. But shortly after I joined, in talking to my boss, the vice president of product design, we realized that I was the highest-ranking researcher. And so if we were to have a holistic research process for the entire product design organization, we couldn't just focus on subscriptions and subscription products. We needed to have user research across the board, across all of our brands and divisions. And I was able to articulate that this is what the role should be. You need to have somebody who is looking at subscription products, but also at the other parts of the company, like commerce, which means selling products through product reviews, which is something that Vogue does. Or some of the other fashion magazines where you're not just selling a subscription to a magazine. The magazine makes money by reviewing products and saying, like, here are the 10 best bags to wear or backpacks or high heels or computers. The company makes money through that type of content.

    And so going back to what I was saying, I was able to articulate that we should have research in subscriptions, but also in commerce. But also having one research leader in charge of all research means that we can instill quality control. We can make sure that the researchers are collaborating so that a researcher who’s looking at commerce and a researcher looking at subscriptions, they’re not working in a vacuum. We’re not a siloed organization. We’re one research team that can collaborate or maybe move people around as needed based on what are the most important questions of the day. So I was able to see how getting buy in, putting processes in place, managing a team, how I could take what I had done at Vox and apply it to Condé Nast and kind of create a larger role for me at Condé Nast that was really necessary to make sure that the research was holistic and that the researchers were collaborating and that insights from one part of the organization were making it to other parts.

    Steve: You’re describing this point at which you go from having pockets of research, for example, to building a role that’s a leadership role where there’s a person responsible for taking care of and ensuring all those kind of qualities of research that you described. And that sounds like a point of evolution, a point of transition in the overall organization’s research maturity.

    Gregg: It was because it was a point where I was able to work with my manager to look at the entire organization, all the content that we publish, and see where the gaps were in our knowledge. So I had a researcher who was embedded with The New Yorker. I had a researcher embedded with Vogue. That left something like 25 other magazines where there wasn't research, at least not user research. And so I was able to make the case that we really should hire somebody to look at all commerce. How do we sell products? What would make for a better commerce experience for our users? I was able to make the case that we should have somebody who is looking at our subscription brands like Wired and Bon Appetit. And then make the case that we should have somebody who's just looking at the member journey, because when you have so many different titles and so many different ways of selling content, it takes some effort to know what's going to resonate with somebody who's either thinking of buying content for themselves or buying a gift for somebody else. Do they want a digital subscription? Do they want to actually receive something in the mail? So working with my manager and with the other design leaders, it became clear exactly where we needed to have resources in order to make sure we're learning and supporting the designers and the product managers and the engineers to build the right product for the right people. So it was an inflection point. And it was personally great because I got to hire some really awesome researchers to fill those roles.

    Steve: What are some of the ingredients or elements that you are uncovering and articulating when you are making the case for those kinds of structural changes or role changes? What does that include?

    Gregg: I mean, first there’s pointing out that maybe we have a number of designers and engineers and product people working on a product with little to no contact with the humans who use that product. So just pointing that out and saying there’s an imbalance here in staffing. Or pointing out that a lot of designers and product managers are asking for research but not getting it because there isn’t the headcount or enough hours in the day to support those efforts. So those are usually the two places to start.

    There’s a demand or there’s an imbalance and a vacuum of user contact. I also have used interns as a way to gauge demand for research. So if I bring in a summer intern and I put them on a project with a team and then the intern goes away, the team will suddenly realize that void in their life where a user researcher used to be. So then you can make the case, hey, this team got used to working with a researcher. It’s really not ideal for them to go back to trying to do research on their own. If we were to open headcount, this is where researchers should sit as a backfill for the intern that we lost. So that’s something I’ve done that at Mailchimp, I’ve done it at Vox. It’s a good tactic to test the waters and build demand for hiring a permanent researcher.

    Steve: Yeah, that’s kind of brilliant. It’s almost like a prototyping process.

    Gregg: I’ve also seen it where the intern did a great job, but maybe after they left, there wasn’t as much demand as we might have guessed. And while I always love to make the case that I want to hire more people, sometimes that proves that maybe that wasn’t the right place to hire. So it is like a prototyping process.

    Steve: I’m curious if you have any perspective on Condé Nast culture in terms of how work was being done, about how you were engaging with different stakeholders or anything about research that is a compare and contrast with the first two companies we talked about.

    Gregg: I think what I can say about Condé Nast is it was the largest company I had worked for at that point in my career. So culture, I realized, is not set for an organization. Culture is maybe at the team level. So that was a, I don’t want to say a shock, but it was very different where you realize that other teams have very different ways of working, of communicating, of supporting each other. And so I feel like I was able to instill a really strong culture for my research team. I feel like, you know, among my design manager peers, we had a really nice relationship, but I would not want to generalize the culture based on just the people I was working with. It was large and, you know, your mileage might vary depending on who you spoke to on any given day. I’m trying to be diplomatic, Steve.

    Steve: I like hearing how you're unpacking it because, yeah, culture is this big label we kind of stamp on things. This organization is this culture, these types of people have this culture. But it is more local than global.

    Gregg: I would say that I was able to create this very supportive, warm, amazing culture. And I mean, partly it was we would get together, you know, every three to six months. So you would get to have human contact, you know, real life contact with people. I don't even know if I could replicate it, but I would if I could. Even remotely, there was such a feeling of these folks have my back and I have theirs and I would do anything for these people. And that made its way into how we hired. Like, I don't want to bring somebody in who is going to ruin the feeling of this organization. So let's be really rigorous in how we hire. Not that we weren't rigorous elsewhere, but it takes a special set of skills to communicate warmth and empathy remotely in Slack messaging, over a Zoom. And that's something that I really cherished and something that I've been really mindful of ever since.

    Thinking about culture is a good way to transition to where I am now. I joined Hearst Magazines, yet another publishing company, in January of 2023. So I’ve been there for a year and two months. And what stuck out immediately is the warmth of every single person I’ve spoken to or spoke to in the interview process. And since I’ve joined, it’s such a warm organization, which is a massive organization. Hearst is huge. It’s 130 some odd years old. But from our legal team to our president to our executive leadership, everyone is just, they seem to care. And that’s what stood out to me from the moment I started speaking to the people at Hearst to people I’m still meeting. I mean, it’s a giant organization. I’m still meeting new people a year and two months into this job.

    But culturally, I feel it's the closest I've felt to what I had at Vox, where I know that the team cares about each other. They care about the work. They're invested in making a great employee experience, and they really want to make a great user experience. And when you can find people who care, not just about the work, but the people they're doing the work with, it's special. And I have a great set of colleagues, and I just, I feel like I'm in a great spot, which means, you know, in like two months people will listen to this podcast and realize we had layoffs or something, and I'm no longer there now that I've jinxed it. I'm kidding. I'm kidding. It's a great place, but it is weird because it is such a huge organization with so many tentacles, and I'm not quite sure how the company has managed to achieve it, but it's a pretty special place.

    Steve: As you talk about culture, I hear this attribute of, my words, not yours, like welcoming to humans. And I think a theme of this podcast and part of this conversation is culture that is welcoming to research. And I like what you’re kind of getting at, that people care about each other and they care about the product and the experience that they’re making. I’m paraphrasing you badly here.

    Gregg: I think that comes from company leadership, and I realize this isn’t going to be the same across the board, but having a president of the company who says, “I really want us to know our users.” And to have the highest ranking executive in the company say that, it gets buy-in, and it makes everyone realize this is important, and it just makes embracing research that much easier to achieve.

    And so that’s the reason my job opened at Hearst was our president saying we need to know our users and the company investing in a user research function. And it also means that everybody I work with is curious to know how to incorporate user research into their processes. And so for the last year, process is what I have been focusing on because we have the mandate, we have the buy-in, okay, now we need to put the pieces in place. And for me, the pressure is on to deliver because it’s different than other jobs where research had existed in Condé Nast. Research was something that we were doing at Mailchimp. At Hearst, there wasn’t user research at scale when I joined, which meant we had the mandate, but we didn’t have the ability to do it or do it well.

    So I’ve spent the last year in many meetings with our legal team just to put a process in place to get consent from people who visit our websites to engage them in research activities. I’ve been having a lot of meetings with our tech team on where PII will be stored, which of our products should we use to even do research that will be secure, where recordings will not end up in somebody’s hard drive that they shouldn’t end up in, or in a cloud service where maybe it’s not locked down to our preferences. So this has also been a learning experience for me because I have spent so much time doing operations work just to make research possible. And it’s also been a little bit stressful because everybody wants research and I’m constantly having to say, let’s hold up because we don’t have all the pieces in place yet. We can’t put an intercept on our website. We can’t email a user because we shouldn’t have their PII in our individual Outlook or Gmail accounts. We need to use the right tools to engage with them that is secure, where we’re not just going to be leaking email addresses and phone numbers in the wrong places. So let’s really get buttoned up and dial this in so that we are protecting our participants, but we’re also protecting the company. And we’re not putting the entire notion of user research at this company at risk because we’re making mistakes.

    Steve: When you started on this journey of yours to build at this scale, did you do so with the expectation that operations was gonna be a key order of business for you?

    Gregg: I did not. I also thought, and maybe this was me coming in with a little too much confidence, I thought that because I had created consent forms in the past, I could just email my legal team and say, hey, we're going to start doing user research. I'm going to put an intercept or a call to action on our websites. Here's the consent form I created in Google Forms. And immediately my legal team said, timeout, why don't we talk through this? And it was a setback because I thought we were ready to go, my second week on the job. And it turned out that we were many, many, many months away from actually being able to do anything at all, outside of maybe a platform where we're not using our own users. We used platforms like UserTesting where we could research with a panel of random people. But as far as engaging with our known users, that took a lot of logistics. But it was also a great learning experience. And I have some amazing legal colleagues who were really helpful in pointing out ways that things could go wrong and working with me to come up with a process that we're all happy with, for the most part.

    Steve: Is there anything about your industry or the culture that, even though you had this support and this collaboration, might have led to the scale of the effort that you're describing to get there?

    Gregg: I don’t know if it’s media or just legacy enterprise, you know, historic organizations. Because Vox was a media company, but it was a new media company. It started in the digital age. There was never print magazines. And so it very much operated like a startup where, you know, if there was budget and I could get my manager’s approval, we would just buy a product with a credit card. Here at Hearst, that is not how things work. Like we’re not just going to click through an agreement and agree to some random SaaS company’s terms and, you know, suddenly we’re using their product. Everything has to be examined and negotiated and approved. So things move slowly. I think that might just be because it is such an organization — such an old organization that wants to be around for another 100, 200 years. So the mindset is let’s be slow but sure. You know, it’s better to take our time rather than get sued for a million dollars because we violated somebody’s privacy or we, you know, we used a product that we shouldn’t have been using. So I think that’s the whole idea of the community thing.

    Steve: Well, I love hearing you describe slow in a way that is deliberative and collaborative. There's sort of an archetype of, oh, I wanted to get this thing done, but I couldn't get anyone to help me, or legal dragged their heels. And maybe you're being diplomatic, but the perspective I'm getting from you is that it's not resistance to overcome. It's the natural, culturally appropriate way to do things, which does take time. Another company might be faster, another company might be slower, but for different reasons, like more passive resistance. And here you've got slow, careful support, which is an interesting way to have it be.

    Gregg: Yeah. It’s never no, we’re not going to do that. It’s yes, we could do that, but let’s think through every step of this process to make sure that we’re not overlooking something fundamental. So this will sound maybe tedious to people listening to this podcast. I apologize in advance. But if you think about a generic news website, okay, you go to an article. Let’s say there’s a call to action. Like you’re looking at a recipe on your favorite cooking website. We want to improve our recipes. If you have three minutes to spare to answer three questions, click here to take a survey. Okay. What survey tool are we going to use that we have an enterprise agreement with where we know that all of the data that’s collected is collected in a way where we know it’s secure? Because we have a license that we negotiated where we know exactly where the data is stored and who can access it and who has liability if there is some type of data leak. Okay. So there’s the survey tool. Okay. Maybe we want people who took the survey to opt into a follow-up interview. So we can add that question. Are you interested in joining our recipe feedback panel? If so, click here. And, you know, it takes you to a page where you can add your name and your email address. Okay. Where is that going to be stored? Who’s going to have access to it?

    And I realize, like, these are not new challenges. But my simple ask of we want to do a study led my legal team to work with me to say, okay, then what happens, then what happens, then what happens? Because in previous organizations, I would just be scrappy and say, yeah, I'll pull out a Google Form, it'll go to a spreadsheet, and then I'll email them and I'll send them a link to my Calendly and they'll schedule a time. And now it's, no, we don't have Calendly here. You can't use that. So what else could we use? We don't use Gmail or Google Calendar, but we do use this other product. What are other ways that we could create an inbox and create a link to a calendar? We don't have an enterprise license with Zoom, but we have this other thing.

    So it’s really just looking at the menu of possibilities and picking the least bad options. Ideally, the better options, it would create a better user experience. But making sure that from initial contact to when we promise to expunge data, no stone is left unturned and we can account for every step of that process and know exactly what’s happening. And again, I know other people do this all the time, but for me, it was a learning experience to go from the scrappy or the let’s just throw money at this way of doing it to, okay, we really need to be buttoned up because a lawsuit is the worst possible outcome here. And I don’t want that to happen. I don’t want the company to lose money. I don’t want to ruin a reputation. So let’s make sure that whatever we’re doing is rock solid and is durable so that once we put it in place, anybody can do it and everybody can do research going forward.

    Steve: Do you have a sense in your year and two months, what’s the progress indicator for you about building these processes, building these kinds of operations and infrastructure?

    Gregg: I won’t claim that we have it perfect yet because it still takes a while to get an intercept on our sites just because there’s a lot of people to go through and there’s some engineering lift. It’s not just flip a switch. So I would say that doing it is not easy, but there is now a process that we can follow to do that type of research. I think the better marker of success is I’ve been able to open headcount because even with the technical ability to do research, research is still not anyone’s primary responsibility except for me and my team. So product managers are managing product. I don’t always have time to do research nor give it their 100% of their brain. Same with product designers. But because there is such demand for research, I was able to open headcount. And I think that’s the real sign that we’re making progress. Everyone wants to make better decisions, and the company has put the money into hiring humans to help us make better decisions.

    Steve: I love that. Let's switch topics a little bit, as you've brought us, I think, up to date on where you've been successful and where it's taking you in this current organization. Let's talk about your book, Research Practice: Perspectives from UX Researchers in a Changing Field.

    Gregg: Yay. That is my book, Steve. Yes, I published this book in January of 2021, so we're now three years out from it somehow. I suddenly had a lot of time on my hands during the first year of the pandemic to publish this. But this was a book that came out of something I'm assuming happens to you, too. What I found is I would write a blog post or I would give a talk, but the thing that people always wanted to ask me about was how do I get a job as a user researcher coming from academia, from being a psychologist, from being a marketer? How do I make myself attractive for a user research job? What do I need to do? I'm a team of one. I don't know how to make the case that I shouldn't be a team of one anymore. What do I do? I'm so lonely. I also don't have any mentorship. Or, I'm a new manager. Help! Nobody knows what to tell me on how to actually be a manager in a research team. How do I make the case for more headcount? How do I manage a team? How do I hire?

    Those were the questions that came in constantly. And I had my stock answers that I would give. I had articles I would point people to. I would try to anticipate what people were going to ask and write blog posts about it. But the questions kept coming. So I thought what if I created a book that just talked about what a career in UX research might look like? And I realized very quickly that I am not the person to write the all-encompassing guide to a UX research career. Because at that point, I had been a designer who transitioned into UX research. I had worked at Mailchimp. And I had worked at Vox Media. That is a small sample size to talk about all the places a UX research career might go.

    A very smart friend of mine named Sian Townsend suggested: why not ask other people to contribute their own perspectives and open source this? Which was such a great idea. So the book that I had started to write about what a career in UX research might look like became a collaborative effort to get multiple perspectives on a UX research career journey. From getting the job to the challenges of the job to where you might go next. And it was a fun project to work on. I wrote my own essays. I solicited essays from many other research leaders. I worked with an amazing editor named Nicole Fenton. Nicole edited Abby Covert's book on information architecture, which is one of my favorite books on product and research and just thinking about information. So I sought out Nicole. Nicole helped make this book fantastic, in my opinion. And it was a really good exercise in publishing content, project management, and working with a host of other research leaders to create something that would be a good and evergreen artifact for the research community.

    Steve: When you’re in that role and you get these different perspectives, are there situations where you don’t agree with the guidance that’s coming in one of these essays?

    Gregg: No. It might not be what I would do personally, but, I mean, I wasn't trained as a researcher, first of all. So if somebody is going to talk about a rigorous quantitative research approach, then who am I to say that's not the way I would do it? And that was the whole point of the book: you're going to get conflicting opinions. You're going to get different perspectives. And I guess maybe the biggest takeaway is there's no one way to do UX research, which is also the name of a blog post I wrote last summer, because we were seeing a lot of comments on the state of the research industry. But the more I talked to research leaders, and from talking to them about this book, there really isn't one way to do research, nor one path for UX researchers. So I wanted this book to have differing opinions and perspectives, whether I agreed with all of them or not.

    Steve: You talk about evergreen, and yet Changing Field is in the title. So what does the book look like to you now, kind of three years after it’s out? Maybe that’s what you’re kind of getting at, you’re having these conversations with people later than when you wrote the book about what the world is now.

    Gregg: And I think, you know, when I think about how I would tackle the book today, or what I think would be different, or what I'm hearing from other researchers I speak to, there is such a drive to prove the value of research and make sure that it's worth the economic investment in hiring a research team or a research person. And I don't love that mindset that we have to always be proving our worth. But I think that's a theme that comes up in the conversations I have, and I would expect there to be more content about that if I were to publish the book again. I also think that the financial realities of today mean that people are working leaner. They don't have as big a research budget as maybe they once had. We all saw the layoffs over the last two years. Teams are smaller. They're sacrificing headcount or being forced to sacrifice headcount. So I think teams are also seeing that they have to get by with less: fewer people, fewer tools. So I think we have more constraints and more expectations to make the investment worth it.

    Steve: I mean, you made the point that the book is lots of people's perspectives, but you are continuing to share your own expertise and guidance. I see you on all the online things, with blog posts and videos and so on. What are you focusing on in the questions that you're trying to answer yourself?

    Gregg: I think the thing that I always come back to is that there is no one way to do research. And I also think there’s no one way to do research leadership. So often when I post a video or write something, it’s a knee-jerk reaction to something somebody else might have said that I feel like is going to discourage folks or paint this industry in a negative light. I don’t know if that’s the right way to phrase it. But I often want — I don’t want to sound like a Pollyanna, but I love this field. I think it’s invaluable. I think more companies should have a research function. And so anything that I write is usually meant to show that there’s opportunity, there is value in this work, and make sure that the folks who are curious about UX research maybe aren’t being sold a jaded or maybe geographically focused perspective. I think I just want to provide balance. Whether that’s coming through or not, I’m not sure, but that’s usually what prompts me is I see something and I think I don’t know if that quite captures it. I don’t know if that’s the whole story. I wonder if I have something I can say to add a different perspective.

    Steve: So in the notes to this episode, we can point people to the book, but where are you writing or creating other kinds of information and guidance for people?

    Gregg: If you go to my website, gregg.io, that is my blog, which I was on a roll last year when I was, it was funny, like as I was doing all the heavy lifting of putting processes in place, I was so motivated to create content and share what I was doing. So I had this streak last year of a lot of blog posts. I’ve kind of tailed off as I’ve gotten busier. But you can go to my website and I post content there. There’s also a newsletter you can sign up for, which just takes the blog posts and sends them to your inbox. I also post on LinkedIn from time to time. Those are pretty much the main places right now. I’ve created a number of videos for the Learners app, but that too has kind of tailed off as my day job has demanded more of my time.

    Steve: And do you think another book is in your future?

    Gregg: I don’t know if a book is in my future, but I do think that there is an update of sorts that should happen. Like I said, there’s been layoffs. People are tightening their belts and spending less on research. So I’d be curious to talk to research leaders. Although you’re already doing that, so maybe I’ll just feed you some questions to ask people. But I do think there should be an update of sorts. I just don’t know if the book is the right vehicle for that.

    Steve: So you want to be like a Daily Show correspondent, right?

    Gregg: I would take that job in addition to my current job.

    Steve: And by Daily Show correspondent, I meant for this podcast.

    Gregg: Exactly. Just put me on spot assignments and let me help, Steve. Not that you need it.

    Steve: Okay. All right, all right. The syndicated media network that is being born right here, everybody. This is the moment. And Gregg, what shall we brand this larger effort?

    Gregg: I’ve already been thinking about the larger extended universe. So the book is called Research Practice. My newsletter is Research Practicing. So maybe it’s Research Perfecting. Maybe it’s Research Repractice. Again, I’m terrible with names. So let’s not tie me to any of these terrible names I just threw out.

    Steve: All right, well, Gregg is giggling politely, and so that might be the sign that we’re kind of coming to the end of our conversation. Any last thoughts for this time together, Gregg, or anything to kind of throw in there?

    Gregg: No, Steve. I just want to say, you had me as a guest nine years ago, which at the time I thought it doesn’t get better than this. And I’m fortunate to have developed a relationship with you where you provided so much good advice and a sounding board. And so to come back nine years later and do another episode with you, I’m thankful and I’m excited. So thank you for having me.

    Steve: Thank you very much for taking the time. It was really great to get your perspective, and I think people are going to learn a lot from hearing you today. So thank you.

    Gregg: Thanks, Steve.

    Steve: Yes, there we go. Thanks for listening. Tell your friends, tell your enemies about Dollars to Donuts. Give us a review on Apple Podcasts or any place that reviews podcasts. Find Dollars to Donuts in all the places that have all the things. Or visit portigal.com/podcast for all of the episodes, complete with show notes and transcripts. Our theme music is by Bruce Todd.

    The post 40. Gregg Bernstein returns first appeared on Portigal Consulting.
    18 March 2024, 6:23 pm
  • 54 minutes 21 seconds
    39. Mani Pande of Cisco Meraki

    This episode of Dollars to Donuts features my interview with Mani Pande, Director and Head of Research at Cisco Meraki.

    We used to do these immersion events where we would bring everybody who worked on, who was our stakeholder, to come and listen and talk to our customers. And we would do these focus groups, they were like a whole day event. There were folks from marketing and ops team who ran some of these focus groups. And when we got feedback about the immersion, it was very clear that everybody realized that when researchers are not doing the moderation, the kind of data that you get is not good. And the conversations were not that interesting. They didn’t feel that it was a good use of their time. So I think you can have your stakeholders experience it, that it’s not that easy to do moderation. – Mani Pande

    Show Links

    Help other people find Dollars to Donuts by leaving a review on Apple Podcasts.

    Transcript

    Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization. In case you don’t already know, I recently released a second edition of my classic book, Interviewing Users. This new edition is the product of 10 more years of me working as a researcher and teaching other people. It’s bigger and better. It’s got two new chapters, a lot of updated content, new examples, new guest essays, and more.

    As part of releasing this new edition, I’ve been on a number of podcasts myself, including a conversation with Jane Portman that was part of her podcast, UI Breakfast. Here’s a quick excerpt from that conversation.

    Jane Portman: As you’re training other researchers, you’re training experienced researchers, I’m thinking, what do you feel is common knowledge that they’re mastering well and that we’re all like good at and what things are surprisingly difficult?

    Steve: You know, I think especially for people that are, they have a little bit of experience, but they’re kind of starting to blossom a little bit. One of the things they often need the most is confidence. And so people will often describe a scenario that they were in. People are messy and people are unpredictable and so all these things happen and so they go in with the best of intentions and plans and then things change a little bit. Somebody mentions their divorce, they don’t know what to do. And so I feel like my job isn’t to tell people that they’ve screwed up and that’s not how you do it. I think my job is to tell people that the thing that you encountered is very common. It’s a thing that a lot of researchers struggle with. I try to handle it this way, but there are situations when I handle it this way. Like I think I have a lot of like specific guidance and best practices, but all of those come with a lot of subjectivity and that it’s sort of the nature of the work to be a little confused or uncertain and to have to try things.

    And by the way, there’s no right choice. When someone mentions their divorce, you know, not even me versus you, like me versus me. The next time I would do that interview, I would handle it differently. There’s a moment and if the divorce was brought up in minute three versus minute thirty, like it would play out differently. We’re not algorithms, we are, I think improvisation is a big part of it. So I think I want to help junior researchers feel okay about that there’s no one right way to handle this and that their way of, the fact that they felt confused and uncertain in a situation and they made this kind of, here’s how they addressed it, it’s, I rarely tell them like, well that’s the worst thing you could have done. It’s usually, they usually are doing their best. That confidence to make a different choice is kind of what I want to help somebody with.

    You know, more experienced interviewers, I think I like, I like working with them because I think we can have a better, richer conversation about what are all these choices and what are the differences between them and you know, I love being in a workshop where I’ve got people with different experience levels because then I might give some guidance and then we can have different people suggest, well you know, here’s what I’ve done. Everyone can learn from each other and sometimes we can debate and I don’t mean that in like a right and wrong way but I think someone with experience, we can have a really interesting conversation where we look at one scenario and four or five different ways to handle it and we might disagree on what is sort of the optimal way.

    I think what that surfaces is that as individuals, as interviewers, we’re all wired differently and we all have different instincts and different personalities and you can get away with things that I can’t because of my age or my gender or my energy and I can get away with things that you can’t and getting away is maybe there’s the wrong framing but there’s just so many interesting choices and it’s I love hearing about other people’s things and thinking like, oh could I have that amount of friendliness or that amount of stillness or that amount of curiosity or that amount of empathy, you know, could I present those things in different amounts? You know, that’s part of being an expert interviewer is you have your own personality and your own strengths and you can exercise different facets of that as the situation requires. I think, you know, a more experienced interviewer has more adaptability to that, has their core, you know, but also can put on different authentic, true to themselves faces and energies to kind of support different kinds of situations that are going to come up in these interviews.

    Again, that was me speaking with Jane Portman on UI Breakfast. You can check out the whole episode and you should totally buy a copy of this new edition of Interviewing Users. You can also check out portigal.com/services to read more about how I work with teams and companies.

    But now, let’s get to my conversation with Mani Pande. She’s the Director and Head of Research at Cisco Meraki.

    Well Mani, thanks so much for coming on the podcast. It’s great to get the chance to chat with you today.

    Mani Pande: Thank you, Steve, for having me on the podcast. I’ve been listening to your podcast for several years, so it’s great to be a guest on it.

    Steve: Excellent. Do you want to start us off with kind of a little introduction to you and then we can build a conversation from there?

    Mani: Sure. So my name is Mani Pande, and currently I lead the UXR team for Cisco Meraki. And, you know, Cisco is a really big company, and I am part of the networking division. And within the networking division, like, there are two types of primary products, I would say. Meraki, which is their SaaS offering, and then enterprise networking, which has primarily been their on-prem offering. And my team works across both Meraki and enterprise networking. So it’s a pretty big team, and it’s a big part of Cisco’s business because for Cisco, networking is still bread and butter. So there’s a lot of interesting work that the team does, and a lot of the work that they do does impact the products that we ship and also has a — hopefully has a lot of positive impact on Cisco’s bottom line.

    Steve: Do you have any examples you can share about situations where research impacted something that the product was doing?

    Mani: Yeah, so one of the — my manager is a big believer in the double diamond approach, you know, starting from doing foundational work and then moving on to doing — you know, once you have the design, testing it, doing concept testing, doing usability testing, and once you have shipped it, you are also — you also have some kind of metrics that you have used to define success and trying to measure those. So there are several examples that — where the team worked throughout the double diamond process and was able to make an impact not only just in defining, you know, what kind of product do we want to ship, but once we had some concepts, they helped define what are some of the hypotheses that we want to test, did a lot of concept validation, did a lot of usability testing because we had a lot of designs that we wanted to test, so pressure tested them in front of our customers.

    And then, you know, once we had shipped the product, like we wanted to see that, you know, our customers are happy with it. Do they really like it? Was it worth the effort? So, you know, doing some kind of like customer satisfaction surveys as part of getting continuous feedback from customers. And also, you know, even when we did those customer CSAT surveys, like we got a lot of open-ended comments, and we got some feedback from customers such as, “Oh, you know, this product, like this is not working or that is not working.” So those were good early signals of what we needed to change before it escalated into a support issue.

    So there are several projects that we have worked on. At Meraki, we always call our projects by planet names. So there’s a project called Jupiter. There’s a project called Aurora, which is all around, you know, providing more visibility for on-prem devices to show up on the Meraki dashboard so you can see them as well as manage them through the SaaS product.

    Steve: And so is it the same, roughly the same, group of researchers that are working through the stages of the double diamond like you described?

    Mani: Yes, I would say like it’s like, for example, for one of the projects that I mentioned, Aurora, it was the same researcher who worked through the whole process and obviously in very, very close collaboration with the designer. So they were a tight-knit team that worked throughout the double diamond process. So, you know, yes, they are.

    Steve: Is that an example of a researcher being embedded? I know that’s kind of a buzzword. Is the researcher embedded with that team in that case?

    Mani: And I know a lot of people, especially research leaders, have a lot of opinions about whether you should have an embedded model versus not an embedded model. In our case, I think just because of the complexity of the domain, having an embedded model is extremely important. I, you know, I have worked across, like I’ve worked at Wikipedia, I’ve worked at Lyft, I have some B2B experience, I used to work at SuccessFactors where we made HR software, I worked at Samsung, I worked as a consultant where, you know, you just need to know a little bit to be effective.

    But I have never, ever worked in such a technical domain, which is networking. Like every day at work, I feel like I, sometimes I feel like, okay, I know a little bit, but then there are days where I feel like I don’t know a thing. So for us, I think the embedded model is extremely important because you need to have a little bit of domain expertise to be able to do research a little more intelligently and more meaningfully.

    And another thing like I feel, this is my perspective that an embedded model works better. And, you know, even when I worked at Lyft, which I would say in terms of complexity is nothing compared to enterprise networking, we still had an embedded model because what an embedded model enables is relationships, which are harder to form if you’re not in an embedded model. Like for researchers, one of the things that, you know, we do is that we lead or we bring about change without authority. So for that, I feel like having relationships is extremely important.

    In fact, you know, like there’s this article that I came across recently from Harvard Business Review, which, you know, they had listed like the three things that you need to do to lead without authority. One of them was relationships. Like having relationships with your PM partners, with your design counterpart, engineering, data science, it’s extremely important to be able to bring about change, to be able to show that, you know, what you are hearing from your customers matters, to be able to change hearts and minds. Because I sometimes feel that we are in the business of changing hearts and minds. You know, a lot of people, like I have worked with a lot of PM partners, they have very, you know, some of them have very strong opinions. So to be able to bring about change, I feel that having strong relationships is extremely important. So I am a big believer in the embedded model.

    Steve: Are there other things that they have to do or that you encourage them to do to build the relationships in the way that you’re talking about?

    Mani: There are various things that I have done all my career and that I also encourage my team to do. So one of the things is I feel like to have a good relationship, you need to bring, and this is more important if you’re an IC and you’re doing the research yourself, is to bring your designers, to bring your stakeholders along for the ride when you are conducting research. So that’s one thing that I always encourage my teams to do: invite people to come to your research sessions. Make sure that they are involved in helping you come up with the insights. Like obviously the researchers are going to do the heavy lifting. Like you don’t expect the PMs or the engineers or the data scientists to do the heavy lifting. But do a workshop with them and ask them, like what did you hear from some of the interviews that you attended? Like what resonated with you?

    And also the other thing that comes out with that is that you have less resistance towards the end. People are less likely to challenge you. So it ensures that everyone is kind of on the same page from the beginning. So I feel that it’s also good for relationship building. So that’s one thing that I always tell the ICs to do.

    And myself as a research leader, like I obviously try and build relationships with whoever are my counterparts. And what I’ve always done is, like, build relationships, especially with, you know, PMs who are like the head of the product that your team leads. So for example, when I was at Lyft, like I worked on the Driver app. So I used to meet with our head of product management for Driver at least once a quarter to make sure that I also had a good relationship with them.

    Another thing that I learned through that was working with them, it was easier to figure out what we should do long term because a lot of product managers are only thinking about, you know, what they have to deliver within the quarter or maximum the six months. Like if you build relationships with the leadership, like it enables your team to work on more long term projects. So that was just a learning that I had. And I always try and do that is like have a good relationship with the head of product, like people, you know, two or three levels above me. I mean, they have a lot of ideas.

    Steve: How does the relationship support the longer term conversation?

    Mani: Like they are thinking more about, you know, like where the business needs to be. They’re not so focused on the product roadmap. They are not so much thinking about, you know, this is the feature that I need to ship tomorrow. And also, you know, you can get them to say yes to something that you feel that the team should be working on, but there might be a little bit of pushback from the product team. So if you get their blessing, you know, you can be working on those projects.

    Like, you know, as researchers, we have a lot of opinions. And I always tell people like you have to have a point of view. You are spending so much of your time with the customers. You’re talking to them. Like if you don’t have a point of view on what research we need to do or what matters to our customers, then you’re probably not doing a good job. So like let’s say if you have a point of view on some research that needs to be done, but it’s much more long term. You know, you’re probably not going to see the impact, or nobody is thinking about it in terms of their roadmap for the next quarter or the half. Like it’s easier to get a buy-in from the executive. And to be able to get that buy-in, like you have to have a relationship with them. That’s what I have experienced and it has worked for me in the past. That’s what I do, but I also encourage my ICs to do it.

    Steve: So it’s you as the leader are having the relationship and the buy-in for the longer term pieces. This is not things that your ICs and researchers are focused on. This is your role as the leader.

    Mani: But you know, a lot of people feel intimidated to go and meet the VP. So I encourage them to do it. And in fact, you know, like one of the other things when I used to do this was, if I had one of these conversations, I would invite the ICs who were relevant for that conversation to be part of it. So they didn’t feel intimidated to be having that conversation and they could also participate in that conversation and hopefully, going forward, they can do it themselves without me being there.

    Steve: So when you talk about relationships, you’re creating them yourself. You’re encouraging others to do that. And then you’re, I guess, enabling or facilitating relationships between other people. You’re talking about a number of different fronts for building these relationships: the workshops, inviting people to come to sessions, having these sort of, I guess, planning meetings or discussion meetings.

    Mani: Yeah, ultimately my role as a research leader is to help my, enable my team. That’s how I think about it. That is one of my important goals. I would say not the only goal. So whatever I can do to enable that, I always try and facilitate that.

    Steve: One thing that I’ve heard a lot and that I experienced myself is that the kind of relationship building you’re talking about, whether that’s workshops, participating in interviews, or just kind of meeting, is harder or it’s at least different when work is remote. And I wonder, have you seen changes in how you’re doing this relationship building or how you’re helping others to do it over the past few years?

    Mani: Yeah, I mean you can think of it as a glass half full or half empty. That’s how I think about it. I think it’s still possible. In today’s world, like we have so many more tools. Like I would say this would have been so much harder 10 years back. Like doing a remote workshop is so much easier. You can use FigJam, you can use Miro. Like even our stakeholders, like everybody knows how to use those tools. And there are so many templates that you can leverage. Actually, you know, in some ways it’s easier to do it remotely versus do it in person.

    But I agree like, you know, in person there’s a lot more energy to it. There is something about being in person at the same time. Like the vibe is very different. But with the tools that we have, I think it’s just so much easier to do that. The last two jobs I have started, I’ve started them remotely.

    Steve: That’s a good reframe on my question. I think glass half full, glass half empty is a lovely way to look at it.

    Mani: And honestly, like if I compare the last two jobs that I have had, like did I feel any difference? I would say it’s a little harder. It takes a little longer, right? But it’s not impossible. And you can get to that same level of relationship over time. The one thing that I have done though is like I’ve also had this privilege like, you know, when I worked at Lyft or at Meraki, I live in the Bay Area. The offices are in the Bay Area. So when I go to the office, I think of those days as days of relationship building. So I go and I meet a lot of people. And at home it’s obviously a little more focused work or you could be in bigger meetings. And how I think about how I spend my time in the office has also changed a little bit in the last four years.

    Steve: Right, we might not have explicitly done relationship building before, maybe not even having that as an intention, going into the office to build relationships.

    Mani: Yeah. So not only just with stakeholders, I think it’s also important for the team. Like when I was at Lyft, our team, you know, like when we went to the office, we all went to the office. So it became more like a team day. Not very productive, I would say, but it was good for, you know, team bonding and it was good for us in terms of, you know, coming to know each other and just building our relationship and figuring out what kind of a team we are together.

    Steve: You mentioned this article about change without authority and you said that they listed three things and one of them was relationships. Putting you on the spot, do you remember what the other two were?

    Mani: I do actually because I did a blog about it. So I’ve given it a lot of thought. The other one was expertise. I think that’s also an important one for researchers because in the last, you know, few years, there’s all this, everybody talks about DIY research, you know, you can farm out research, everybody can do research, and there’s also this perception, you know, how hard is it to do research? Like, you only have to talk to people, right? Like, you and I are talking. We could be doing research right now. So I think it’s important for us as researchers to show our stakeholders that, you know, research is not as easy as it appears. It’s very easy for us to do bad research and get bad insights from that.

    So one of the things that I did at one of my previous jobs was we used to do these immersion events where we would bring, you know, everybody who worked on, who was our stakeholder, to come and listen and talk to our customers. And we would do these focus groups, they were like a whole day event. And in the beginning when we started that, there were folks from marketing and ops team who ran some of these focus groups. And when we got feedback about the immersion, it was very clear that everybody realized that when researchers are not doing the moderation, the kind of data that you get is not good. And the conversations were not that interesting. They didn’t feel that it was a good use of their time. So I think you can have your stakeholders experience it, that it’s not that easy to do moderation.

    In fact, I feel like when I was an IC, at times, like I would do four to five interviews in a day, I would be mentally exhausted. You know, doing moderation is one of the hardest things to do because you’re multitasking at another level. Like you’re trying to, I mean, I’m a little old school, I would take notes, I would try and listen to what the person is saying and figure out like, do I follow my interview script? Do I change it? So just having people experience that is important. So I think show and tell could be one way of showing that, you know, research is hard.

    And then when it comes to quantitative research, like, you know, writing surveys, I think that’s a pretty specialized skill. I have seen people, you know, think that they can write surveys, but there’s a lot that goes into it. You can get absolutely wrong responses if your question is not well designed, if your scale is not well designed. So for quant research, like I have a little bit of a strong opinion on that, that I don’t think it should ever be DIY. It shouldn’t ever be left to unskilled stakeholders to write a survey. So that’s one thing, like, you know, just showing your expertise is one thing that you can do to lead without authority.

    The other one they mentioned was about business. Organizational understanding is the third one. That is very important. One of the things I will say, as I said, you know, I have a long career, I worked for 20 years after grad school. I have often seen researchers resist or not want to have a good understanding of how the company makes revenue, earns profit. They’re a little wary of that part of the business. I think it’s also important for us to understand, like, how does the company make profit? Like, for example, if you work for a B2B company, like, I would say that you should have a relationship with the sales team to understand how do they sell, what do they sell, like, what is some of the feedback that they get from customers.

    So I think having a little bit of understanding of revenue and profitability is important. Like, for example, if you work for a company that’s in the gig economy, which has, you know, a marketplace, like, it could be Lyft, Uber, or DoorDash, the marketplace is the crux of the business. That’s how the company makes money. So understanding the dynamics of the marketplace is an important thing that a researcher should do. Let’s say if you are a researcher at Uber and you’re working on the Rider app, you should still have an understanding of the marketplace. Usually, you know, the marketplace is kind of on its own, but irrespective of whether you work on the Driver app or the Rider app, you should have an understanding of how, you know, Riders and Drivers are matched, because ultimately that’s where the secret sauce happens and that’s where the company makes money.

    I think a lot of us come from an academic background and maybe a little bit of a purist background. This is just a hypothesis, but maybe that is where it comes from. I mean, I would say to a certain extent, even I had it, like, early in my career. It took a while for me to realize that understanding revenue, profitability is important. One thing that I would say, like, most researchers do agree on is understanding business priorities and, you know, doing research that aligns with business priorities. I have never seen resistance on that, but when it comes to revenue and profitability, I have seen researchers, like, have a little bit of resistance to that. Like, for example, I have some friends who work at Facebook and Google, you know, how they make money is ads, but many of them did not want to work on the ads team.

    Steve: That seems different to me, and I’m just thinking about myself here. That seems different to me than understanding, like, how does the system match riders and drivers? When you say that, that kind of sparks my researcher curiosity. Like I think we would want to know that because what’s behind that secret wall and how do the gears all mesh? But again, my own bias here when you say working on ads is not, I kind of get that feeling too. Like I just think like, ugh, ads. And so to me that seems different. And I don’t know, I’m not trying to pin you down on something because you’re talking kind of subjectively, we each have our impressions. But I think this idea that, to your point about changing without authority, that there are these important things to understand. I guess there’s a difference between wanting to work on Facebook ads as your project and having enough of an understanding of the revenue model, which impacts everything you would ever do research on at Facebook I think.

    Mani: So I think both. What I’m trying to get at is understanding the money part of the business is important. Like, and the money part of the business for different companies is different.

    Steve: Yeah.

    Mani: For a gig economy, it’s the marketplace. For Google and Facebook, it’s ads. So having some understanding of how your company makes money is important for researchers.

    Steve: And not to flog a dead horse here, but I feel like companies like Google and Meta or Facebook, it seems like their culture is such that how the company makes money is sort of kept in a separate box, and we’re going to come here and work on whatever the latest amazing thing is that’s going to change the world, Internet through balloons, you know, some amazing project. And so I’m speaking out of my hat here, but I have some empathy for people that don’t want to think about the money because they’re not being sold that as an employee. I don’t know if that’s true. I’m hypothesizing that the company culture kind of keeps those things in separate buckets. But if you work at Lyft or Uber or DoorDash or Meraki, there’s a product and a service that’s much more essential to the conversation they’re having. Again, this is not my direct experience. I’m just kind of — I’m just giving you my biased interpretation of what you’re describing.

    Mani: I think just the basic level is probably, I’m sure, like, you know, they keep it under wraps to a certain extent, but just like having a basic understanding and not having an aversion to it is important.

    Steve: So when you talk about these factors, right, understanding the business, so there’s some specificity, the expertise and the relationships, and you talked about Cisco and Meraki being just this level of complexity, the sort of technological and I guess industry-specific stuff. Does that complexity become a compounding factor or something in trying to achieve those levels of those three factors that you’ve brought up?

    Mani: I think especially for something this technical, having some domain expertise matters. And that goes back to having some, that goes back to the first one that I was talking about, which is expertise. Like having some domain expertise becomes important because if you want to have a meaningful conversation, you know, if you’re doing an interview, like you need to have some basic level of understanding to have a meaningful conversation. I know in research we say that, you know, no question is stupid, but if you have zero understanding, you’ll only have stupid questions, at least in this kind of a domain. So I think having some understanding is important, and that’s what I tell all my researchers.

    Like after I joined Meraki, I have this cheat sheet I call “Mani’s Cheat Sheet” about networking, and every time I hear something that I don’t know, I get a version from Google and then I get a ChatGPT version, like the please-explain-it-to-a-middle-schooler version, which actually is the version that works for me. And that cheat sheet is now, I’m like, huh, like 20, 30 pages long? It just keeps on increasing. I’m not going to be an expert on networking. I don’t want to be, but I just want to know a little bit to be able to have meaningful conversations and to also be able to provide, you know, feedback to my team. Like sometimes, you know, when we have an important presentation, like I work with them to figure out like what are the insights, like what are some of our action items, but if I have no understanding of the domain, I can’t do that.

    Steve: Can you talk about, from the period of time that you came to Meraki, like what that progress has been and where research has gone from then to where it is now?

    Mani: So Meraki has seen incredible growth in design as well as research in, I would say, the last one, two years, which is the opposite of where, you know, design as a field and UXR as a field is going at other companies. It’s also because, you know, as I was saying, like the company has really adopted the double diamond approach. So they see research as being an integral part of how you build products. So that’s one of the reasons why we have seen this big, massive growth in the last one year. And the reason why I came here was, you know, as I was telling you earlier, you know, like Cisco had these two products, the SaaS product and the on-prem, and the strategy now is to, you know, convert, which is, you know, have some of the on-prem products be able to be managed in the cloud.

    So they brought these two teams together, the design team for the on-prem side and the design team from Meraki. So that’s how I got hired as the head of UXR because then, you know, the team grew because you had two different teams that merged and the complexity of the business and the complexity of the problems that the team was expected to answer grew because now you had two different businesses that you had to support. So that’s how I got hired and, right, and, you know, our team has grown quite a lot even in the last one year. Like I’ve hired, like just in March, three people have joined the team. And it’s also because, you know, everyone agrees that, you know, you need to do research if you want to build better products. So there is this hunger for research, so that’s the other reason why we’ve been able to grow despite, you know, the market going in the other direction.

    Steve: So you came into this newly formed, newly merged organization that already had an appetite for, a belief in, and a commitment to research.

    Mani: Yeah, and that appetite has been growing. I would say it’s been a steady state because there are more and more teams that we are working with because Cisco networking is huge. As I was saying earlier, it’s like it’s their bread and butter. It’s the major chunk of their revenue. So there are more and more teams that we are working with and that’s one of the reasons why I’ve been able to hire researchers to support teams that did not do research previously but want to do research now.

    Steve: Are there things that you are doing in your role that are behind that? In addition to there being these other teams, do you think you’re responsible for increasing the demand in any way?

    Mani: I would say it’s a team effort. I would not put — I won’t say that it’s just me by myself. But obviously, you know, as a research leader — or I would say as a research professional, we have to evangelize research as much as we can all the time.

    I can talk about my time at SuccessFactors. You know, when I joined SuccessFactors, they already had an IPO. Then they were bought by SAP in 2012 for like $3.5 billion. So the company was pretty successful, but they never had research at that time. And one of my — like what I had to do in the beginning was, you know, just evangelize research and make sure that it became part of how we build products. So in the beginning, I did a lot of, like, evaluative research to show impact and then, you know, slowly as our team grew, we — I mean, continued to do obviously evaluative research, but we did a lot of generative research as part of new product launches.

    Steve: So you talk about that situation, they were bought into research enough that they hired you, but it doesn’t sound like the hunger that you’re talking about now was there at that time.

    Mani: And also it’s like, you know, we’re talking 2013. I know the field has changed quite a lot in that time. But yes, the hunger wasn’t there. So my job was to create a hunger for research. And it was like very different tactics. Like I did a lot of usability research because that gets you — it’s a quick hit. That’s how I would describe it at times, you know, usability research. If you are looking for quick impact, you can get it pretty fast.

    And you can get — you can find people who are on your side who become evangelists of research pretty easily. So that’s what I did a lot when, you know, when I had to build a team there because, you know, there was no research team. I wouldn’t say that — I mean, there was no research maturity.

    Steve: I hear relationships and expertise at least in that story.

    Mani: Yes, there was, because I remember when I joined, there was some usability testing that had been done. But whoever did it, they did not know how to define usability tasks. So when it was an unmoderated test on UserTesting, people were very confused about what was expected from them. And it was very clear that you needed somebody who knew the basics of usability testing to have — who should have set up the test.

    Steve: Just switching topics slightly, you’ve kind of brought us back in time to some different roles that you’ve had and I wonder if we could go further back and maybe could you talk about how you entered the field of UX research.

    Mani: I did not think I was going to become a UX researcher, honestly. Like I have a PhD in sociology. I thought I was going to become a college professor. That was my goal. But my husband is an engineer. So we moved to the Bay Area. And I looked for teaching jobs in the Bay Area. There were none. So I moved — I went — I got a job at Institute for the Future. I don’t know if you know about them, but they are a forecasting research company. So they do a lot of tech research. And I worked there for five years. And I did not even think the kind of research that I was doing was, you know, user experience research, like product strategy research. But that’s the kind of work that I did there.

    One of the biggest clients that we had was Nokia. And I did a lot of ethnographic research, you know, going across — traveling to people’s homes in India, the U.S., and Brazil, and, you know, trying to understand, you know, how do they use mobile phones? Because this was like 2006, 2008. It was still a new phenomenon. And that helped — that was to help define Nokia’s strategy, especially for these markets for the next one to three years. And from that, I kind of transitioned into UXR. And in fact, you know, when I finished my PhD, I did not even know that there was an HCI field. I completed it in 2004. So it was still pretty early days, at least from my perspective, for the field.

    Steve: When you found yourself doing global ethnographic fieldwork for Nokia, did you see any difference between how you were working in that context and the skills and approach that you had developed in your PhD?

    Mani: There are cultural differences that you have to be aware of. One of the biggest things is you should be ready for anything. There are a lot of unknowns. I remember, like, we were doing this project, but we were working for Nokia. Like, we had — I was working with our clients, and we were supposed to go and interview someone, and the person didn’t show up. So there’s a lot of those things that you got to be ready for, like, you did not prepare for. Like, very rarely it’s happened to me that, you know, I did ethnographic research in the U.S., and we had no shows. But it’s happened in India so many times. It’s happened in Brazil. So you got to have these contingency plans and go with the flow. And I think those are some of the things that you have to be aware of.

    And also, you know, especially, like, when I worked in Brazil, like, I obviously don’t speak the language. Like, I do not — like, for me, obviously, India is easier because I have lived in India. I know, you know, what the culture is, like, what are the things that you do. Like, for example, if you go to India — to anyone’s house in India, they are going to offer you tea, coffee, something to eat, and it’s polite to eat it. And it’s impolite if you don’t eat it. And when I went to Brazil, like, I had to work with translators, which I had never worked with before. So one of my big learnings was that it takes you twice as long to do the same interview because they are translating it for you.

    So I think there are a lot of cultural nuances that you have to be familiar with when you go — especially when you do research outside of, you know, U.S. and maybe to a certain extent even Europe.

    Steve: And did you see differences at that point in your career where you’re coming from an academic environment to a commercial environment? Were there points of transition for you as you moved from one to the other?

    Mani: Yes. In terms of, like, I remember when I wrote my PhD dissertation, I took a year to do the analysis after I had finished my qualitative interviews. And, you know, on this Nokia one that I was talking about, like, we were sharing research insights from every interview every day. So I think that is the big transition that you have to make, like, if you come from academia: the pace. And also thinking about, you know, how does this impact the business? Like, how does this impact the product? And also working with so many stakeholders. Like, those are some of the big changes. Like, when I was in academia, the only time, like, I worked with others is my advisor provided me feedback for my dissertation. And when I published, I had these three reviewers who gave me feedback for some of the articles that I published. But, yeah, you know, the way you work with stakeholders is so different. And also, you know, we’re doing a lot of quick and dirty work, which obviously, you know, you don’t do in academia. Especially if you do that, it won’t get published.

    Steve: So after this Institute for the Future experience, was user research that was your career path at that point?

    Mani: Yeah, and then I joined Wikipedia, which was actually a very amazing experience in terms of what I was able to do because — I mean, Wikipedia is like, I have never met anybody who doesn’t like Wikipedia. You know, as I said, I have worked in different industries. More often than not, like, we meet customers who say, like, I don’t like this about your product, I don’t like that about your product. But for Wikipedia, it was very different.

    So one of the things that I did when I joined Wikipedia was I did the first ever survey of Wikipedia editors. So Wikipedia, as you can probably guess from their mission, unlike most companies, barely does any tracking. Like, they don’t use, for example, cookies. So we used to rely on ComScore to even get the numbers for active readers for Wikipedia because they were not using cookies. So they did not know at that time, like, what percentage of editors are women, because Wikipedia, even today, has a gender problem. So I did the first ever survey of Wikipedia editors, and the answer at that time was 9%. Only 9% of the people editing Wikipedia were women. And as a result of that, there are fewer articles about women on Wikipedia versus men. So there’s a big gender bias in Wikipedia. So that is some of the work that I did when I joined Wikipedia, which I thought — and I also worked on the redesign of the mobile app. So I again did ethnographic research in the U.S., India, and Brazil for the redesign of the mobile app for Wikipedia. But all that work was very fulfilling just because of the amount of impact that you could have on users. Because I don’t remember the numbers now, but at that time there were 250 million active readers on Wikipedia every month, and there used to be like 7 to 8 billion page views.

    Steve: As a user researcher over your career and maybe as a leader now, I don’t know, how do you think about different kinds of methods or different kinds of approaches to getting the data that you want to bring your team?

    Mani: I mean, I’m trained as a mixed methods researcher, and when you think about mixed methods, I like to believe it’s on a continuum. There are some people who do small surveys and think that they are mixed methods researchers, only doing descriptive statistics. There are some people who do regression analysis, who will do, I don’t know, structural equation modeling, latent class analysis, who are also mixed methods researchers. So it’s a little bit on a continuum, even for qualitative, right? Like you can be an ethnographer. You can be somebody who really believes in participant observation, or you can just do qualitative one-on-one interviews. With that said, like I do both because I just happen to be trained. When I did my PhD, I specialized in research methods, which meant that I had to prove that I could do both quant and qual work. So I did a lot of statistics. I did a lot, you know, I took a lot of qualitative courses. And in fact, you know, my dissertation was a qualitative dissertation, despite, you know, having a very deep background in stats. So I do both.

    But with that said, I would say you should be methodologically agnostic. Like it should not, depending on what’s the problem that you’re trying to solve, you should figure out what’s the research, right research method. So I think that is what is more important. And that’s why I feel like having a mixed methods background is good, because then you just have more methods in your toolkit. So you can figure out like, you know, for this problem, probably a survey is a good solution. Or for if I want to find the answer to this problem, maybe doing ethnographic research, qualitative interviews is the right thing to do. And, you know, you can use all these methods throughout the process. Like you can do a survey, you know, along with some foundational research.

    Like I can give you an example. When I worked at Wikipedia, I’m very proud of the feature that we released. I’ll talk a little bit about that. So when I worked at Wikipedia, I interviewed this person in Salvador, which is, you know, in the north of Brazil, in the state of Bahia. And he was a Wikipedia editor. And obviously, like, you’re a Wikipedia editor, you’re a big reader too. And at that time, he told me that, you know, he wanted to read Portuguese Wikipedia, but he felt that it was not mature enough. Some of the articles did not have the kind of detail that he was looking for. So he would often flip to English Wikipedia. So I thought that was an interesting thing that he mentioned. So after we did the ethnographic research, we followed it up with a survey and we asked people, like, you know, how many language Wikipedias do you read? And, you know, we got data. Like, if you read English, what else do you read? So obviously it was clear that, you know, English is the primary one. But more often than not, people read other language Wikipedias too. Especially outside of, you know, the US, I would say, and even in Europe, like German Wikipedia is very big, but the German readers are also reading English Wikipedia. And that was also very clear in the survey data that we got. So we introduced, it’s even there today in the app, what we call inter-language Wikilinks, which made it really easy to toggle between different language Wikipedias. So, for example, like if you search for an article, it tells you that, you know, this article is also available in this other language. You can set your primary languages, you can set your secondary languages. And all this came out through this, you know, ethnographic research that was done; we followed it up with a survey, which gave us more confidence that, you know, this would be a useful feature. And it still exists like 10, 12 years after we did that research.

    Steve: That’s a great story, and one detail that excites me is that small data point from one method. You find new questions in research. You found something and then you did some more research to kind of understand that phenomenon that you didn’t know about.

    Mani: Yeah.

    Steve: That’s such a great example of that. That the ethnography was not sufficient to make something new, but it did point to a whole new effort to understand something, which then led to that feature.

    Mani: Yeah, but ethnography gave us that insight, which I don’t think I would have got from a survey, right? So the survey just gave us more confidence that, yes, I think that’s worth building.

    Steve: Yeah, that’s a good clarification. We were talking about mixed methods and about different kinds of tools kind of feeding together. You talked early on about relationships, so it just makes me think about some of the different roles that we have, with data science being a big part of what so many companies are doing. How do you see kind of the relationship between UXRs, mixed methods and otherwise folks, and data science?

    Mani: I think there’s a lot of synergy between what UXRs do and what data scientists do, because ultimately we are both researchers. It’s just that we are looking at different types of data. But if we bring it together, I strongly feel that we can have a much more holistic understanding of our users. And, you know, in my career, especially at Wikipedia and Lyft, and actually even at Meraki, I was lucky to work very closely with data scientists. So there are various ways we can work with data scientists. Like, for example, if you’re doing segmentation, you know, often we do segmentation of our users just based on attitudinal data. But you can get the behavioral data from data scientists and bring that together to just have a better segmentation model of your users. Various companies have dashboards to, you know, look at how their customers are doing and tracking. Many of them, especially in B2C, tend to be more behavioral based versus attitudinal. So I think there’s an opportunity there to have dashboards that not only tell you what the users are doing, but also how they are feeling. I think that’s an important story to tell. So that’s one thing one can do.

    Then there’s also an opportunity to work with data science during A/B experiments. Like in my past companies, I have worked with data scientists when they have an A/B experiment to, you know, run a survey along with the experiment to see if there are any differences between the control and the experimental group on not only what they’re doing, but how they are feeling. And the one thing that I learned is that there’s a little bit of lag. Like if people are unhappy, it takes a little while before they stop doing the thing. So, like, you know, doing that survey gives you an early pulse that, okay, maybe this experiment is not working as well as it should. Maybe we should tweak something so that the users are happier. So those are some of the opportunities that I see working with data scientists.

    Steve: Is there anything else today in our conversation that I should have asked you about or you want to make sure that we talk about?

    Mani: I can’t think of anything else right now. I’m sure I will think of something later.

    Steve: Well, Mani, it’s been really lovely to speak with you and learn from you. I want to thank you again for taking the time to chat and share all your stories and experiences. It’s been great.

    Mani: And thank you, Steve, for having me on the podcast.

    Steve: There we go. Thanks for listening. Tell your friends. Tell your enemies about Dollars to Donuts. Give us a review on Apple Podcasts or any place that reviews podcasts.

    Find Dollars to Donuts in all the places that have all the things or visit portigal.com/podcast for all of the episodes complete with show notes and transcripts. Our theme music is by Bruce Todd.

    The post 39. Mani Pande of Cisco Meraki first appeared on Portigal Consulting.
    14 March 2024, 2:06 am
  • 1 hour 13 minutes
    38. Vanessa Whatley of Twilio

    This episode of Dollars to Donuts features my interview with Vanessa Whatley, UX Director – Research & Documentation at Twilio.

    For many years, I had anxiety and regret around not starting my career in the field that I’m in sooner because I felt very very lost stumbling through all of the different fields and roles, and only in hindsight do the dots connect. I’m better at what I do now because I learned the lessons in all of the different jobs. Even something like being an executive assistant, I was able to sit in on more senior leadership meetings, and I picked up really early on short attention spans: How do you get your point across concisely? What do they care about? And I think that made me a better researcher right away, even as I was still learning the practice, because it taught me something about communication…I think all of those little pieces along the way just shaped how I interact with people and I think has made me better at what I do today. Maybe just know that it’s all connected somehow. – Vanessa Whatley

    Show Links

    Help other people find Dollars to Donuts by leaving a review on Apple Podcasts.

    Transcript

    Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization. As part of the upcoming Advancing Research Conference, I’m teaching a full day in-person workshop about user research. It’s March 27th, 2024 in New York City. This is a rare opportunity to learn about interviewing users from me in person. You’ll also have the chance to engage with other researchers at different levels of experience in an interactive environment. I’ll put the link for the Advancing Research Conference in the show notes, with more information about how to register. If you know someone who would benefit from this workshop, then please pass this info along.

    The newest version of my workshop makes use of the writing and rewriting I did on the very recent second edition of Interviewing Users, which you should absolutely buy several copies of.

    Shortly after the book came out, I had a conversation with Darren Hood for his World of UX podcast. We got into the intricacies of asking questions. I’ll link to the whole episode, but I’m going to include an excerpt right here.

    Darren Hood: And the topic of chapter six, the title is the intricacies of asking questions. And I love this because, when I’m teaching people about research, and outside of the classroom, when I’m talking to people about research, this is the topic for me that comes up the most. This one particular thing that you mentioned, and I’m going to read another excerpt from the book. And there’s a heading here, “There’s Power in Your Silence.” Oh my God, how many times have I talked to people about this? Steve says, “After you ask a question, be silent. This is tricky because you are speaking with someone you’ve never spoken to before. You’re learning about their conversational rhythm, how receptive they are to your questions, and what cues they give when thinking about an answer. These tiny moments, from part of a second to several seconds, are nerve-wracking.” And I love that because it’s one of the things that I see: people will ask a question, and it’s funny to watch people grind their teeth when the participant is silent. To watch the researcher hem and haw, wanting to help the participant, and things get out of hand sometimes. I have seen people jump practically across the table to try to guide somebody because they just couldn’t stand the silence. But the title says it, the subheading there, or the heading in the chapter, “There’s Power in Your Silence.” And so I’ll hand it over to you to elaborate on this topic.

    Steve: Yeah, and I think you described some of the phenomenon pretty well. And there was a moment in this conversation, I think, because we’re on video, even though we’re recording audio, we’re looking at each other and nodding and doing all the, as best we can over video, kind of feedback. There’s a point in which you said, “Oh, I see the gears in your head are turning, Steve, and I’m going to turn it over to you.” And I think, as a trained interviewer, as an experienced podcaster, I was like, you learn what that is. And there’s been moments where I’m asking a question and I’ll just stop. I don’t need to finish my question. The person is ready to talk. And I’m going to ask a follow-up question. I’m going to ask dozens of follow-up questions.

    So if the thing that they want to say is not exactly what I want to ask them about, it’s better for the whole dynamic to have them go and then me follow on and follow on and follow on as opposed to like, “No, no, no, wait, let me make sure you understand exactly what my question is so the information that you give me perfectly conforms to the parameters which I am articulating.” Like that’s not how research works. It’s this sloppy interface between people that kind of goes back and forth. And so you have to understand your role is to kind of draw that out of them. And so that hem and haw thing, or even trying to help them, as you say, I think is really important, because if you can’t allow for that silence, the anti-pattern or the bad behavior that comes out is asking these run-on questions. And the run-on questions are deadly. And in those run-on questions, people start suggesting. So you’re going to ask what could be an open-ended question, like what kind of microphone do you have for your video calls? But the run-on question is what kind of microphone do you have? Is that a USB mic, or is that a Shure microphone, or is that part of your headset? Like you start suggesting possible answers.

    Darren: Yep, yep.

    Steve: And the motivation for doing that, I think there’s a lot of, you have to pay attention to yourself. It feels like you can kid yourself that you’re being helpful when you do that. I’m being helpful. I’m just showing them what examples are. But in fact, it’s because you are, and I shouldn’t say you, I should say we, this is, I’m in this all the time. It’s uncomfortable to stop and just say, what kind of microphone is that? For all the reasons in that quote that you described, like, I don’t know what’s going to happen. I’m going to lose face. I’m going to be seen as an idiot. That person’s not tracking with me. My boss is going to watch this video. There’s all this risk in that moment. It’s kind of like a little abyss that you’re kind of peering into. But if you start suggesting things, it messes up the power dynamic. It says, for one thing, that the participant is required to listen to the interviewer going on and on and on.

    Darren: Right.

    Steve: And it also starts to say over time that their answers should be multiple choice, that it’s a multiple-choice question. So you, participant, should be giving answers within the format that I have outlined. So you might think that’s ridiculous, Steve and Darren. If it’s none of those mics, the person’s just going to say, no, this is just an old karaoke mic that I brought up from the basement. They’re going to give you an answer that’s outside that list. And the first time they will, and maybe the second time they will, but eventually you are training them as to how to do a good job, which they want to do. They want to do a good job for you. And so they’re not being squelched from sharing their truth about microphones. They’re just trying to get through this interview and do a good job.

    So the more you teach them indirectly what a good job looks like, in other words, one of the following, the more you risk not hearing from them and not kind of getting stuff. And we don’t realize the power we have over people, despite being kind and self-deprecating and telling them at the beginning, I just want to hear from you. Just tell me your truth. And, you know, there’s no wrong answers. You can do all that. It doesn’t matter; once you start training them what good looks like, then that’s what you’re going to get. So I think it’s hard. And the quote you read kind of explains why it’s hard, and we trick ourselves that we’re helping, so that makes it harder. The risk in this, I think, is significant because it accrues to changing the dynamic in the interview and changing what it is you’re going to hear.

    Darren: Yes, yes, absolutely.

    Steve: Again, that was Darren Hood’s World of UX podcast. Now let’s get to my conversation with Vanessa Whatley. She’s the UX director of research and documentation at Twilio. Vanessa thanks so much for being on Dollars to Donuts.

    Vanessa Whatley: Thank you for having me.

    Steve: It’s really great to have you here. So I’m going to just do my cliched opening. The only thing that’s really effective, I think, is just to throw it over to you right away and ask you to give a bit of an introduction to yourself.

    Vanessa: Sure. My name is Vanessa Whatley. I lead our research service design documentation team at Twilio. So my title is UX director, but I have all the UX functions outside of design is how I like to explain it.

    Steve: And what’s Twilio for those of us like me that don’t know?

    Vanessa: Yeah, great question. Usually people think it’s Trulia, like the real estate company. So we’re not that. Twilio is a communications API company, and specifically the portion that my team works on is Segment. So that was a product that was acquired by Twilio a few years back, and we are essentially a customer data platform.

    Steve: What companies or people or roles use Segment and what do they do with it?

    Vanessa: Yeah, we are B2B, so lots of large as well as small SMB companies use Segment to essentially get a better picture of their data. So a lot of times companies are collecting data in a lot of different tools, a lot of different systems, and then it’s siloed and they have a hard time reconciling it. So if you are, you know, Steve on the mobile app and Steve on the website, and then you’re interacting by replying to an email, you might look like three different people. And so Segment really helps bring all of the data sources together, unify it into a single profile, and then from there it lets companies interact with you in a, I guess, a more intelligent way because they know who you are as a single person rather than three different IDs.

    Steve: Can you briefly put Segment in the context of sort of the larger set of things that Twilio as a company does?

    Vanessa: Yeah, so if you think about Segment as kind of the data layer, so that can be your foundation of you understand who your customer is really well, what they’ve done with you in the past, or even predict what they might do with you in the future. And then from the data piece, a lot of companies want to actually activate on that data. So they might want to send you a text message or send you an email, and so the rest of the Twilio portfolio kind of has more of the communications APIs so that you can go on ahead and like use that data to actually engage your customer. Another good customer example could be someone like a DoorDash where DoorDash is trying to connect a restaurant, a driver, and the person who ordered the food. Instead of them building everything from scratch in their app, the communication piece of me being able to text a driver without having to give my own number and the driver seeing my number and vice versa, you can use an API so that they can communicate without DoorDash as an app having to build all of that native into their platform.

    Steve: But so if you’re a company like Twilio and just thinking about, you know, user research point of view, you’ve got those kinds of users that you’re describing in like that DoorDash example, the end user and the driver, the food purchaser and the driver. But you also have the — I guess it’s some kind of IT or development team that’s using Twilio tools to build that so that their end users or their drivers can all communicate. Are you — from a research point of view, where are you focused, if at all, on any of that?

    Vanessa: Yeah, so on the Segment side, like I said, we’re more so the data piece that is powering a lot of different things. And so you’re absolutely correct. A lot of our customers end up actually being either the data team or engineering team within a company because they’re the ones that are essentially most likely collecting the data, manipulating the data so that there’s protocols and that it’s actionable.

    And then ideally it goes all the way through to a business use case. So that might be a product manager or marketer then making decisions on that data set and deciding, okay, we want to run a marketing campaign or we want to analyze this cohort or this audience.

    Steve: I want to go back to the beginning and you were describing sort of the structure a little bit or the areas of the organization that you’re focused on. And I’m sorry, you had a great catchphrase, which I should have written down because now I’ve forgotten it. Can you go back to that?

    Vanessa: Oh, I was saying everything in UX outside of design. Yes.

    Steve: Okay.

    Vanessa: So there is a different person that manages our full design org, but content design, service design, technical writing, and research all sit within my team.

    Steve: Can you say a little bit about the research team?

    Vanessa: Yeah, so we currently have a team of about five, and the makeup of that team has changed a lot over time. So we’ve grown, and with layoffs we unfortunately lost a few people on the team, but overall I think as the size of the team changed, our operating model has kind of shifted along with that. So we started out being a little bit more embedded and really aligning each researcher to a specific area or product area features. And then I would say last year we really decided to go all in on more foundational work and take a little bit less of our demand from product to try to answer larger strategic questions. And now we probably sit somewhere in the middle where we do a mix of product work as well as foundational work.

    Steve: I was expecting that embedded and centralized would be the sort of contrasting terms, but as you’re kind of relaying it, I kind of hear you. I think you’re contrasting embedded and foundational. And maybe you could explain what those endpoints look like as you’re kind of moving between them.

    Vanessa: Yeah, I think you’re right. I never really thought about the fact that usually people say centralized. I don’t love centralized research orgs. I’ve worked in that manner before, but I’m a little hesitant to call it that with how we’re currently structured because oftentimes, at least my experience, I’m sure there’s many ways to do centralized, but in my experience that often means it’s a little bit more like an intake request type of situation where you act more like an internal consultant almost and you can cover a lot of different breadth.

    And what I try to do with my team and why I call it foundational is a lot of times the researcher might have still had a focus area that they are stronger in or just cover an entire flow, like a user journey and sit within that part of the product. Or maybe they specialize in a set of personas, but they’re not necessarily bouncing around to any project that comes up and we’re managing bandwidth that way. It’s a little bit more driven by where we think product strategy is going to go and then trying to still align people to spaces that they can gain deeper knowledge in just because of the complexity of our space too. It’s really, really hard to bounce around and be the expert in the marketing persona, but then also the data engineer. And yeah, I think that’s why I use those two terms even though they’re not actually polar opposites per se.

    Steve: So centralized might mean an intake process, which is challenging if that means that anyone gets assigned to anything. I’m kind of steamrolling over the nuance that you were depicting to kind of check and see. Because I feel like when you’re talking, there’s sort of a couple of aspects. One is like what projects are we going to do? And the other is who’s going to do them?

    Vanessa: Correct.

    Steve: So I don’t know. Are you — does the idea of intake, is that in itself limiting or something that you would try to avoid?

    Vanessa: I think it’s a mix. We definitely talk to all of our stakeholders and try to understand what feature level work and even what foundational work do they want us to kind of produce or participate in, collaborate on. But I actually sit down with my team every quarter and we end up doing anywhere from like 90 minutes to two, three hours of brainstorming where I really encourage them: What are the gaps that you see? What are kind of like big strategic questions or areas where you feel like there hasn’t been enough emphasis or we’re not connecting the dots properly? Because I do think the risk of operating at the feature level means everything’s a little bit more siloed. And as we know, customers experience things in a series of steps or flows or have an entire journey they need to navigate. And so I just try to position my team so that it can really think at that level.

    And I find that sometimes when we do more of the intake model from design and from product that they tend to focus on their scope and their area of focus which might more so sit at the feature level than it does cross-product.

    Steve: Right. So it’s the proactive versus reactive aspect. So people that ask for help from research that aren’t — that don’t know about research as much as you do or your team does are going to ask for the problems to be solved that they think research can help with. But if your team brainstorms, here’s what we’re seeing, here’s where the gaps are, here’s how we can get ahead of what’s going on, then — so now I think the more we talk, the more I understand kind of how you — why you characterize that as foundational.

    That it’s not reactive, feature level. And you haven’t said this, but I feel like when those questions come, sometimes they come late, or when you do that intake model, there’s other ways that you could have helped if you were, like you said, reaching out to those stakeholders and talking about what they’re doing and how you can help them. And you’re saying that you’re kind of in the middle now, you’re somewhere in between, if embedded and foundational are sort of endpoints, you’re kind of in the middle right now.

    Vanessa: Correct. Because I think at the end of the day what I’m trying to balance for is impact. And so there are some, I guess, areas of the product or even feature level things that we know are very critical for us to get done this year. They’re highly complex. They need someone that is thinking in that space day in and day out. And so there are often one or two researchers that are embedded in those spaces. And then there’s broader strategic questions that maybe you’re not getting directly from the PMs.

    Maybe that’s coming even from senior leadership where they’re thinking about the landscape and where do we need to go and broader, less scope, less defined questions. And so some of that will be, I guess, covered by us or we’ll just create bandwidth for so that we can operate at those different altitudes.

    And then we’ve also kind of launched a bunch of internal programs so that the feature level work that does need to get done can still be supported. So we have office hours. We have rolling research. We encourage designers to do their own research or PMs to do their own research. I know that is often a hot topic in the industry. I think we try our best to make sure all of the things that would benefit from a researcher’s kind of attention actually gets that attention. But then we also try not to gatekeep due to the size of our team. We’re just not able to get to everything.

    Steve: I want to follow up on that, but I want to just go back to the more you’re describing. I’m having another reflection, I guess, because you started off saying that you started off in one mode and then you kind of shifted to another and then you made another shift. And, you know, as people try to ask, like, what’s the right model, you know, to hear how over a fairly short period of time, you know, you have — you’ve iterated or evolved and that it makes me think that the answer — there’s so many — there’s so many it depends on, you know, what model to have, and it depends on your company, depends on your team. But also that these are things that change and that there’s no reason to pick one and stick with it, but to adapt as it sounds like you have to changing conditions and right, you know, in another X amount of time, you might go back to fully embedded because the company’s here or your — I guess other factors like your team or other changes in the strategy might lead you to choose a different model.

    Vanessa: A hundred percent. I think those are some of the deciding variables. Team size. So at one point the team grew to 12 researchers, which is why we were able to embed because we had enough to go around almost. And then when we reduced in size, that meant, okay, we need to find the highest value areas. And then I think, yeah, company strategy, thinking about whether your product is in a place where it’s more stable or if we’re in a place where we have a lot of pressure to innovate. All of those things matter. And I think I took into consideration how we can work to still make it sustainable for the researchers too. Because of course we could have played the volume game and tried to crank out three to five projects every quarter.

    But I’ve really been emphasizing let’s choose quality over quantity. And we need time for deep work and for thinking, even if that means you’re cutting down to one to two projects for the quarter and you’re spending much more time synthesizing across past work and doing more foundational or longitudinal work. And luckily we’ve been very much supported by our cross-functional partners and leadership, because I know for some companies it’s like, no, everything needs to be tested before it ships. There’s many different reasons why you might get blocked from something like that. I think we’ve been able to answer questions and show areas where we should pivot because of how we choose to work and the types of insights and the clarity that that provides, so we’re continuing to see support and we’re not really being forced to play the volume game.

    Steve: Is there anything you can point to that helped you establish that footing where there is that support?

    Vanessa: I think a project I talk about internally often happened about a year and a half ago, where it was my first time to really decide to go rogue a little bit and just grab two people on the team and say this is the area that I think we should investigate and publish research around. And then because I think with research sometimes there’s an ask or there’s like a push model. Like no one asked for this, but we’re telling them anyways. And so I think about a year and a half ago was the first time I did that for a larger scale project and we just found different avenues to communicate the information. Essentially the more we got it in front of the relevant stakeholders and leadership, I think the more the problem became clear and people were bought in. And then over time we also saw in one particular area, like, oh, the quantitative data is starting to support that story too, because it was a newly launched product. And so we were almost ahead of the trend in being able to point out here’s some of the challenges we’re going to encounter. And I think just having a few case studies like that helped us prove the value and kind of earn the respect, get the traction for having that flexibility. Of course that could have backfired. It could have not been received well, but I think in that particular instance, that to me was almost my personal proof point for we should keep going down this path, and it also helped me gain confidence that this is the right path to take the team down.

    Steve: The path in this case is referring to what kind of work and how you’re helping the company.

    Vanessa: Exactly. The path is essentially choosing our own topics to go investigate even if no one’s asking for the work. And I mean there’s still ways like we don’t go away for three to five months and then come back and like ta-da we have something cool to show you. There’s still ways to gain buy-in along the way so we are doing our due diligence by crafting a proposal, shopping it around, seeing who at the company could be a good stakeholder to actually implement some of the changes that we’re suggesting. So I’m not saying just go rogue and hope it works out, but I guess it’s more so again the proactive versus reactive model like really taking ownership and saying we actually have things that we think are really important to go after and then advocating for that and pursuing it.

    Steve: And in that first example, the first rogue project a year and a half ago, I heard you talk about, you know, finding this area and choosing to spend resources and people’s time to go do it, but then you, I think, also are highlighting communicating that to some group of people to kind of highlight it. Is there a way to sort of compare the proportion of effort in the — and I don’t know if it cleanly breaks this way — like doing that research and communicating that research?

    Vanessa: Yeah, I don’t know if there’s a split. I will say for research that other people ask for, the effort and energy to advocate for it is way less. Like if you’re being pulled into something then people are organically just going to show up more, be interested in the findings. So I think there’s a lot more effort and design up front that goes into really making sure that what you do learn is strategically still a place that we can go. Because I think that’s the other thing you want to make sure of, right, is you don’t want to pursue a project and either everyone already knows the information or people didn’t know the information but were already locked into whatever plan when it comes to the product.

    So I think the up-front work is really important and takes a lot more energy as well as the communication of findings because now you have to create your own forums and audience whereas other work just is very organic. Like yeah, you go to the product manager who owns this feature.

    Steve: I’m hearing in your answer, and now the second time you’ve kind of explained this, that my question was flawed. My question was kind of about research and then communication, but you’re really emphasizing, even when it’s a study or especially when it’s a study of your own sort of discovery or advocacy, there’s that upfront due diligence. There’s doing the research and there’s the communicating. But you’re doing all those three pieces and I guess just to repeat what you’re saying, when it’s a rogue-style project, the upfront and the after part are significantly more effort.

    Vanessa: Yeah, I would say so. And I’ve encouraged my team to do this across any project, but I think we also do more work to design the artifact ahead of time, to think about how this information could be presented, or even just have something tangible for someone to react to before we kind of double down. Again, for foundational work it might be a much higher n than, like, testing a little feature, and so we don’t want to be 15-plus interviews in and realize, oh, this is not going to work out or people already know this.

    Steve: So what kinds of things — you’re talking about what kind of output you create as a result of the research? What kinds of things might your team create as output?

    Vanessa: Correct. Yeah, I think a lot of the traditional artifacts, so we do create a lot of decks. I think beyond decks we work in Figma a lot, so we try to prototype different styles of outputs. So sometimes it might get really visual and we’re trying to bring in more graphs and charts, or if we’re doing persona work, designing templates and stuff ahead of time to think about what data do we want to collect along the way. I mentioned we also have a service designer on the team, so sometimes that is a full-blown journey map where we’re bringing in all of the layers. We’re bringing in the product touchpoints and the external, people touchpoints, like when they’re talking to a salesperson or an AE, account executive.

    So I think we try to remain pretty open, sometimes try to get creative in terms of, you know, what the medium should be. Should it be pre-recorded? Should it be video? But I think, yeah, we really try to do some like content design ahead of time to also use that as part of the conversations when we shop around a project.

    Steve: So to clarify, this is happening in the upfront portion of the project, so you’re thinking about what the output might be in order to have these conversations with people about this research, which you have identified as important, but they haven’t asked for. So what are you showing them? These are sort of — these are samples of what a deliverable might contain, but there’s no — there’s nothing in it because you haven’t done the research yet.

    Vanessa: Yeah, either there’s nothing in it and it’s essentially like a template just to show like placeholders of the type of information we can show. But sometimes again we’re not starting from scratch. Like the reason we’re initiating this project is because we kind of have like sprinkled evidence popping up across the team, but we’ve never deliberately investigated this area. And so part of it is almost like an early synthesis of like we have a hunch because we saw this, we saw this.

    And so sometimes the story just emerges from that already where at high level we can kind of come with an outline of like, okay, these are the group of people. We think they’re experiencing this problem. So of course that’s not like how you do research all of the time, but I think it could be very effective to do that some of the time, especially in an environment where it’s like people are looking to make decisions. If our research is just like, oh, that was cool and people move on, then we didn’t really do our job that well.

    Steve: So it’s interesting. It’s almost like a — I don’t know, like a trailer for a movie or something. Like, here’s the decision you’re going to be able to make. Here’s where we are today. If we do this, then we can fill in these gaps. And so you’re not selling them research. You’re selling them the thing that they care about, the decision they need to make.

    Vanessa: Correct.

    Steve: Way back when I said I wanted to go back to something and then we got into this interesting thread, you said that part of what your team does is, you know, help other cross-functional partners and folks do their own research. And I think you said this is kind of a hot topic right now. But yeah, what’s been effective for you? What have you seen work well?

    Vanessa: That’s a good question because sometimes it still scares me too. I think the things that have worked are especially stakeholders that have previously either had experience with research or have been close to previous work that we’ve done, tend to already have a better sense of like how to go about it. We’ve created templates just around like here’s how to write a study guide. And the ideal version of that would be they take a stab at it and then bring it to something like an office hours or talk to a researcher they know so that we can at least coach them a little bit on like are these the right questions and like is that the right set of people that you need to talk to? How do you recruit properly? Things like that.

    And so I think if all of the setup goes well, I sometimes have like less concern about the actual sessions themselves if they’re able to just follow a script. I think the analysis piece then again becomes a little bit risky in terms of how people analyze, synthesize information. And I think we could probably do more there internally to like guide people through that process because I think it’s just yeah, challenging if you haven’t experienced it often enough or I’ve seen people overgeneralize or grab that quote that supports exactly what they want to push or need to do and kind of ignore the rest. So that’s why it’s definitely mixed, my feelings towards it. But at the same time, I have seen it also be very useful when it’s something we just can’t support or take on and they do have kind of the proper resources and guidance to get to some of those answers themselves.

    Steve: If size and availability of your team weren’t an issue, what would you do to help, you know, people who do research be effective with analysis and synthesis?

    Vanessa: Oh, good question. I think it would probably be in the form of a workshop or just live debriefing. Multiple jobs ago, we used to do a lot of that in groups, like in person with sticky notes, and really go deep on that front. I think now that everyone’s remote, we try to do that more in digital tools. But I think having more guidance around how to approach that, or even providing them with a framework or a plan on what notes do you want to capture, how do you properly set that up and capture them, especially the more participants you have. I think that’s where I’ve seen lots of people waste lots of time because they didn’t have a plan. They just went through, captured all the data, and then they’re kind of in a now-what situation where it’s like, do I rewatch 20 videos? I’m like, please don’t do that.

    And so I don’t know. I’ve joked with my team this week that I’ve equated research to party planning, but it’s like the more you plan up front on all the things that are going to happen, and have a schedule around it and have a plan for how to get to the end, I think the smoother it goes.

    Steve: I really like party planning as a kind of a framing. And so yeah, analysis and synthesis is part of the party plan.

    Vanessa: Yep.

    Steve: I think it’s interesting to hear you, and this is not meant to be presented as a disagreement or not, just a reflection, that when you think about kind of how to help people who are not familiar with analysis and synthesis move forward with that, you’re talking about the, I don’t know, the tools and tactics of managing that data. That’s I think what I heard you kind of emphasize. I like that because those are things that can be described and be enacted. But there is this part to me of analysis and synthesis that feels, it scares me too, even though I do train people to do this, it still scares me because it feels like it’s creative. I’m even, like, sheepishly using that word in speaking to you, but it feels like it’s creative and a little bit magic and a little bit hard to describe.

    Vanessa: 100%, which is why I still say it’s also the scariest part for me to let people go on that part of the journey. So I don’t know. Like when I was first entering research, I think a lot of things helped structure my thinking, at least, around the difference between hearing something verbatim versus having your own interpretation of it and kind of going back and trying to be a little bit more rigorous around what did you hear, what does it mean, and how do you navigate that? But I love hearing you talk about that it is a creative process because I think, yes, once all of the kind of analysis is done and the data is on the page, understanding what’s important, what story you want to tell, and how to put it all back together is, I think, an extremely creative process because regurgitating everything you heard is not going to work. Like you have to make it compelling and you have to find a point.

    And I actually think that’s the thing that most junior folks struggle with: they want to share everything they learned. And it’s like, what are your top three things? Because in reality, we can probably only action one to two, if anything. So I’m a huge fan of pushing for prescriptive findings and having ideas around what should happen next.

    Steve: Can you explain prescriptive findings?

    Vanessa: Yeah. I think so. Rather than having a recommendation like XYZ needs to be clearer on this page or like users were confused by X, which is a little bit more just like describing what happened and where the problem lies, telling the rest of the team, here’s how I think we should solve it. Like, are you saying you should write something differently? Are you saying a human needs to help them? Are we going to introduce AI because that makes it better for them? Like, I think there’s so many different ways to kind of get into solutioning. And I mean, maybe one could argue by just presenting the problem, you’re leaving that solutioning piece open.

    And in that case, I would say follow up with a workshop and get a room full of people together to do that in a more deliberate way. But if you’re limiting yourself to a readout and you have reasonable confidence that you actually know the next step forward, I ask people to just say it rather than hold back and just try to be more neutral.

    Steve: Yeah.

    Vanessa: Yeah, that’s kind of where I land on, being prescriptive. Curious if that resonates with you.

    Steve: Yeah.

    Vanessa: [Laughter]

    Steve: I mean, I think that, you know, you’re kind of getting at what are some of the hot topics in research, and I think “do researchers give recommendations” is a hot topic. At least for me, it feels like something I’m sensitive about. But you’re providing some nuance here and you’re kind of giving some of the it-depends. And I think I’m hearing you saying, like, if it’s clear what the thing is, what the solution is, right?

    Then, then yes, there’s no reason to hold back on that. I think where I, if I were to think about my own practice, I agree with you. “People are confused by X” is just a description. But, and again, we’re speaking very much in generalities here, I think what I try to do is say, people are confused by X because they understand this word to mean this and you’re using it in a way that means that.

    Vanessa: Yeah.

    Steve: So, you know, you can go further and say, like, if you want people to understand this, you know, the language has to line up. And I think that’s very different than change the label on this button from A to B so people know how to use X, which I tend to not do. And this is also, I’m a consultant; I don’t have the same relationship with the product team that your folks do.

    Vanessa: Yeah.

    Steve: And I might be more likely to like, to like, give them the whole to decompose it a lot so that they understand, and maybe what to do is obvious. But I think I don’t ever know enough to say, I shouldn’t say ever I often don’t know enough or what all the possible solutions are what the roots of that are. And I want to sort of hand them. You know that thing about helping somebody telling somebody what to decide but making it feel like they’re the ones that are deciding it.

    Vanessa: Yeah.

    Steve: You know, that’s easier for a consultant, or that’s maybe more appropriate for an outside person than for an in-house person.

    Vanessa: I love that, though. I think I’ve actually done a mix. Like, back when I was still doing a lot of research myself, I would probably initially land at kind of the fidelity that you’re talking about, of like being very specific about what’s wrong and how they’re misunderstanding it, and even the prevalence of the problem and just really making it concrete of, where are people getting hung up? And then I would just use a little idea, you know, bubble icon, and then separate that and be like, “What idea is this?” So that I’m still getting it in there, but they can at least anchor on, “Okay, this is the finding, and then I’m taking it one step further, but they’re not so tightly coupled.” Because I think if I would jump straight into, “Hey, you need to do this, this, this, and this,” and they don’t have the context behind it, one, it loses credibility, and then two, if I was off, now they don’t have the anchoring problem to actually address it differently. So I kind of like the combo.

    Steve: I really like putting the recommendation or the suggestion in a separate area, the idea bulb or the light bulb kind of callout. Because I think sometimes what I'm trying to activate is to get them thinking about that transition between research and action. And, you know, I love your example of the workshop. We can't always get the workshop, so can we at least put this forward as a for-instance? For instance, if we did this, we're not saying this is the only way, but this is a way that you could use the tools of design or copy or whatever to solve this problem. I mean, I had an experience a year or so ago where I put some for-instances into something for a client because I really wanted them to riff on what was possible, because I just didn't know what was possible.
    And I was really torn, because, you know, as a researcher, if somebody builds something based on what you've recommended, that's a tremendous win. So I was really proud of that. But also, it wasn't a recommendation to go do X; it was more like, start thinking about solutions in this area. So I had this sort of mixed reaction in that experience.

    Vanessa: That’s understandable, but I agree. I think that’s always like, counted as a success of like, “Hey, my idea made it into the product.” And ideally, due to our background and our proximity to customers, it’s like a pretty legitimate idea. It’s not like we just pulled it out of thin air. So I like that.

    Steve: Maybe we'll switch topics a little bit. You were just talking about some of the ways that you practiced in previous roles. Now you're in a leadership position, thinking about driving and advocating for the practice. But if we could rewind, however far back we want to go, can you talk a little bit about what your path was to get to the role that you're in now?

    Vanessa: Yeah, my path was definitely not a conventional one, but I think we are in a field where that is often the case. So I think my first real entry into, I guess, being in a tech environment was after I quit my personal training job. I had an anthropology degree from Berkeley as my undergrad degree, and I did not know what to do with that. I would go to job interviews, and they barely knew what that meant. Someone asked me if that means studying dinosaurs, and I was like, "Not quite. It's actually the study of people and cultures." So I struggled quite a bit after college and kind of had this passion for fitness on the outside, was a personal trainer, and did not love the working hours and how that schedule tends to play out, because you have to train when everyone else is not working.

    And so I found a temp agency that had this office manager job at a startup called BrightRoll, which was acquired by Yahoo within a few months of me being hired. So that was my first exposure to talking to people that were working in product and UX and marketing, because I had ambitions of other things but I just did not know what I wanted to do. And then, fast forward, I actually worked at another company that got acquired by Capital One, and I was still kind of in an office manager role and then shifted to an administrative role. So I ended up becoming the executive assistant to a product VP, and that was really where I figured out that UX is what I wanted to do, because I essentially told them I wanted to learn the business. I wanted to be close to the work that was happening, and they had a very heavy design thinking culture. So I was participating in trainings and design sprints and home visits, just really being immersed in UX, and decided that research felt a little bit more aligned with my undergrad experience in anthropology.

    And so about a year after doing that, still at Capital One, I was able to connect with a few folks in UX and get a research operations job. I wanted to jump straight to research, but the head of the department said you need a master's degree for that. So I ended up doing a lot of research operations, which gave me a lot of proximity to research, while pursuing my master's at the same time. I was, you know, doing a lot of the recruiting, writing screeners, managing our tools, procuring new tools. So really it was like all of the research; I almost felt like a research assistant in many ways while I was getting my master's in human factors and information design. And then from there, once I did have my degree, I formally became a researcher at Capital One. There I was working a little bit on the mobile app and then on the website, CapitalOne.com. So that was a really fun time in my career. And then when I ended up at Google, I switched very quickly from being an IC to first managing a few contractors, to managing a qual team, to also managing folks that were quant researchers, to now my role expanding even beyond research.

    So it all kind of fell in place very, very quickly as I yeah, kind of navigated my career and didn’t spend as many years doing IC work as I expected. Like a lot of the time when I was at Google I was kind of in a hybrid role so I was still conducting my own research, managing a few vendors that were doing kind of consulting projects for us and then had a team.

    Steve: How did you learn the managing part? You kind of moved into that, and that's a different skill set than what you got your master's degree in, I'm assuming. How did you learn that?

    Vanessa: Yeah, I think it's a very, very different skill set. But I think I gravitated towards that. Even when I was younger, I think leadership came naturally to me. In high school I was president of this club or captain of the volleyball team, so I always kind of gravitated towards that, and I was a tutor in high school for many, many years. So I think there was always a spirit of helping people, coaching, collaborating that came naturally to me, and then on top of that, the on-the-job training of just understanding how to navigate different situations.

    I mean, I don't think I could have ever imagined the situations you find yourself in, because of course there's performance-related stuff, there's stakeholder-related stuff, politics, and then there's personal stuff, and it all kind of comes together. And then there's the actual looking at your team as a whole: are we actually operating and performing in a way that is helpful to the individual but also to the company? So I feel like there's a lot of different variables to juggle, but I would say I kind of picked it up from observing my former managers, realizing what wasn't working for me, and then workshops, trainings, and on-the-job experience.

    Steve: How do you distinguish between management and leadership? We're sort of using both those words in this conversation. I wonder, do you have a definition or an explanation for either?

    Vanessa: I mean, I guess I tend to agree with the ones you typically hear, where it's like anyone can lead or be a leader, and then management is described more as a distinct set of responsibilities. So right now I'm using them pretty interchangeably, but I know there are formal distinctions on how that works.

    Steve: When you and I talked in anticipation of having this conversation, one thing that you brought up was both an inclusive research practice and things like inclusive hiring. I guess inclusive is a big term here, but it seemed like you had some experience with that and some perspective on that, not just at Twilio but throughout your career, and I'd love to hear you illuminate some of what you've done and how you approach it.

    Vanessa: Yeah, I'm glad we're touching on this topic. I think inclusive research, an inclusive workplace, all the things, has always been really important to me. Unfortunately, I think in 2024 we still see lots of forms of discrimination, whether intentional or not. I think it's a big conversation again now that all of the companies are thinking about AI: how are these data sets, like, where are they coming from, how are models being trained? And a lot of it is always looking back. So throughout my career I have been very mindful of the folks that are marginalized, whether it's in the workplace or as consumers of certain products. Sometimes things are not designed with women in mind, or people of color in mind, or certain disabilities are not thought about, which often leads to problems for the very people these products are supposed to be built for. But then also in the workplace, I have seen these things manifest, and I think these things go hand in hand for people from different backgrounds or different walks of life. There have been so many studies published now on how having a more diverse team actually leads to more creativity and better solutions, because you're able to see things from different angles, and bringing all those perspectives together creates a better outcome. So yeah, that's just been something that has been important to me throughout my career, that I have been mindful of with any of the teams I have been on: that we take that to heart and also design our research to reflect that.

    Steve: So when you say design your research, that makes me think about sampling but that may be a very small view on what you’re talking about, how does it show up when designing a project?

    Vanessa: Yeah, I definitely think sampling is a big part of it. Where tech companies are located, it's easy to do convenience sampling and just say, you know, we will do something here in a lab. Luckily, remote research is more popular now, but back in the day a lot of physical labs were being used to, you know, test an app or show a new site that you just don't want out there on the internet. That often meant you were sampling a geographic area where the income might be higher, or there's a certain distribution, whether it's a gender ratio or a race ratio, that might not reflect the full population. I think being very U.S.-centric is often a thing with companies and the research that we do, and there are a lot of additional hurdles sometimes to being inclusive. Like, if you want to do a study that is conducted in people's native language and you're an international company, there go many dollars for recruiting and translation, and kind of the scale that it now takes, rather than doing something that's local in your area. So that's kind of what I mean by the design, but then it can also go all the way down to who is actually conducting the research. Again, if it has to be a native speaker, or if it's better with sensitive populations who might have mistrust when it comes to testing medical products or things like that, it all requires a different level of planning and consideration, in order to also make the participants feel comfortable and to collect that data in a way that's respectful. And yeah, I mean, I think I could go on and on. That's kind of what comes to mind for me and what I have pushed for, and I think for the most part there hasn't been any pushback on whether it's the right thing to do per se, but it really comes down to the timing constraints and the financial constraints that push a lot of companies to say, you know what? Not right now.
    So I have been a pretty strong advocate for, when it makes sense, let's take that time and broaden our scope in order to make sure it works for the broader set of people, not just the convenient sample.

    Steve: And you made a point about how diverse teams are more creative, and then even thinking about who does the research.

    Vanessa: Yeah, definitely. I think right now I'm in a context where we're actually B2B, and so I would say some of those factors have been a little bit less prominent for us; we tend to go for the people who are using the product and try to expand our sample to that. But I'm sure that's your experience too with B2B: often you just have a way smaller population than in consumer, where you often have hundreds of thousands of people you can contact. In B2B you might have a list of 5,000, and by the time you add a few criteria, now you have like 30 people that qualify. So that makes it a little tougher. But I think in the past, when I've worked in SMB spaces or consumer-type products, that has been more of a consideration, and I think that's also where vendors have sometimes come in for us, where maybe we're not the appropriate set of people to have this conversation, or we might miss some of the nuance that exists in this conversation. When I was at Google, we definitely would get outside help to round out some of those conversations, or share the interviews with a native Spanish speaker if we didn't have that on the team.

    Steve: Does this come into how you approach hiring?

    Vanessa: That's a good question, because I think with hiring I've not deliberately been in a situation where I'm saying we need this race or this gender or this language, because I think that can get very tricky and also potentially lead to discrimination in other ways. So hiring has largely been merit-based. But the part of hiring that I still find problematic is that people often talk about pipeline issues and not being able to source candidates from different backgrounds, and I often find that there is actually something we can do about that. I think that's where I've focused more effort: if I'm seeing candidates come through that are a little bit more narrow in, let's say, education, like they're all coming from a set of Ivy League schools, or they're located in a specific part of the country, then I have worked with recruiting to say, hey, can we reach different hiring pools? Do you have access to different communities where you can plug these jobs to diversify the pipeline, so we can make sure that candidates that have different levels of education or different backgrounds can at least have their resumes reviewed, be interviewed, things like that?

    Steve: You know, in the time that you've spent in your career so far, have we made progress on these issues? How do you contrast what you see in 2024? Like you said, discrimination still hasn't gone away.

    Vanessa: Yeah. It's hard to say. I'm not sure if we've made progress or not. I think during the pandemic, when George Floyd was murdered, there was definitely a heightened interest from companies in thinking about a lot of these topics. So one way that worked out, at a lot of companies including Google, and I was actually participating in some of the hiring initiatives, was to create product inclusion-specific roles and really carve out space to say: we have people in the company who are trained and qualified to deal with maybe more sensitive populations, and who are also actively putting programs together to teach the broader company how to make sure your product, especially when it comes to language or anything AI-related or anything visual, is not discriminating, like, oh, the photos can't pick up on darker skin, or they can't understand people with a certain dialect. How can we get that education out there? So I have seen a push in that. There were more consulting firms, more job postings that were specifically around product inclusion and bringing that academic knowledge into the tech field. But as far as broader hiring trends, I don't know the data well enough across the full population, but I don't know if I have seen an uptick per se.

    Steve: So maybe one last area, to loop back to something else that we were talking about. You described going from personal trainer to a temp job that gave you exposure to things that led you, sort of accidentally, into research. In some ways it seems circumstantial or opportunistic; it's hard to plan for creating the conditions for that. Is there any lesson or advice for people that are listening, or that you come across in your travels, about bridging from one world into the other? To me they seem like very different worlds. For you, I think it was a certain amount of happenstance and having the right lens on it. But does this lead to any guidance or advice for other people, from your experience?

    Vanessa: I think for many years I had anxiety and regret around not starting my career in the field that I'm in sooner, because I felt very, very lost stumbling through all of the different fields and roles, and only in hindsight do the dots connect. As you were saying, they seem very, very different, but I think I'm better at what I do now because I learned lessons in all of the different jobs. Even something like, again, being a personal trainer or executive assistant. Being an executive assistant, I was able to sit in on more senior leadership meetings, and I picked up really early on short attention spans: how do you get your point across concisely, what do they care about? And I think that made me a better researcher right away, even as I was still learning the practice, because it taught me something about communication.

    Then even reaching back to personal training, I think that made me a better manager because I like to think of myself very much as a peer to my team, where we are thought partners. They come to me with things, I come to them with things, and because they are all so driven, we don’t really have a lot of issues where I just have to enforce things or tell anyone to do anything.

    But circling back to the personal training piece, I think that really put me in a headspace of okay, someone has a goal and they’re struggling with this, or they need support through this. How do you tap into the human psychology of – they want to get from A to B, we want to make it as sustainable as possible, do something that hopefully they enjoy enough that they can do it on their own.

    And so really just getting into that mindset of how do I collaborate with another person and find common ways to address a problem and align with them so it doesn’t feel like I’m pushing you or forcing you, we’re kind of going towards this goal together. I think all of those little pieces along the way just shaped how I interact with people and I think has made me better at what I do today. Maybe just know that it’s all connected somehow.

    Steve: Yeah. Yeah, I want to normalize part of what you said, at least. I also had regrets for not starting my career earlier; I felt like I was late. But that was long enough ago that many people I know didn't know me then, or wouldn't know that about me. You know, I had a circuitous path to doing what I do now. And I think you're right that we're all products of all of our experiences. That's one of the things I like about research, and I think you're right to extend it towards management and everything: there are a lot of different paths, and then there are a lot of different ways of being. If I wasn't who I was, I wouldn't have gone on those paths, and I wouldn't be able to be all the weird mixes of things that make me whatever kind of researcher or leader, all the things that you're kind of bringing up. I've always enjoyed that about our field, that we all come from different places. Even over the generations, I think there's still an interesting mix of that, and it means, like, oh, the next researcher that you meet has done something that you didn't know about, and you get to work with them, and there's something about personal training that shows up in an analysis or in a planning meeting or something, and you get all these great stories from people. So I enjoy that. But yeah, I started off by talking about myself here: I know what that regret is like, that I didn't come into it out of the gate early on and had to play catch-up in a lot of ways too.

    And maybe that’s a great place to kind of wrap up our conversation. Thank you so much for taking the time, sharing your own experiences and your own perspectives on the work that you’ve been doing all the way along and yeah it’s great to get to chat with you. Thank you.

    Vanessa: Thank you so much for having me. It's really an honor. And when I was telling folks on my team that Dollars to Donuts was coming back, everyone was really excited, because a lot of us have your books and have listened to you and learned from you. So I really appreciate being here.

    Steve: All right, well, great. I hope they enjoy listening to you talk about all the great work that you’re doing.

    Vanessa: Thanks.

    Steve: Thank you. All right, that's it for today. I really appreciate you listening to this episode. If you like Dollars to Donuts, recommend it to a friend or colleague, or post about it on the social medias. You can find Dollars to Donuts in most of the places that you find podcasts. Review the show on Apple Podcasts, and go to portigal.com/podcast to find all the episodes, including show notes and transcripts. Our theme music is by Bruce Todd.

    The post 38. Vanessa Whatley of Twilio first appeared on Portigal Consulting.
    28 February 2024, 5:54 pm
  • 1 hour 7 minutes
    37. Nizar Saqqar of Snowflake

    This episode of Dollars to Donuts features my interview with Nizar Saqqar, the Head of User Research at Snowflake.

    For a domain that takes a lot of pride in empathy and how we can represent the end user, there's a component that sometimes gets overshadowed, which is empathy with cross-functional partners. In every domain, product, design, research, there are people that are better at their job than others. I do believe that everybody comes from a good place. Everybody's trying to do their best work. And if we have some empathy for what their constraints are, what they're going through, what their success criteria are, how they're being measured and what pressures they're under, it makes it much, much easier for them to want to seek the help of a researcher and say, "Help me get out of this. Let's work together, and let me use research for those goals that are shared." – Nizar Saqqar

    Show Links

    Help other people find Dollars to Donuts by leaving a review on Apple Podcasts.

    Transcript

    Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization. Today’s guest, Nizar Saqqar, actually brings this up in our conversation, but I’m going to remind you myself that there is a new edition of my classic book, Interviewing Users. It’s now available 10 years after the first edition came out. Of course I’m biased, but I highly recommend it. And hey, it makes a great gift for the researcher or research-adjacent person in your life, or persons. If you haven’t got the second edition yet, you can use the offer code “donuts” that’s D-O-N-U-T-S for a limited time to get a 10% discount from Rosenfeld Media.

    But now, let’s get to my conversation with Nizar Saqqar. He’s the head of user research at Snowflake. Well, Nizar, thank you for coming on the podcast. It’s great to get to chat with you.

    Nizar Saqqar: Thank you for having me. I’m really excited for it.

    Steve: Let’s start with an introduction from you. You want to say a little bit about your role, your context, anything to kind of get us rolling? And we’ll go from there.

    Nizar: Absolutely. I'm Nizar. I lead user research at Snowflake. I've been here for about three years. It's been a pretty exciting adventure. When I started, I was the first researcher at a company that had been around for 10 years, really doubling down on showcasing the impact of research and why we need to scale. And we've been scaling nonstop in today's environment, which has been a pretty exciting challenge. It comes with the fun of it, but it comes with the challenges as well, and I think there's more to come. To take a step back and try to simplify it as much as I can...

    Steve: What kind of company is Snowflake?

    Nizar: Snowflake enables organizations to store huge amounts of data from many sources in one place. So it empowers organizations to make the most out of that data. And as we’ve scaled the company, we’re continuing to push the envelope on platform-level offerings that try to enable native app developers to do the development of data applications.

    Steve: What are some examples or vertical scenarios that we might know about?

    Nizar: So the simplification of it is: get your data in one place, make the most out of it. Snowflake will help companies do that as efficiently as possible, with as many use cases as we can. It's definitely not in the day-to-day conversation.

    Steve: What are some examples of what Snowflake is?

    Nizar: It's not a B2C product, but at the core of it, it really starts with data warehousing: the data engineer brings in all of the data from many different places, many different sources, into one place for storage, and then makes it usable for other users, maybe the data analyst or the data scientist, who makes something happen out of it as an outcome. And as I mentioned earlier, with building the native app development framework, it's been exciting to see all of the, think of them as the more classical kind of software developers, who are now coming into our ecosystem to get closer to the data. So it's a pretty complex ecosystem.

    We also have a marketplace that then kind of introduces the dynamic of a provider and a consumer and the business decision makers who are coming in for that transaction. So it’s a pretty intense ecosystem that magically all connects into just making the most out of your data.

    Steve: So when you came in as a researcher, what did you observe about how this company was thinking about its users or thinking about what it knew or didn’t know? Do you remember that early process?

    Nizar: Yeah, and to be honest, that process started before I even started. Even in the interview process, I really wanted to be sure that the company was thinking a lot about their users, thinking a lot about how research can integrate, thinking about challenging some of the perceptions of what research can bring to the table, and just having some of these tough conversations even before. And I will say that where we are definitely lucky is that Snowflake holds doing what we can to make it as great a product for our users as possible as a core value of the company.

    The interesting thing with that is that it brings in a lot of data points from a lot of places. Now you have a lot of user perspectives from the sales team, directly from the product team. Then you have all of the metrics and dashboards that you're following. So you actually get a lot of data, a lot of points, which might actually make it harder for the product teams to action on or prioritize.

    So as I started, I kind of wanted to first take a moment to better understand the domain, really kind of find my footing, know what’s going on, build the right relationships and start with something that’s very low-hanging fruit of saying, “Hey, let me just build credibility. Let me just come in and say I can add value very quickly and then scale that up.” It’s been interesting to see how the role has continued to evolve since that day one. It really started off with, we’ve been here for 10 years. This guy is here. And it’s really kind of evolving into user research is just a critical component of how we think about product development.

    But it’s taken many phases that we’ve had to adapt as we continue to go, starting off with the very tactical, then zooming out into something that perhaps is more strategic, then shifting focus into our hiring strategy and our hiring rubrics and how we interview, going all the way into what we define as success criteria, performance evaluation and how we integrate research into the overall product process. And it just doesn’t stop. So the role itself has been changing over the past three years and I perceive it to continue to do that.

    Steve: Can you give an example of a tactical, sort of quick win that you would approach kind of coming in, in those early days?

    Nizar: Yeah, absolutely. And I think for me it really starts with: what is something that is tactical enough, close enough to wanting to launch, where there's enough resourcing, but there's some level of disagreement in the organization about how to proceed? And seeing if there's an appetite to make research kind of be a tiebreaker of sorts, or really find the right balance between the two. I found that it's very rare that option A or option B wins outright; there are always components of each that resonate, and when you bring them together, you find something that really resonates in the actual flows.

    And if I remember correctly, one of my very, very, very early contributions was kind of around something as simple as a concept evaluation. And I think those are the methods that are just going back to the basics and some people take for granted, but you’re just coming in and you’re saying, “Let’s just test it and see what’s resonating and what’s not.” And coming up with some actionable steps that align multiple teams that might have dependencies on each other to find the solution that may not make everyone happy, but at least everyone is aligned that, “Okay, this seems to align on a path forward from a user lens.” So then it just continues to evolve.

    I was talking recently about how coming off the bat, just seeing that there’s an overwhelmed product manager who says, “Hey, I have 20 features that I’m asked to ship.” And my role there was to come in and say, “Let me help you do a MaxDiff survey to just make a case for some things that you should actually deprioritize so you can make progress towards some of the top features that you want to run through.” And I think that was part of the evolution of, “Okay, we could use research for many different use cases and in different areas where we could integrate with the product roadmap.”

    Steve: So I think it’s super interesting that you’re using the interview process, I guess, to understand the context that you’d be coming into.

    Yeah, I’m wondering, and I don’t know, I’m going to ask it like a binary, obviously it’s not, but, you know, how much of a mandate were you given versus how much you were trying to figure out what the needs were and, you know, make recommendations appropriately? That’s a terribly leading question.

    Nizar: It’s not, though. It’s kind of interesting, because there’s– when someone opens a position, when somebody asks for a headcount, for the most part, they have an idea of what they’re looking for. They have an idea of what they think the success criteria is. In my case, I was hired by a head of design who had an idea that I could help elevate the design team. That was kind of like the primary premise. And the interview process, that comes out, and it’s a really exciting thing. And a head of design is really excited to be like, “Now I get to finally get to support design with research.” I started digging into the appetite of, “But how do we kind of expand beyond the design?” If we’re to look at the pie and we’re to say, “Instead of making that piece more efficient, how do we just make the pie bigger? How do we get the design team holistically more involved and have that impact from earlier stages as well?”

    So look beyond the design research component and don’t worry about it. I’ll make sure you have a good story for where your organization is growing in terms of impact beyond the pixels. And I think that was a really good back and forth that early on showcased that there’s a lot of action and appetite for, “Hey, if you can define something outside of what I have in mind that you perceive could be even more value-adding for the organization, that’s what we’re optimizing for.” And I think that was a good start of saying, “Okay, I won’t be in a situation where somebody comes in and says, ‘This is what you need to do.'”

    I do hear a lot of stories of, “The researcher comes in and all you can do is one-week sprint and stuff, just do a study every week, and it’s non-negotiable.” And it was pretty important to gauge that, “Hey, can we just align on value for users and value for the company and value for the team as the criteria, and let me do what I need to do without a specific framework of how I should be operating within those objectives?”

    Steve: I love the phrase impact beyond the pixels. That’s like, that’s a pull quote or that should be a title of your next talk. So that sets you up then to find that overwhelmed PM. And if I understand it correctly, you’re kind of saying to them, like, did you know, hey, this is a situation you’re in, like, here’s an approach that would help you. That’s kind of where my, maybe where my mandate versus discovery question comes from. It sounds like you are finding opportunities or finding places to have impact where that PM is not going to ask you, hey, can you do a MaxDiff survey? You’re coming in, seeing the situation saying, yeah, here’s a way that research can unblock you.

    Nizar: This was a fascinating story altogether, and there’s some more context behind it, which is kind of funny when you look at it. That PM was super excited before I started, messaged me before I started, started telling everybody that me joining is going to be a game changer, was the friendliest PM I’ve ever met when I started. And then back then, my manager was saying, “Hey, we think this is the most ambiguous thing. We need to redefine a roadmap. You need to put a really found– I think this is a really foundational research problem.” So when I went to the PM and I told him, “Hey, we can work together on this,” and I’m actually excited to team up, he actually said, “No, I’m not interested.” And I told him, “Let’s take a step back and let’s speak about why you’re not interested, what’s–just what’s on your mind. Let’s not talk about the research. Let’s talk about the problem you’re solving.” And his take was, from my experience with research in the past, a lot of the times it does take a lot of attention to keep up with all that’s happening, be part of the interviews, and then you come back with a lot of insights that I frankly don’t have resources to do anything with.

    So if you come to me and you say, “Here’s 10 things you need to build,” I’m just going to put it at the end of the JIRA board as items 21 through 30, and I’m not going to get to them. So the key learning for me back then was, okay, everybody perceives my role and how I can solve this problem very differently, and I really need to set some shared language and shared expectations of why I’m here. So that’s when I was like, “Hey, how about we do this? How about I just go into your JIRA board? I’m just going to steal the things that you have there. I don’t need you to be involved, and let’s make a case of why you don’t need to pursue all of these features at once. Let me do the heavy lifting. We’ll team up on it.” And then going back to my manager back then and saying, “I don’t think I need a multi-month, huge effort to start. Let’s just help him get out of the weeds for a bit and just align the expectations over what we can do with the research.”

    And in this case, it was a core example of research really used to de-scope, to de-prioritize, to say that not everything is equally important. At a high level, when you take one-off stories, they all come up as high needs, but are they all at the same level of importance when you look at our user needs and the business value that they bring? And that’s essentially what came out of that MaxDiff: a lot of these are way below when you compare them to what’s really bubbling up to the top. And how do I make that case without saying the rest isn’t there for a reason? Let’s make a case that with the limited engineering resources that we have, we can drive the most value if we really focus most of them on those very specific areas and get those to a place where our end users are really happy with the experience that we’re offering. And that was a different mindset. That was a different principle for that PM, where it’s like, I didn’t know that we could do that. I didn’t know we could do research that helps me tell a story to executives of why I shouldn’t be doing work or why I should say no to some of the work that’s coming up.
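    The MaxDiff scoring Nizar describes can be sketched with a simple best-minus-worst count, which is one common way such surveys are analyzed; this is a minimal illustration, and the feature names and helper function are hypothetical, not from the episode:

```python
# Count-based MaxDiff scoring sketch: each respondent sees a subset of
# features and picks the "best" and "worst". A feature's score is
# (# times chosen best - # times chosen worst) / # times shown.
# Feature names and responses below are made up for illustration.
from collections import Counter

def maxdiff_scores(responses):
    """responses: list of (shown_items, best_item, worst_item) tuples."""
    shown, best, worst = Counter(), Counter(), Counter()
    for items, b, w in responses:
        shown.update(items)   # every shown item gets an exposure count
        best[b] += 1
        worst[w] += 1
    return {item: (best[item] - worst[item]) / shown[item] for item in shown}

responses = [
    (["export", "dark mode", "sso", "audit log"], "sso", "dark mode"),
    (["export", "dark mode", "sso", "audit log"], "audit log", "dark mode"),
    (["export", "sso", "audit log", "dark mode"], "sso", "export"),
]
scores = maxdiff_scores(responses)
ranked = sorted(scores, key=scores.get, reverse=True)
# ranked orders features from "bubbling up to the top" to "way below"
```

    The point of the exercise is the ranked list: the features at the bottom are the candidates to de-prioritize, which is the story the PM could then take to executives.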

    And then that led to me really wanting to define some of the language used around why research exists and why research is at that company. And the wording that I tend to use, which may not apply for everyone, is around driving the allocation of limited resources into the most impactful efforts for our users or organization. And if we can have that be a shared language, a mandate for what research is optimizing for, it takes away some of the misconceptions here and there. And from there come some of the tactical things, like changing the title from UX researcher to user researcher, or changing some of the way that we present decks or reports or internal documentation. So there’s some tactical things that come with it, but at the core of it, it’s really linking research to the intersection of user value and organizational value.

    Steve: So that’s how to get somebody unstuck. When we get so overwhelmed, we can’t even see our way out of something: I don’t have time to do your solution, I’m just, you know, treading water here. So I love that aspect of the story, that you found an approach that also is about limited resources and that took that person where they were at. And I don’t hear you complaining about a stakeholder who wouldn’t commit to the project; you had a 10,000-foot view and could kind of see how you could add value. And still, I think you were fairly new to the organization at that point. Is that right?

    Nizar: I almost had no idea what was going on. I needed to rely on them to make sure that the items in my MaxDiff actually made sense. As I was mentioning earlier, I come from a B2C company. I come from a place where that ecosystem was extremely new to me. So of course there was some collaboration there, but I tried to keep it as lightweight as possible, making sure that I had the right pieces but without overwhelming them.

    And I love what you’re saying. I love the way you’re describing it. For a domain that takes a lot of pride and empathy and how we can represent the end user, there’s a component that sometimes gets overshadowed, which is the empathy with cross-functional partners. With every domain, product design, research, there’s people that are better at their job than others. Sure, for the most part, I do believe that everybody comes from a good place. Everybody’s trying to do their best work. And if we have some empathy to what their constraints, what they’re going through, what their success criteria is, to be honest, how they’re being measured and what pressures they’re under, it makes it much, much easier for them to want to seek the help of a researcher to say, “Help me get out of this. Let’s work together and let me use research for those goals that are shared.” And at the end of the day, it is still user-driven. It’s still based on the data that we’re getting and we’re able to drive direction. Finding ways to go with the flow while still having a strong perspective over what’s best for the users, rather than feeling that the role of research is always to be on the opposing end of cross-functional partners, could be a really powerful tool. And in these cases, all it leads to is that intersection of product impact and user impact, which I think is the end goal.

    Steve: It is a great story that this person was enthusiastic for you and reaching out and was excited about research. And when you think about what opposition is sometimes, I think it’s easy to sort of demonize someone and say, well, they don’t, they don’t get it. They don’t believe in research. They don’t like me, whatever kind of escalate. And here you started with the best possible out the gate dynamic with this other person. They were a fan. They were like welcoming you and they couldn’t wait. And still they had a concern. And so by identifying that and coming up with the right approach that suited all those constraints, you get to them the kind of impact that you’re looking to have.

    Nizar: And it kind of makes sense. I mean, if you really think about it, the definition of what a designer does, the definition of what an engineer does, for the most part, is pretty material. You finish your effort, you pass it on. Eventually, the thing that that designer or engineer touched is the product that you end up using. What that does is that for areas like research, there’s more fluidity in the perception of what you’re here for. So that fluidity could be a great thing or an awful thing, because at the end of the day, it opens up a lot of opportunity to set the expectation of why the researcher is here.

    But it takes a lot of work to get people to align, because they’re also basing it on their past experiences, their biases, whatever good experiences they’ve had, but also whatever bad experiences they’ve had with research. How precisely the work maps to what the end user sees opens up a lot of these gaps and unknowns, and I think plugging those holes and making sure that the narrative is clear around why the researcher is coming in is, to me, an opportunity. It’s not always the most fun process to cover some of these holes and make sure that there’s no gap in the perception of why the researcher is here.

    Steve: We started with the foundational, this tactical stuff that you’re doing, but maybe we can look at the whole arc of creating that more evolved understanding in the organization across all these folks about what research is here to do.

    Nizar: Building on that first story, I made my success criteria be less about the research output and more about how the research is being used, how the research is actually integrated directly into the roadmaps, and some of that lives on today with the team. You’ll see a lot more emphasis on how often your research is referenced in a cross-functional document than on, for example, what the quality of your report is, as some of the success criteria that we have.

    But taking it back to that initial journey, there’s a disadvantage and an advantage to being the only researcher back then. The disadvantage is that it’s overwhelming. There’s a lot to cover. The advantage is that it gives you the ability to say, “I’m not going to do it all. I can’t do it all.” And you get to pick and choose a bit in terms of where you foresee the most opportunity.

    And at the same time, you look at where the path of least resistance could exist, too. So if there was a huge problem where the resistance is pretty significant, the question that I need to ask myself is, is this where I want to continue proceeding? Should I continue butting heads to be included there, or should I go find the place that is a bit more welcoming to changing their processes and their approach, and just kind of use that as a case study? And once that case study lands, how are we showcasing it?

    And I’ve never been a fan of visibility for the sake of visibility, but especially in the earlier days of research, there’s a lot of advantages for visibility as a case study of how research could work to empower those around it. And that became key. And basically what happened right off the bat is we started hearing the sentence, “Well, I want research. Where’s my research? Why don’t I have a researcher?” And the demand for research support started coming more organically from the cross-functional teams. So it wasn’t on me to necessarily say, “I need people. I need to grow the team. I need to do this.” It wasn’t an ask from the research department to grow. It was an ask from cross-functional partners who have seen how much more effective and how much more efficient they could be with the appropriate level of research support. And that just kind of creates more of that shared language, the shared narrative of what the organization is looking to do with the research and work closely with it, but also what’s the success criteria for the research team.

    And as we started to scale, it became more and more important to set pretty stable goalposts to gauge what success looks like and what our objectives are, and to be very intentional about not falling into the trap of making research the end goal, where you’re out of the loop of what decisions are actually being made and you’re trying to do a one-size-fits-all approach to research that says, as you get more senior, your research gets more complex. I don’t believe that’s the best definition of researcher seniority; it’s really anchored in how we’re able to continue driving the product roadmap forward. Even then, there’s a lot of back and forth that goes into it.

    Steve: When you started to get these requests from people, we want research, where’s my research, the kinds of things that folks were hoping for or asking for, did that line up with what you would want to or hope to support them with?

    Nizar: When you’re starting the team from scratch, the default is actually, hey, since you only have one researcher, or there’s only two of you, you need to do some intake form to take everybody’s input and then try to cover as much as possible. And I put my foot down that I don’t think this works. I don’t think that’s the most effective way to do it, and I don’t think hiring a researcher and starting off with a service model that says, hey, you’re not part of the team, you’re an outsider who will do research and come back, will be the most effective way to drive meaningful change.

    So I approached it with a lot of support, and I approached it from a point of view of, let’s continue that as a proof of concept. Let me hire a researcher and embed them directly in one of our most critical teams, one that has significant strategic importance for the company as a whole, but also has a lot of open questions and a lot of things that could benefit from a researcher, and that’s all they’re working on.

    And of course, you get the pushback of, well, what about these other teams? And my take is, you’ve been operating without research for 10 years. We can wait a bit more, and let’s continue gauging how things go there. And that kind of starts it off with setting up the researcher for success and empowering them as a core member of the team, challenging the notion that they’re there to take requests or answer questions, and having them be able to actively predict where there will be blockers and how they can get their research ahead, maybe like three months ahead, six months ahead, to be able to actually be ready for the decisions when the actual time to make the call comes.

    So it’s essentially making sure that research is proactive rather than reactive. And that model worked. That model worked great with that team. We hired a phenomenal researcher; to this day, you’re always excited about the first hire being a phenomenal person on the team. And you start replicating that model with different teams, for areas that are also strategic to the company and have a lot of ambiguity, and that kind of becomes the framing in terms of unblocking, creating alignment, efficiency, and how we can just continue to scale from there.

    Steve: At the risk of oversimplifying, I guess I’m hearing in your answer where I was going wrong on my question, it’s the difference between this team needs a researcher and this team needs research. I was starting with this team needs research and you’re putting a researcher in there and they are figuring out the questions, being proactive, that’s very different than that service model intake form thing.

    Nizar: Yeah, correct, and generally I think teams that start off being user-centered at times think they’re doing research. There’s a lot of types of research, right? So sometimes, for us, you know, we’re a B2B company that has great relationships with our customers. So you think, hey, I’m talking to their sales engineer, or somebody called me for a meeting, like, I’m doing some level of research. I kind of have a take of, you know, I’m not here to gatekeep, go do your thing, but at the same time, I’m not here to democratize, I’m not here to, like, empower as many people to do research as possible.

    You know, my role is to be a cross-functional stakeholder, and I will jump in with what the problems are that we need to solve together, and let me find ways to deal with them. So I think in this specific case, there was always an acknowledgement of, hey, there’s stuff that we don’t know, and we need some form of research. The definition of what research is and how it’s going to be incorporated is the thing that needed to be tightened up a bit more, and then integrating the researcher with the framing of, this is a cross-functional partner, not a source of research, if that makes sense.

    I started changing the language around the expectations: when they’re invited, when people go to them, even what topics come up in their one-on-ones. So it’s less about, here are some questions that I want, and more about, hey, I’m struggling with this thing, and we talk through it. So they all tie in together. To my point earlier, there’s a little bit of the path of least resistance when you’re starting and you can pick one team. You can call it a lucky privilege of saying, okay, there’s a team that could be ready. I’m seeing conditions that are priming a researcher to be successful here. Let’s go with that model, with that team, and continue scaling from there.

    Steve: I mean, that reminds me of your interview process. You were looking for those conditions, to understand what that is before you start, and now as you grow your team, you keep looking for those conditions within different parts of the organization, to see where research could go next and have the most impact. Again, you’re really focused on the impact on the product and the experience and the company.

    Nizar: That’s a good summary, and it feeds into our interview process. I do, you know, we try our best to make our interview process as applicant-friendly as possible where it’s not convoluted, but at the same time, it covers a lot. So a key part of it is who joins the team and what’s their approach as well. We do tend to see, like, there’s a specific type of researcher that tends to do best, and usually we look at researchers who do have the depth and soundness of research methodology as kind of like a core expectation. But then they layer on top of it the user-centric process and thinking, you know, when do you integrate at different stages of product development?

    You start seeing the kind of the business sense of really wanting to be integrated deeply with the team and solving the problem at heart rather than solving the open question. And then cross-functional collaboration as a core area. I do think that every researcher needs to fully understand what resources the team is working with, whether it be engineering, design, any other blockers, to be able to come forward with the most effective size of recommendation.

    And then we always have that overarching umbrella of leadership and teamwork, really looking for people who have a growth mindset, who are looking to help others succeed, who don’t necessarily see it as, like, their world and their thing, but are really collectively looking for everybody to succeed together, which I think has been pretty key as we’ve scaled the culture of our team into a team that’s pretty collaborative, a team that’s looking to help each other, and a team where people aren’t competing. And there’s no incentive for people on the team to compete. There’s actually an incentive for them to make each other better and learn from each other. So that’s been an exciting part of scaling from a cultural perspective within the research team.

    Steve: I want to ask you to clarify, you used the phrase the problem at heart versus the open question.

    Nizar: Absolutely.

    Steve: Can you explain what that looks like for, what does that mean for any particular problem?

    Nizar: One thing I’ve become really sensitive to, maybe too much so, is when I see kind of a research plan that says our objective is to answer these five or six questions, and my take is that’s actually a step removed from what you’re going to do with the answers that you’re going to get. So I like to start off with, what’s the perceived outcome? What’s the perceived objective? What are you looking to learn, and why, in terms of what’s actionable? And then take that a step backwards and say, okay, to get to that effectively, let’s now go into what questions we need to ask. And based on that, which method is the most efficient and appropriate for what we’re trying to accomplish?

    What I’ve seen a lot in the past is even your stakeholders think they’re asking you the right questions. How many people have been asked, can you create personas? Can you tell me the different types of users? And a researcher goes off, does this for a month or two, and then they come back and nobody knows how to use them. And to me, that’s the problem that I’m trying to avoid as much as possible and just saying, okay, you want personas. What are you going to do with them? What’s the decision you’re trying to make? And often coming to the conclusion that you don’t need that at all. What you need is something much more simplified. Or we could actually get a pulse check to start getting you some signal of the answers that you’re looking for that will help with that decision-making process. And then we can decide when to iterate or if it’s necessary to iterate.

    With open questions, I find that there’s sometimes the danger of over-scoping research efforts for what you’re trying to do with them. It’s just that outcome where you show up with a deck that has 100 slides, but the team can only act on the first two. And so my question becomes, was this the best use of the researcher’s time, versus trying to focus on those first two slides and then connecting that to a longer-term program that we can create follow-ups on as we continue to learn throughout the process? So in a way, it’s forced efficiency and early hypotheses of how we connect to the impact before even starting off with prioritizing the effort.

    Steve: I want to follow up something else that you said, you were describing a lot of the qualities that you’re looking for, the mindsets and the kind of abilities, you know, how do applicants demonstrate that information in your process?

    Nizar: I could talk about that for a long, long time. So I generally don’t believe that these buckets are a pass or a fail. I don’t think it’s, are you good at user-centered thinking or not. How I see it is that everything kind of sits on a continuum. And what I’m trying to optimize for is, for the level of seniority that this role will require specifically, am I seeing enough ability to handle different situations effectively that it puts them in a position where they’re going to know what they need to do regardless of what’s thrown at them?

    So, for example, we hear this a lot with the breaking apart of the tactical versus the foundational, where it’s like, I do this and not the other. And my question becomes, why? Why create that separation between this form of research and the other if you’re able to tell a story around your ability to come in at multiple different stages of product development and say, I can help you across every stage and I know exactly how to do it. And I can help you across every sort of limitation that you have, and I know how to do it. And I can help you address multiple different types of issues that we’re facing, whether they need some qualitative research or something that’s more quantified, or something that’s quick and dirty, or something that just needs a brainstorming workshop, and I’m able to just be flexible in where I integrate with the team.

    So I know that was a long-winded answer that went all over the place, but the reason it’s hard to describe is that I really don’t think research is a good-or-bad, yes-or-no kind of domain in general. And what I’m really trying to optimize for as much as possible is, does the research applicant have the breadth to be able to tackle as many problems as possible? To me, that’s a much better predictor of seniority and success than somebody coming in and saying, “I did this multi-country 12-month research project that was really complex logistically,” which is impressive in its own way. It’s great, but for me, it’s not what we’re trying to optimize for in general.

    Steve: And so you’re looking at past experiences that the applicant can, I guess, describe to you, or those kinds of clues to the breadth.

    Nizar: We look both at past experiences and at some hypotheticals as well. So we do have some scenario-based questions where we try to gauge some of the thought process. It’s the thing where I tell people that there’s no right or wrong answer. You’re just going to get a hypothetical, and I just want to hear how you think about it. And I want to hear what different considerations you take into account when you’re making up your mind about the best approach and what you’re going to do. How often are you coming in and having the hard conversations about what needs to be done, versus steering the conversation in a completely different way, versus just saying, “You know what? This isn’t worth the back and forth. Let me just do something quick and move forward.” So at the end of it, when we’re combining the hypotheticals with the past experiences, I’m really looking for effectiveness and efficiency under the umbrella of strong and sound research.

    Steve: Those are words that are sometimes seen as at odds with each other, but I think you’re talking about how they’re in support of each other, that effective and efficient doesn’t mean that you’re not sound, doesn’t mean that you’re not, like you said, solving the problem at heart versus the open question. That seems like a key mindset that you’re bringing to this.

    Nizar: A hundred percent. And I hear that sentiment every now and then. I hear the sentiment of, “Oh, if you go too scrappy, you’re doing really terrible research.” Or, “It either has to be great research or it’s terrible research that is fast.” And I don’t agree with that mindset or that context. For me, it really depends around what you’re trying to learn, what your objectives are. If you’re trying to do something that is an extremely small pulse check, for example, you don’t need to boil the ocean.

    I still remember earlier in my career, I joined a team and they just had no idea, they knew nothing about their users. Absolutely nothing. And I was telling them, “Do you have any hypothesis? Do you have any open questions? Do you have anything there?” And they’re like, “We just don’t know. We just know that nobody’s using this feature. That’s all we know.” And we look at our dashboards, we look at our metrics, we have a target addressable market of millions, and we have tens using it. So we don’t know why. And I just came in and I said, “Look, the best use of time right now for me is just to do some sort of small single pulse-check survey. One question, pretty much trying to understand the state of everything. Just for me to have context to get started, just give me some perspective. Am I planning to use this in roadmapping? Probably not, but I need some form of context from end users, so I can say, okay, I have an idea of what’s happening there, I have an idea of the value add, I have an idea of why they’re churning, of why maybe they’re not seeing some value. Let it be scrappy.”

    And you get the pushback of, “Well, this is qualitative. You need to do in-depth interviews for that.” I’m like, “No, I don’t. I really don’t. I don’t need to invest 40, 80 hours just to get an idea of what’s going on if you can do this in 24 hours, and then take that as an entry point into something that’s more detailed, that’s more rigorous.” So for me, it just goes back into linking the amount of effort to the projected outcome and really just finding the thing that works for next steps. And in this case, we did end up actually needing to go very in-depth with foundational interviews and a full design sprint, and then going to concept evaluations and stack rank.

    It ended up being a really complex process over maybe the course of a year that really turned around a product that wasn’t used, a product that had an addressable market of millions. But at the start of it, I did not have the luxury of saying, “I just need to go away and do in-depth interviews,” because the research domain says that qualitative is not allowed in a survey. So sometimes I think breaking the rules in our domain is very okay, as long as you know why you’re breaking the rules and what you’re going to do with the insights that you have.

    Steve: This research effort that you’re scoping at any point may not be, or probably isn’t, the only time you’re ever going to learn anything. And so, you know, as I take that away from what you’re saying, I can sort of feel some of my own anxiety just ebbing away. Like, of course, right? If you think of research as a longer-term thing: what’s the question we need to ask now? What’s the right amount of effort for right now? Okay, everybody, we’re not going to get everything. We’re not going to boil the ocean, as you said. There’s more to do, but here’s where we are right now. And so, yeah, that good-versus-bad research framing says we’re only going to do it once, that it’s this monolith that’s either going to answer everything or not answer anything. And these gray areas you’re describing are a gentle reframe for me, I think, about where I sometimes feel anxious about trying to tamp down the commitment or the investment.

    Nizar: As long as we have the right data point for the right decision that’s being made. If we’re coming in and saying, “We need to invest all of our engineering team based on this one single customer-satisfaction open-ended box,” I get an anxiety attack. I get it. But sometimes that’s not the decision that you’re making. Sometimes, when you weigh the pros and cons, the cons of having something that’s scrappy and fast are justified when you look at the pros of being able to get ahead and then establish a research roadmap that actually gets you ahead of the product team. So that’s the consideration for me.

    And to your point earlier, product development is iterative, and I think people forget that sometimes. People forget that even if you launch something, that team is still there, and that team will still continue to want to optimize it in some shape or form. So if anything, I feel researchers should take some comfort in that and saying that, “Okay, if I miss the boat now, how do I get ahead and say, ‘Okay, for the next iteration, for the next thing that’s happening, I’m able to get ahead and have some things ready in time?'” And acknowledge that the same way that product development is iterative, even the most foundational research efforts, you’ll end up having to iterate on in some capacity. I haven’t seen a world where a research deck is still relevant years later, and nobody has ever touched that topic again.

    Of course, you want to minimize how often we redo work that’s already been done, but everything changes. Once you start having a user base, the kind of data that you have is different. In this case, when you have tens of users and you go into the thousands, the kind of feedback you’re starting to hear is already different. The usage data that you’re starting to get is different. You’re able to use telemetry a little bit more than when you had nobody. You start being able to triangulate in a way that you just weren’t able to earlier. So iterations are good, and that doesn’t mean don’t do really deep foundational generative efforts. They’re just a time and place to say, “This is my time to get scrappy, and this is my time to dive deeper into the topic.”

    Steve: If I didn’t think about it too deeply, I might, you know, sort of have this reflex that says, well, when we know nothing, that’s when we have to learn everything, that the foundational work comes at the beginning. But you’ve got a number of examples where you’re coming in and seeing a big gap and saying, no, it’s not, this is not the boil the ocean time, it’s the quick win or the thing that we can act on or the scrappy thing. And no one believes that A is B. No one believes that the scrappy quick thing is, in fact, going to answer all the questions. But you’re helping take action, you know, within the constraints that are there.

    Nizar: I’d say caveats. Tell people there are limitations to what I’m doing. We are aware of that. Every research method we do has limitations, and I’m yet to come across any research study that has solved everything or is now claiming that we have learned everything about our entire user base or our entire feature area. And that’s the reason researchers continue to be in the same role or on the same team for years. There’s always a lot to uncover. And a lot of the time, just really weighing the cons of coming in early and saying, “You know what? I just started. I’ve been here for a week. Let me go disappear for three months or so.”

    And I get it. There are ways that you can incorporate your cross-functional partners. But for the most part, especially as somebody’s building credibility, starting with data and giving some form of, “Here are some next steps,” is much more effective than starting with nothing, where often the next steps are research. When I did that Pulse survey early on, the next steps were research. I needed to go and actually do in-depth interviews to learn more, but at least I had some litmus of, “What am I talking about? What is my script going to have?” So I wasn’t finding myself in a situation where I’m interviewing, even if it’s 10 end users, and I’d be like, “Can you tell me anything? I don’t know where to start. The team doesn’t know where to start.” I had something. And for me, the value of that effort, even if it just fed into the definition of a research template for the next steps, that was value-adding, and it saved me a lot of time.

    Steve: I’m going to switch topics a little bit here and go back to something you said, I don’t know, before. And I just — maybe you can unpack this or just clarify it. I think you were saying that, you know, that you look at, for people on the research team, you look at number of references or citations of research work in the work product of other cross-functional teams. I’m saying this really poorly, but did I capture that at all?

    Nizar: It’s a good summary, and I’ll also give it an asterisk and say, “Not as the only signal, but it’s an effective signal.”

    Steve: Yeah. So I have a bias against that. And that’s, of course, coming from someone that doesn’t work inside an organization. So my bias is maybe just hypothetical, but — or just from conversations. And maybe that — maybe that asterisk is really, really important. I agree it’s a signal. I do worry about researchers either getting external pressure or pressuring themselves that this thing, which is essentially out of their control, whether somebody else does something or not, is kind of — is a measure of their worth. Where there’s lots of reasons why people don’t do things and don’t listen to things. And I think you’re talking so much about how to — how to prevent that from happening. Right? The right work at the right time, with the right collaboration, with the right understanding, and all that stuff being kind of scaled appropriately. But, you know, just having spent my career giving people stuff that we’ve agreed was going to be important. And then seeing all kinds of things happen and don’t happen. And to a certain point, there’s a certain amount of surrender, right? Like, I’m going to give you everything that we agreed you need and maybe more, but I can’t control what happens.

    So I don’t know. I don’t want to frame this as a debate or anything like that. But I’m open to you telling me that, like, I’m wrong, that I’m framing that wrong. I’m just curious what — you know, how you think about this. It’s not the only signal, but how do you think about sort of how to use that signal or how we should all think about that signal of what somebody else does?

    Nizar: The conversation is super valid. That’s where the asterisk comes in. And if anything, I always love the counter perspectives here as well. The reason that I added asterisks here, too, is exactly what you’re saying, that you can’t really control what somebody else does, where it ends up putting some emphasis is encouraging research teams to be very strategic in terms of where they’re prioritizing their time and how they have ownership over the product direction as well. But that doesn’t only go on the researcher. A lot of the conversations that have to take place as well do have to happen at the leadership level. And kind of talking about if we’re to say that the researcher also is to be held accountable for what’s going on there, what’s the collaboration model, and where are they coming in, and are they left out of being able to have that, or is the expectation set that there’s some form of path for them to do it?

    And there are also multiple ways from my perspective to showcase that. I think when you look at the referencing, it’s as direct as it gets usually, but even that can be optimized. Sometimes you have to take it the other way. Don’t optimize just for making sure that your research is in docs. That’s not what we’re optimizing for either. But what are the ways in which, as a research team, that for better or worse needs to continue kind of driving the narrative of the value that we bring, we connect the dots to the different decisions that have been made because of the research, and the leadership role that each researcher is taking in actually guiding the product roadmap? How could we make sure that we are being very intentional about collecting the evidence and documenting it, and being sure that we’re telling our story in a way that does a service to the team members? And often you’ll find researchers in environments where that’s just really hard with their teams. That’s just not how their stakeholders are wired.

    And when that happens, my question becomes, what’s the role of leadership and me in a lot of cases in streamlining that, but also what are other ways that are effective in gauging the success of that researcher that do not rely on that being the only mechanism? And that’s where that asterisk plays a huge role. And yeah, absolutely. There are some full quarters. We do performance reviews quarterly, which is pretty intense, and sometimes you don’t get to finish things that are meaningful in a quarter. So we want to give people the benefit of the doubt as well into how the research efforts bleed into the quarters after. But there are some quarters where you’re deep in the research, the team itself doesn’t even have a document, and there’s no way to say, “Hey, this is what’s happening.” But we look for other ways to continue connecting the dots there.

    But for me, one thing that I do genuinely care about, and maybe it’s just from previous experiences of seeing where research can get thrown under the bus sometimes: for the past many years, I’ve been very intentional about just telling the story of the ROI of the researcher themselves. So not the ROI of research, which I think sometimes gets confused with the ROI of the researcher. I find that often, and of course it depends environment to environment, company to company, people don’t debate the value of research. They sometimes debate the value of the researcher doing the research. That’s the topic that comes up here and there, and I try to be as intentional as possible to position the researcher, and position the organization to give the space for the researcher, to be a product leader, not only a research delivery mechanism, if that makes sense.

    And with that comes some of the expectations that end up changing. Fingers crossed it worked for me throughout my career. At the same time, I always want to acknowledge that when I say it works for me, there’s also a right time at the right place component of it, and it’s not always on the way that the research is conducted or what the researcher is doing.

    Steve: Let’s just switch topics again. We haven’t talked at all about, you know, your overall trajectory. And it’d be great maybe to get a summary of how you found user research, what you started off doing, and some things that you did that kind of led you to this role. Maybe to set the context, which we haven’t talked about, for what you have been sharing.

    Nizar: Sure, yeah, I could take it many, many years back. So I actually went to the University of Jordan to study industrial engineering, and I was one of the few people who actually cared to have an emphasis on human factors. I don’t know why, but I was always fascinated by the intersection of humans, computers, business, and psychology, all of these together, and I didn’t really know what you could do with it. And it was as early as undergrad that I thought, okay, this domain seems to cover a lot of those areas. I graduated and worked at a company in Jordan under a title that back then was something along the lines of process engineer, but in reality it was more: understand the inefficiencies in the process and how people are coming in and out of their day to day, and how we can make it more efficient. So it had a big research component, and I was aggressively reading about what are some of those programs where I could continue learning in that space, because where I lived, nobody knew what that was. That wasn’t the thing. You could say visual design, but you couldn’t really say user experience or UX research.

    And moving on to grad school, I went to San Jose State for the master’s program there, and the beauty of that was just the amount of exposure that I had to a lot of different companies and different people who were doing some of the user research work, and it was pretty much a straight shot from there. I went into consulting in a user researcher role, then into a startup where I built a research and design org from the ground up. So I was managing research and design. I moved on to Google, at YouTube specifically, where I spent about three years, and then when the opportunity came up at Snowflake, it was just too hard to say no. So I moved on. I’ve been there for almost three years now, which is crazy to think about.

    Steve: Do you think of yourself as someone that has a superpower?

    Nizar: It’s a humbling question, to be honest. The thing I take pride in is I’m always open to being wrong. I’m always open to challenging the status quo and being told that there’s a better way to do it. And I think where I take some of that pride is a lot of the times I hear people that even you look up to throughout your career, and then you get to a point where you’re like, “I kind of disagree. I see a different way.” And trying to challenge status quo for something that could be better is just something that gets me pretty excited.

    Is it a superpower? I don’t know. Maybe it hinders me at times, but at the same time, especially in the conversation that’s taking place around research right now, you start seeing a lot of the consistent perspectives of this is right, this is wrong, this is what you do, this is what you don’t. And I try to be very intentional in hearing what are those different perspectives and why are they seeing things differently and what works for me and how do I acknowledge that what works for me at the environments that I’m in may not work for somebody else in the environment that they’re in as well and give people the benefit of the doubt and keep running with what I’m doing.

    Steve: There’s sort of two facets, I think. You started off saying that you’re okay being wrong yourself, but you’re also looking for when the conventional wisdom is wrong. Did I get that right? There’s sort of two aspects. It’s like you’re willing to forgo needing to be right, but you’re also embracing or curious about, hey, maybe something out there that’s established as right, the status quo, like you said. Maybe that’s wrong and you’re like challenging that.

    Nizar: And think of it like they’re intertwined. Think of it as somebody who is a user researcher. For the most part, you’re kind of looking for best practices, you’re looking for the perspective, you’re looking for the voice of the crowd. They’re kind of intertwined a bit. And I think it’s a solid starting point. It’s much better than starting from zero. Learning from someone is always significantly better than just figuring it out on your own. I mentor upcoming researchers every now and then and I say, “If you can avoid being the first researcher out of grad school, I would avoid that.” But I want to acknowledge that not everybody has the luxury of picking and choosing especially their first job. So take it and learn on the job is better than not having anything.

    But it becomes a starting point of, okay, we think this is the best practice. I guess then there’s a tough conversation of, does the best practice make sense? Does the best practice work for me and my approach? Does the best practice work for the environment that I’m within? And where do we continue optimizing it? And how do I continue doing the internal reflection, the internal research on what’s working in the processes that I’m establishing and what’s not? And how do we continue treating honestly my career trajectory as a product that you continue learning, iterating and hopefully making it better?

    Often it leaves you at odds with what a lot of voices have in place, but it’s okay accepting that as well and being like, there’s no reason for everybody to align on one topic. And it’s always fun. It’s always fascinating when you’re the one who wants to look at things from both perspectives. Jon Stewart came back on The Daily Show last week and he got a lot of hate because he was in between two sides. But from my perspective, these are the voices that often bring in a lot of reason and just say, let’s call out everything as it is and see how we can look inwards at how we can continue to be better. But it’s always fascinating because you’re going to get some pushback from the side that agrees on one thing and then pushback from the side that disagrees on the other if you’re optimizing in your own way.

    Steve: Are you seeing patterns in the people that you’re mentoring in terms of what topics or questions you’re helping them with?

    Nizar: I think the biggest one is there’s an obsession with the research as the end goal. I think that’s the one that’s just becoming more and more and more apparent. There’s different reasons that when somebody’s starting their career, it makes sense that they think, okay, I’m here to do research. There are some people that are more mid or later career where that’s what they’ve learned how to optimize throughout their career because that was their success criteria. So there’s various flavors of that same thing. But you often hear a lot about like I want my methodology to be, like I’m focused on my methodology or I want to do more foundational research or it’s very anchored on research as the end goal.

    Even in our interview process, we interview a lot of amazing candidates with amazing resumes and as they’re presenting their case studies, they gloss over why they did the research or they gloss over what happened after it. But they take a lot of pride in the thing that they did, kind of the actions that they took as a researcher. I think that’s the biggest, for me, gap that I see between a lot of the conversations that I have and where I believe research should be positioned as more of a tool to drive decision-making rather than an end goal.

    Steve: I don’t know a ton about how mentorship could or should work, but are there things that you are able to say or do in these interactions to help somebody shift their perspective to what you’re talking about?

    Nizar: It depends on the relationship I have with the person too. So to be honest, that kind of dictates a lot of the conversations that happen and honestly, like how hard I push back. There are some people that I used to manage in the past who I’m very comfortable telling, “You’re just absolutely wrong and stop doing it the way you’re doing it and here’s how you can be more effective.” You can’t do that with somebody you barely know. And you try to nudge it in terms of like how do you expand your thought process beyond what you’re doing into why you’re doing it? And how do we kind of like reset your tone in terms of the perceived outcomes of the work that you’re doing?

    And I do a lot of resume reviews, and I think that’s a place where people seek feedback. I usually call out that a lot of resumes that I see for researchers read like job descriptions. And I try to tell them, “What’s your superpower? What’s your story?” When I read your resume and I see “conduct tactical and strategic research, conduct qualitative and quantitative research,” that reads like the job description. It doesn’t give you an edge over other applicants, and I don’t have any context for why you’re doing it, what you’re able to do, and how you move things forward. And of course, at the end of the day, the research in itself is core. That’s table stakes. Being a strong researcher with broad methodology and being able to tackle, again, different types of problems is core to the job. So we don’t want to hire a researcher who doesn’t know how to do research. But how that researcher connects that great research to why they’re doing it and what impact it’s having is the biggest gap that I tend to see across resumes, some interviews, some mentoring calls. I think there’s a continued opportunity there.

    Steve: What didn’t we talk about yet today that you think we might want to cover?

    Nizar: I can interview you.

    Steve: I’m willing to try.

    Nizar: The question I have for you is you have a new edition of a phenomenal book, so congratulations. It’s something that I hope every researcher has read, and I recommend every researcher read it as well. Ten years later, what are the areas that you’ve seen evolve or change?

    Steve: Yeah, the context in which research takes place is totally different. You know, we didn’t have language around operations, for example, and operations as being separate. It took me a long time, and only very recently have I sort of distinguished operations from logistics. And I was, in fact, resistant to the idea of research ops because it takes away some of what the researcher needs to do. You know, if you recruit participants yourself, you understand the space just by going through that. And so I had this naive view.

    Now I’m not even answering your question, but I used to have this naive view that, oh, well, you know, recruiting is part of figuring out how this population thinks and works and how to work with them and so on. And that ops is going to take that away. And I think I’ve only recently sort of started to understand that research operations is about supporting the organization to do research, not to take the burden of tactics and logistics away. It’s about infrastructure and so on. So, you know, trying to have a more sophisticated conversation in the second edition about logistics and operations. Things that are really important from a legal and compliance perspective right now in research were, in the past, just kind of, let’s see if we can avoid legal finding out about this. I think, you know, this podcast couldn’t really have existed 10 years ago. Well, I shouldn’t say that, I’ve been doing it for a long time. Maybe that’s the wrong number. I don’t know.

    At some point, there were far fewer people who were doing what you’re doing, who are building teams, you know, bringing leadership and management to research. Researchers were abandoned, or worked for a design manager or something like that. The idea that research could be a peer, I mean, all the things we started off talking about, that it could be a peer to another function and work proactively. Those were sort of ideas and aspirations, but you didn’t see that as much. So I think, you know, the profession has matured and there are more teams with leaders, with career ladders and, you know, clear ideas of how to interview and what they’re looking for. As an in-house profession, it’s just much, much more mature.

    And I was just thinking today about, you know, that phrase that Kate Towsey came up with, People Who Do Research. She came up with that term in 2018, as far as I was able to determine. And that’s fairly recent, but I feel like, oh, giving it a name, like there are researchers and research, we kind of go back and forth on that. And like you said, there’s all sorts of customer contact going on, but creating a name that sort of says, here’s who researchers are, and here’s this other category of people that also are doing research. Having that label, I think, clarifies a lot.

    And yes, we have democratization debates. And it’s not like the problem is solved by giving it a name, but we’re clearly at a point where we can say there’s different types of research happening, which is a point that you’ve made. And there’s different people doing research, whatever we mean by that. And those are all sort of different considerations. So it’s not a solved issue, but it’s a much more clarified issue than it was 10 years ago. You know, and again, not really what you asked, but, you know, I think a lot of the fundamentals that I write about, like how to ask a question, how to ask a follow-up question, how to listen. You know, I’ve had 10 more years of doing that and 10 more years of teaching it. So I have just more examples and more stories and more clarification and more nuance. Those are the fundamentals, I think, of interviewing, but I think I can explain them better than I could before because it’s just practice, practice, practice.

    Nizar: This is great. You talked a lot about kind of the evolution of research and some of the things that are different. What’s the hot take about the world of user research that you find yourself disagreeing with?

    Steve: Wow.

    Nizar: Putting you on the spot there.

    Steve: Yeah, no, that’s good. I mean, I feel like that Grandpa Simpson meme, you know, old man yells at cloud, whatever that is. I don’t know, just being my age and my grumpiness like I am. And I guess I could justify that: the longer you live, the more sort of hype cycles you go through. And I’m just sort of reluctant to engage with hype stuff. So yeah, AI is a big hype topic. I’m amazed that I’m the one bringing it up in this conversation, because usually everybody else that I talk to has to bring it up right away. So there we go. Thanks for making me do that. And sometimes I’m sort of resentful about it. It’s like, I don’t want to have a hot take on research. And I’m resentful about the overwhelming amount of hot takes about research.

    The most recent episode of this podcast that I posted is all about Noam Segal’s hot take on research. And no disrespect to him; he thought about it a lot. So I don’t know, my hot take is almost like an anti-hot take. Like, can everybody just chill out about AI or about the end of everything? I think we have a lot of short-termism, or just immediacy. We see what’s right in front of us, and we overreact. I include myself in that. And that’s just human nature, I think. And, you know, the LinkedIn pundit info cycle, entertainment complex, whatever that is, we all have to have an opinion about something and about what’s happening with research right now. But we’re at an inflection point. And so we don’t know. And maybe it’s okay not to know, which is something that you’ve said a few different ways.

    So I don’t know, my hot take is to be anti-hot take, like, it’s maybe okay. And maybe that’s just my privilege speaking. Like, you know, if I was younger, trying to make a certain kind of name or get a job, I might feel like I need to come in with an opinion about something. But I think there’s a little bit of peace and calmness that I would like to, you know, nurture within myself, to not react so much to the change around us. The world feels very dynamic and uncertain and complex and worrisome. And, you know, I would like it when the conversations that we have in our professions collectively sort of soothe that and don’t add to it.

    Nizar: I love what you’re describing, and just to add to it a little bit. I’m hearing a lot of the takes on how research is now fundamentally different, how it’s doing things wrong. There’s definitely an over-exaggerated sentiment behind what’s going on there. There’s a macroeconomic condition. You hear a lot of overstatements across domains. I think when you’re in research, you just feel it more because of your circles. At the heart of it, a lot of different domains got hit pretty hard. If you ask many people, they’ll tell you that it’s their area that got hit the hardest. I hear the same from product managers. I hear the same from software engineers. Let’s not even get started with recruiting and some of the operational support.

    At the same time, I see it as always a good opportunity to just reflect on what we’re doing, what works and what doesn’t, and we continue to iterate. It doesn’t need a big “research is dying” kind of header to encourage some of the discussions and conversations about how we continue to evolve. It wasn’t too long ago that every researcher was called a usability engineer, and things will continue to evolve and things will continue to change, and that’s okay. This is par for the course. I’m just excited for the continued trajectory of researchers becoming key drivers and business leaders who represent users as their core mission. But it takes a few steps, takes a few optimizations, and I think it’s just part of the exciting journey of a domain that’s still relatively young, at least in the tech world, compared to the other areas that have been around as fundamental to the product development process.

    Steve: I think that’s us ending on a high note. Thank you so much for a great conversation, turning the tables a little bit and sharing so much. It was lovely to get to chat with you.

    Nizar: I really appreciate it. It was a wonderful chat, and thank you so much.

    Steve: There you go. That’s our episode. If you made it all the way to the very end, give yourself a hearty pat on the back for listening. Please spread the word about Dollars to Donuts. You can find Dollars to Donuts in most of the places that you find podcasts. You can raise awareness even more by reviewing the show on Apple Podcasts or wherever it is that you’re finding it. Check out Portigal.com/podcast to find all the episodes, including show notes and transcripts. Our theme music is by Bruce Todd.

    The post 37. Nizar Saqqar of Snowflake first appeared on Portigal Consulting.
    21 February 2024, 7:21 pm
  • 1 hour 15 minutes
    36. Noam Segal returns

    This episode of Dollars to Donuts features a return visit from Noam Segal, now a Senior Research Manager at Upwork.

    AI will help us see opportunities for research that we haven’t seen. It will help us settle a bunch of debates that maybe we’ve struggled to settle before. It will help us to connect with more users, more customers, more clients, whatever you call them, from all over the world in a way which vastly improves how equitably and how inclusively we build technology products, which is something that we’ve struggled with traditionally, if we’re being honest here. – Noam Segal

    Show Links

    Help other people find Dollars to Donuts by leaving a review on Apple Podcasts.

    Transcript

    Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization.

    I went to the dentist recently for my regular teeth cleaning. I was in the chair while the hygienist was working away. This obviously wasn’t the best situation to ask a question, but I had a moment of curiosity and I found a chance between implements in my mouth. I should say that at this dentist, their cleaning process is to first go over your teeth with something called an ultrasonic scaler. I had assumed this was just like an industrial strength water pick, or like a tightly focused pressure washer for the mouth. After that, they follow up with a metal scraping pick. So during the metal scraping pick portion, I asked the hygienist, “Does the water soften it?” I was wondering if the first stage softens up whatever gets removed by this mechanical pick. Somehow my weird question prompted her to give me a 101 lesson on how teeth cleaning works, what is being cleaned and how the tools are used to accomplish that. Anyway, she starts off by telling me that the water is just to cool the cleaning head. The water isn’t doing the cleaning. There’s a vibrating cleaning head that does that work. I was very excited to learn this because I had the entirely wrong mental model. I had assumed that this device was just water and I hadn’t ever perceived any mechanical tip. Of course, I’ve never seen what this device looks like, other than when it’s coming right at my face when I’m the patient. And I had made all these assumptions based on what I experienced from being in that role.

    It was a lovely reminder about how we build mental models based on incomplete information, based on our role or interaction and how powerful those mental models are. And of course, this was also a reminder of the power of asking questions, where even this simple question in non-ideal circumstances led to a lot of information that really changed how I understood a process that I was involved in.

    It was a great reminder about one aspect of why I do this work and some of the process that makes it interesting and insightful. Speaking of interesting and insightful, we’ll get to my guest, Noam Segal, in a few minutes, but I wanted to make sure you know that I recently released a second edition of my book, “Interviewing Users.” It’s bigger and better. Two new chapters, a lot of updated content, new examples, new guest essays, and more. It’s the result of ten more years of me working as a researcher and teaching other people, as well as the changes that have happened in that time.

    As part of the “book tour,” I’ve had a lot of great conversations about user research and interviewing users, and I want to share an excerpt from my discussion with Larry Swanson that was part of his podcast, Content Strategy Insights. Let’s go to that now.

    Larry Swanson: It also reminds me, as you’re talking about that, it’s like you show up at a place like in the old days, you drive up and you’re in the car with a team, and that’s a good reminder that this is like a business activity. In fact, you open the book with a chapter about business and the business intent of your interviews, and I also like that you close the book with a chapter on impact, which I assume is about the measurement and the assessment of that satisfying that business intent. Was that bookending intentional, or am I just reading into that?

    Steve: This is where I just laugh confidently and say, “Oh, of course, you saw my organizing scheme.” I hadn’t thought about it as bookending, which that’s a little bit of nice reflecting back. In some ways, I think I was just sort of following a chronology, like why are we doing this, how do we do it, and then what happens with that? So no, but sure.

    Larry: Yeah, sorry, I didn’t mean to project on that. But anyhow, that’s sort of the — maybe just focusing on the business part of it, because I think that’s something that’s come to the fore in the content design world, and particularly the last couple years. I think it might have to do with the sort of economic environment we’re in, but also even before that, there were people talking about increasing concern with the ROI of our work and alignment with business values, and maybe we’re focusing too much on the customer and not balancing that. But how do you balance or kind of plant your business intent in your head as you go into an interviewing project?

    Steve: I think it kind of — maybe it’s like a sine wave where it kind of comes in and out. We were just talking about transitioning into talking to Marnie, a hypothetical person, for 30 minutes. I really want people’s business intent to be absent during that interview, so that’s maybe the lower part of a curve. But leading up to that, who are we going to talk to, what are we going to talk to them about, who’s going to come with us? That’s very much rooted in — I don’t know why I made up this metaphor of the sine wave, but we’re very highly indexed on the business aspect of it.

    We designed this project to address some context that we see in the business, either a request or an opportunity that we proactively identify. So we think about what decisions have to be made, what knowledge gaps are there, what’s being produced, and what will we need to help inform decisions between different paths kind of coming up.

    I talk in the book about a business opportunity or a business question and a research question. So what do we as an organization — what decisions or tasks are kind of coming up for us? So what do we have to do? We want to launch a new X. We’re revising our queue. We need to make sure that people doing these and these things have this kind of information. That’s about us. Then from that, you can produce a research question. We need to learn from other people, our users, our customers, people downstream from them, whatever that is. We need to learn from them this information so that we can then make an intelligent response to this business challenge that we’re faced with. So all the planning, all the logistics, all the tactics, what method are we going to use, what sample are we going to create, what questions are we going to ask, what collateral are we going to produce to evaluate or to use as stimulus or prompting? All of that comes from what the business need is and how we can go at it. Yes, there still is a sine wave. So then we set that aside to talk to Marnie, to talk to everybody. We really embrace them. We have all this data. We have to make sense of this data. And then here, I think we sort of straddle a little bit because you’re going to answer the questions you started out with. I think if you do a reasonable job, you’re going to have a point of view about all the things that you wanted to learn about. But you always learn something that you didn’t know that you didn’t know beforehand.

    And I think this goes to the impact piece. This goes to sort of the business thing that’s behind all this. What do you do with what we didn’t know that we didn’t know? I want there to be this universal truth like, oh, if you just show people the real opportunity, then they’ll embrace it. And then everybody makes a billion dollars and the product is successful. I think of that principle from improv: yes, and. I think we have to meet our brief. We’re asked to have a perspective on something. Part of the politics or the compassion way of having impact is to not leave our teammates and stakeholders in the lurch.

    So we have these questions. We have answers to these questions. And also, we feel like there’s some other questions that we should have been asking. We want to challenge how we framed this business question to begin with. We see there’s new opportunities. We see there’s insights here that other teams outside the scope of this initiative can benefit from.

    There’s all sorts of other stuff that you get. And I think it behooves us to be kind about how we bring that up, because no one necessarily wants a project or a thing to think about that they didn’t ask for. So how do you sort of find the learning ready moment or create that moment or find the advocate that can utilize the more that you learn that can have even more kind of impact on the business? That’s not a single moment. That’s an ongoing effort, part of the dynamic that you have with the rest of the organization.

    Again, that was me in conversation with Larry Swanson on the Content Strategy Insights podcast. Check out the whole episode. And if you haven’t got the second edition of “Interviewing Users” yet, I encourage you to check it out. If you use the offer code donuts, that’s D O N U T S, you can get a 10 percent discount from Rosenfeld Media. You can also check out portigal.com/services to read more about the consulting work that I do for teams and organizations.

    But now let’s get to my conversation with Noam Segal. He’s a research manager at Upwork, and he’s returning as a guest after four years. You can check out the original episode for a lot more about Noam’s background. Well, Noam, thank you for coming back to the podcast. It’s great to chat with you again.

    Noam Segal: It’s absolutely my pleasure, Steve. Great to see you and great to be here.

    Steve: Yes, if you are listening, we can see each other, but you can’t see us. So that’s the magic of technology, although Noam is wearing a shirt that says cat GPT on it. So we’ll see if we’re going to get into that or not.

    Noam: I do love silly t-shirts. I just ordered a few more silly t-shirts yesterday. My partner is not very happy about that particular aspect of who I am. But, you know, it is what it is and you get what you get.

    Steve: Right. You got to love all of you.

    Noam: Yeah.

    Steve: So that’s an interesting place to start. Let’s loop back to maybe a more normal discussion starter. We spoke for this podcast something like four years ago, early part of 2020. So, you know, I guess maybe a good place to start this conversation besides T-shirts and so on is what have you been up to professionally in the intervening years?

    Noam: A lot has happened. I can tell you that quite a lot, given it’s not that long a time period in the grand scheme of things.

    Steve: Yes. Mm hmm. Right.

    Noam: When we chatted last, I was working at Wealthfront, a wonderful financial technology company, and I was head of UX research there at the time.

    Steve: Right. Yes.

    Noam: I left Wealthfront for a very particular opportunity within Twitter, now X, because I was very interested in contributing to the health, so to speak, of our public conversation. And I had an opportunity to join what was known at Twitter, now X, as the health research team. But we don’t mean health as in physical or mental health. We mean indeed health as in the health of the public conversation. In other companies, these types of teams are called integrity or trust and safety, et cetera. And we were dealing with everything to do with things like misinformation and disinformation, privacy, account security, and all sorts of other trust and safety related issues. Sadly, a few months, really, or less than a year after I joined, Elon Musk took over the company. And one of the first layoffs that happened at the company was a layoff of basically the entire research team. And so I left before the layoffs, but that was the situation there. And I’d love to talk more about what that means in terms of how we build technology, et cetera. We can jump into that.

    From Twitter, now X, I moved to Meta, and I joined the team working on Facebook feed, which some people might view as the kind of front page of Facebook or even the internet for some people. It’s a product used by billions of people daily. And it was a very interesting experience to work on both the front end of the feed and the back end as well, so to speak. So that was a very interesting experience. And in addition, I was also playing a role in what we were calling Facebook Research’s center of excellence or centers of excellence, where we were trying to improve our methodologies, our research thinking, our skills and knowledge, kind of working on how we work, which is very related to what we spoke of in the last podcast we did together a few years ago, when we talked all about research methodology, et cetera. So that was an interesting experience.

    But in April of 2023, along with a couple of tens of thousands of other people, I was laid off from Meta, as was I think approximately half of the research team at Facebook at the time. And several weeks later, I joined Upwork, which is where I work now. Upwork, for those who don’t know, is a marketplace for clients looking to hire people for all sorts of jobs and freelancers who are looking to work, primarily in kind of the knowledge worker space, I would say. Upwork also caters to enterprises who are looking to leverage Upwork as a way to, you know, augment their workforce and hire freelancers or people for full-time positions as well. And at Upwork, I’m a senior research manager. I focus on kind of the core experiences within the product, which includes the marketplace for clients and freelancers, in addition to everything to do with payments and taxes and work management and trust and safety, which I’m very happy to still be involved in. It’s a topic I care a lot about.

    Steve: For Twitter, because you were particularly interested in that health, that trust and safety aspect of it, I guess I want to ask why. What is it about that part of designing things for people to use that, as a researcher or as a person, you strongly connect to?

    Noam: Yeah, I think we have a set of societal ills, let’s call it, very troubling societal ills that I think we need to address urgently and with great care and with great responsibility. And one of those societal ills is the evolution of public conversation, of how we interact with each other as people and how hateful and nasty and unkind we can be to each other. And how much information put out there online is either inaccurate or completely false.

    This really came to be more salient in my mind during the 2016 elections to the US presidency. But I think it’s become even more salient ever since for multiple reasons, including the incredible and tragic rise in antisemitism in the world over recent years and all sorts of information running about out there on the interwebs that is, again, factually incorrect around all sorts of topics: election-related, related to certain geographical regions, to certain groups, et cetera. And so this is something I care deeply about, just given my personal background, just given what I’m observing in society. When we last had a conversation, I was at Wealthfront in the fintech space, and I recall a case happening, which really shocked all of us to our core with another company. I’m not going to name the company, but it’s another fintech company. A young person tried to use this other company to make certain types of investments and trades, but he was not well versed in how that world works, how those trades work, what options are and how to use them. And he believed that he had lost an incredible amount of money that he did not have and wasn’t able to lose. And it brought him to enough depths of despair that he ended up taking his own life. And that to me was just one story of many that made it incredibly clear that we need to be responsible and ethical in how we build technology products. And that few things could be more important than working on trust and safety. So yeah, it’s definitely an area I’m passionate about.

    And we’re recording this a day after yet another Senate hearing with all of the heads of different social media companies who were faced with difficult facts about the effects that they’ve had on society and on families who lost their loved ones and other incredibly tragic stories because of the way they built their platforms, because of things they ignored. And I think research can play an absolutely critical role in building trustworthy ethical experiences, responsible experiences that really matter in this world, probably more than anything else I could think of. So that’s the long answer to your question.

    Steve: What kind of information can researchers bring into situations like the ones that you’re describing?

    Noam: What sort of research should we be doing or not be doing, and at what level, at what altitude? If that company had put more effort into, first of all, age-gating the platform and ensuring that people have the knowledge and the skill to conduct certain trades, but beyond that, the usability of the platform. Going back to the basics, which we don’t do enough of, I would suggest, which is just making sure that the information one is seeing is clear, you know, and not open to interpretations that could have incredibly tragic consequences. Like thinking you lost $700,000, I think that was the number, when that was in fact not the case at all. So for me, it goes back to those basics. Research can inform all sorts of more nuanced reactions than the one we’re seeing.

    Another thing that happened this week while we’re recording, which demonstrates what happens when you let go of your entire trust and safety team, including researchers, was that Taylor Swift, the incredible pop singer, artist extraordinaire, she was facing something incredibly tough to face online, whether you’re a celebrity or not, which was AI-generated nude images of her, fake images obviously. These were all generated by AI such that if you searched for her name, for Taylor Swift’s name, on particular social platforms, you would see those AI-generated images and perhaps believe, because they were very realistic, that these were in fact images of Taylor Swift when they were not. The solution this company came up with was to remove any search results for the terms Taylor Swift or Taylor or Swift or any combination of her first name and last name, which is moronic. I’m not sure what adjective to use. It’s incredibly aggressive, and I think as technologists we can do a lot better than cancel an entire search query because of that sort of thing happening. I think one thing for sure is that there’s no doubt there is a need for trust and safety professionals. There is a need for trust and safety researchers. We know how to inform the responsible and ethical building of these sorts of products and how to address these issues in much more nuanced and rational ways with much better outcomes. I mean, that seems pretty obvious, but it’s clearly not obvious to some of the people leading some of these companies. I hope that changes, and I’m very proud that at Upwork we do have these teams and we are working on these things. We care deeply about the trust and safety of the people we serve.

    Steve: I mean, you’re describing a failure with Taylor Swift AI images that there’s, you know, I guess the jargon is bad actors. People are behaving in a way that’s harmful. And when that happens, when there’s a system that can be exploited or manipulated or used to cause harm, you know, I think you’re identifying like there’s a gap. The system can be used that way. But you’re saying also that without researchers, companies are not as well set up to respond to those malicious behaviors.

    Noam: Yeah.

    Steve: And I’m, I guess I’m just looking to have the dots connected for me a little bit more like, but you can see sort of the failure of the systems and the failure of the humans that are the malicious users. But in that scenario, or kind of analogous ones, how do researchers serve to either prevent or, you know, mitigate those kinds of malicious uses?

    Noam: So I can give – there are a few answers here. One example would be that at X and other companies, some of the research we did, and in some companies are still doing, goes into supporting content moderators and support agents and other such people who are reviewing this type of content. And the research informs building tools so they can get to those problems faster and eliminate them and get rid of that content in more efficient and more effective ways. So for the time being, as you probably know, there are often humans in the loop here reviewing content. They’re using certain tools to do so. Those tools make them better at their job, and building those tools requires research. So that’s one example I would suggest.

    Another example is that in certain companies that, again, care more about these trust and safety issues, research informs providing users with tools that enable them to control their experience. Whether it’s blocking certain people or removing certain things from their experience or a bunch of other things that we can do. But ultimately, some platforms choose to give people more agency and more control over their experience, and research has heavily informed those sorts of tools. And you end up with an experience that is catered to your needs and what you’re willing to see and what you’d prefer not to see. And I think a final example is that even though a lot of us researchers think of ourselves as mostly informing the user-facing experience (you know, the actual designs that people end up seeing when they use a product, for example, Facebook’s feed), several researchers in our field work more on the back end of things, helping companies sharpen and calibrate their algorithms such that the content that shows up for users makes more sense. We had that at Twitter, now X. We had that at Meta. Most companies that have any sort of recommendation systems and search systems and other such systems, they’re doing a lot of research on what to showcase to users and what sorts of underlying taxonomies make sense and various tagging systems. All sorts of inputs and parameters that go into these models and adjust these models.

    To give an even more specific example, in the realm of AI, we have all sorts of parameters, right? One of those parameters is called temperature, and when you adjust the temperature parameter, it sort of influences how creative versus how fixed by nature the algorithm responds to things, right? Like, how much it kind of thinks out of the box, so to speak, versus not. When you change the temperature of an AI-based tool, that of course influences how people experience it, right? And how they experience maybe how empathetic that experience feels or how aggressive it feels or how insulting it feels and so forth and so forth. And we need a lot of research going into these things to understand how tweaking all of these parameters affects how people perceive these tools, these technologies, these experiences that we’re building. So those are just some of the ways in which research, I think, can inform these topics.
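The temperature parameter Noam mentions can be made concrete: in most language-model APIs it rescales the model’s raw scores (logits) before they are turned into a probability distribution over next tokens. This is a minimal sketch with hypothetical logit values, not any particular company’s implementation:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Turn raw model scores into probabilities.

    A low temperature sharpens the distribution (the model responds in a
    more fixed, deterministic way); a high temperature flattens it (the
    model samples more varied, "creative" outputs).
    """
    scaled = [x / temperature for x in logits]
    peak = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - peak) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores for three candidate next tokens.
logits = [2.0, 1.0, 0.5]

cold = softmax_with_temperature(logits, 0.2)  # concentrates on the top token
hot = softmax_with_temperature(logits, 2.0)   # spreads probability around
```

Sampling from `cold` almost always yields the same token, while sampling from `hot` produces more variety, which is the creative-versus-fixed trade-off described here; how users perceive that difference in tone is exactly the kind of question research can answer.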

    Steve: I don’t know. You used the word health kind of early on here. There’s a quality to the experience that we have with these tools, these platforms, separate from bad actors and abuse and misinformation, disinformation. There are research questions to just set the tone or kind of create the baseline experience. It sounds like that’s what the — if you’re working on the feed at Facebook, if you’re thinking about that algorithm, you’re using research to just create a — ideally in the best situation, a healthy versus unhealthy experience. Just — I think there’s research that talks about, oh, when you compare yourself to others, if you see positive messages, you react this way. If you see negative messages, you react this way. So you’re making me realize that asking these kinds of questions around sort of the healthfulness of the experience, I think I locked in on sort of malicious behavior, bad actors, exploitation and so on. But I think I’m hearing from you that there’s just a baseline to it, like what’s it like to go on — you know, I mean, what’s it like to go on LinkedIn every day when people are being laid off or when people are trying to get your attention or when people are performing, as people do on all these platforms. There is an experience that research can help understand and inform fine-tuning of algorithms and sort of what’s shown to people and how in order to create the desired experience.

    Noam: Absolutely. Trust and safety is an incredibly complex space. It’s very layered. To your point, you can create more trustworthy and safe experiences if you stop bad actors from even entering the experience in the first place. And again, at X, and I imagine other companies as well, part of what we did on the research side was inform things like account security.

    And how do we help people secure their accounts and how do we make it harder for bad actors to open accounts even though their intentions are malicious? So you can create more trustworthy platforms by stopping bad actors at that stage. And then there are more lines of defense, which again, research can inform each and every one of those lines of defense to make sure that the ultimate, the end experience for each and every user is a healthy one, is a trustworthy one. And I really don’t think there’s any stage where research can’t have incredibly meaningful impact. And I just really, really hope that as we lean in even more to AI and other incredibly advanced and complex systems that to many of us are a weird and wonderful black box that we simply do not understand. I really hope that we increase our investment in trust and safety exponentially because if we don’t, I really think the results will be horrific. And it’s our responsibility as insight gathering functions, as researchers, whatever you want to call it, to take ownership of this, to advocate for this and to make sure we’re doing this in a way that matches the incredible evolution and development of these platforms. It’s just incredible to witness.

    Steve: If we were to go into the future and write the history of, I guess I’ll just call it trust and safety user research, what era are we in right now for that as a practice or an adoption?

    Noam: That’s a tough one. That’s a tough question. I think the only response I have for you now, but let’s talk again in four years or so, is that trust and safety in a sense, you mentioned bad actors earlier. Trust and safety in a sense is always this ongoing battle between forces of good, so to speak, and the forces of evil trying to catch up to each other and match each other’s capabilities and then beat the other side with even better capabilities. I think what we’re witnessing now with AI-based systems is that the pace of innovation, the pace at which they are evolving and learning, is shocking and hard to comprehend. It’s really, really hard to comprehend. Now, that’s not to say that it’s not going to take a long time before some of these systems are fully incorporated in our lives.

    We’ve been talking about self-driving cars for a very long time, and they are absolutely out there right now in the streets of San Francisco and maybe Phoenix, Arizona, and maybe a few other cities doing their thing and learning how to do their thing. But I think it’s going to be quite a while before every single vehicle on the road is a self-driving car. But that said, these systems are just getting more and more complicated. I think our ability to understand them, it’s getting very difficult. We have to figure out what tools we need to develop in order to catch up, in order for the forces of good, so to speak, to match the forces of evil. And we also need to remember that everything these systems are learning, they’re learning from us. And sadly, human history is riddled with terrible acts and a long list of biases and isms, racism and sexism and ageism and everything else. So these AI systems are sadly learning a lot of bad things from us and implementing them, so to speak. So again, we have a great responsibility to be better because a little bit similar to a child, AI systems are learning from what we are generating. So we kind of have to be a role model to AI and we have to make sure that we’re leveraging AI, maybe somewhat ironically, to deal with issues created by the incredible development of this technology. So I hope that sort of answers the question.

    Steve: We know as researchers, right, any answer that doesn’t answer your question reveals the flaw in the question. And my flaw is that I asked you to decouple user research for trust and safety from everything else. And I think you answered in a way that says, hey, this stuff is all connected. The problems, society at large, the technology, and the building of things are all connected and research is a player in that. So, yeah, you gave a bigger picture answer to my attempt to sort of segment things out. I think we’re going to come back to AI in a bit, but I wanted to ask you, in addition to sort of trust and safety that we’ve talked about over the four years and this issue of building responsibly that you’ve highlighted, are there other things that you have seen or observed about our field that you want to use this time to reflect on?

    Noam: Yeah, absolutely. I think, as I mentioned, because this happened to me as well, we’ve seen a large number of tech layoffs, certainly for research teams, but not only, of course. We’ve seen reorgs happen, major reorgs. Because, I mean, reorgs are a reality in tech, everyone who’s worked in tech knows this, but we’ve seen some major reorganizations.

    And in fact, we’ve seen entire research teams shut down, including the example I gave earlier of the team at Twitter, now X. And as part of that, we’ve seen some incredibly thought-provoking articles come out. And I’m sure you’ve read some of these. One of them, from Judd Antin, a former leader at Airbnb who was my skip-level manager, was about how the UX research reckoning is here. Another incredibly interesting article was around the waves of research practice by Dave Hora. Jared Spool wrote an article about how strategic UX research is the next thing. And I think that what all of these articles had in common was some sort of discussion on the value that insight-gathering functions or research functions bring to the table. And you might not be surprised by this, but I have a hot take for you on this that I would be happy to discuss.

    Steve: Bring it on. Hot take.

    Noam: Hot take time.

    Steve: I’m ready.

    Noam: Are you ready for this? So here’s the thing. If we stick to the UX research reckoning framing, I’m a bit of a stickler for words. I believe that the relevant definition of reckoning that Judd meant to reference is the avenging or punishing of past mistakes or misdeeds. So basically, as UX researchers, we made some mistakes, we made some misdeeds, and now we are being punished for it by being laid off. And again, the broader point in that article, I think, is what’s the value we bring as researchers? And I am here to say that although I agree that we’ve made mistakes, everything that’s happened has very little, if anything, to do with value. To do with the value that we bring. And I think it has everything to do with valuation, which is a very different thing. And if I take Meta as an example, Meta was suffering from a tough time as a company, spending a whole lot of money on AR, VR, and other capabilities. The stock was at one of its lowest points in recent years, if not the history of the company. And so Mark Zuckerberg announced a year of efficiency. And part of his idea of efficiency was to lay off about half of the research organization. And we have to ask ourselves, is that because researchers did not bring value to the organization? And again, I would suggest not. I would suggest that these days at Upwork, and in every company I’ve been part of, I’ve seen some incredible value brought forward by researchers. Insights that can make a huge difference to everything from the user experience to the strategy, to use Jared Spool’s and others’ terms.

    But there’s a couple of problems. The first problem is a problem of attribution. How can you calculate the return on investment of research? How do you know and how can you record and document which decisions and which things were influenced by research and which weren’t? If I’m an engineer or designer and I’m working within Jira or Linear or whatever platform you’re using to manage your software development, then I have some sort of ticket. I have some sort of task. I write 10 lines of code. Everyone knows those 10 lines of code are mine, or mine and other people’s. Everyone knows what those lines of code translate into in the experience. And so the ownership of what that experience looks like from design to engineering is clear because it’s clear who made the Figma and it’s clear who wrote the code. And everything is incredibly accurately documented. When it comes to research, when it comes to knowledge, you know, research is circulated in all sorts of ways, right? From Slack channels to presentations to a variety of meetings and one-on-one get-togethers with cross-functional partners. And in all of those meetings and all of those interactions, research is coming through in some way. But it’s incredibly fuzzy and unclear how that translates into impacts on the products. That doesn’t mean research doesn’t have value. That means it’s hard to measure the value.

    And then one more thing that I think is going on, which you probably know very well as one of the most knowledgeable people on the topic of interviews that I know, is what happens when I’m responding to your question? In this case, maybe you have some questions about what insights have we learned? What happens as I’m giving you a response? What are you doing? Make a guess.

    Steve: I’m thinking about my next question.

    Noam: You are thinking about your next question. It’s so hard to avoid that tendency. And I think in many cases, product managers, product leaders, and other cross-functional partners of research are taking in the research, but they’re just thinking about their next question. And to be fair, I think one more thing that’s going on here is that we as researchers do not understand the feeling of being held accountable for certain metrics, and for millions, if not billions, of dollars in revenue that can be moved one way or the other by the quality of what we choose to build and what we choose not to build, and the roadmaps we have, the strategy we have, etc. Usually it’s product leaders who are accountable for that. And we’re not. And so the pressure is on them. And so as they take in our insights, they can’t help but think their own thoughts and think about their vision and maybe ignore certain things that we share. And then business leaders, ultimately, what do they care about? Again, valuation. The stock. That’s just how it works, which is why I said in the beginning that I don’t think this is about value at all. I think it’s about valuation. I think business leaders are optimizing their business for their valuation, for their stock price. They’re not laying off researchers because we didn’t deliver value or because we weren’t strategic enough. They’re laying off researchers, and many other people, because that’s one of a few ways to become more efficient, to look good in front of your shareholders. It’s not such a complicated game. You know, we’re doing this interview a day after a particular company started offering dividends to its shareholders, and that had a very expected effect in the market on that stock. It just went up quite a bit. That’s how the game works. Those are the dynamics of the market.

    And so we’re in the situation where I’m not saying we haven’t made mistakes again. I think Judd, for example, absolutely had a point when he discussed the different levels of research and the fact that we’re making a mistake by looking at usability as some sort of basic, tactical type of research that only junior researchers should do and that we shouldn’t be focused on. And that we should only be looking at higher levels and higher altitudes of research. I couldn’t agree more. I absolutely agree with Judd on that. But this basic premise of research not delivering value I think is incredibly problematic. And I don’t think it’s correct. I don’t think we need to move into some third or fourth or whatever wave of research. I just don’t see that personally. I think many of us have already been in wave one and wave two and wave three of research. We’ve already been doing strategic research. We’ve already been affecting the business level, the product level, the design level. We’ve already been conducting all sorts of research from usability to incredibly foundational, generative research. And I think we’re being very, very hard on ourselves. And I think we need to cut ourselves a little bit of slack. Just a little bit.

    Steve: I mean, I’m all about being kind to ourselves and not blaming ourselves for things that are beyond our control. We’re all susceptible to that, and it’s hard to watch that going on collectively. But when you’re in a situation where there is, I don’t know, a misalignment of values, like you said, value versus valuation. When that misalignment, and that’s my word, not yours, when that exists, we can cut ourselves slack, but that’s not going to change the gap. I don’t want to say to you, well, you just outlined something systemic and deeply rooted, the nature of capitalism, it goes all the way up, so how do we fix that? I think that’s not a fair question, although take a shot if you have a hot take there. Are there mindset changes or incremental steps, things that you’ve seen research teams do, that acknowledge to some extent the difference between “we’re not bringing value” and “their concern is about something else,” and how do we meet them where they’re at?

    Noam: So look, Steve, that’s an incredibly fair question. And I do want to be crisp about the fact that, yes, we need to be doing something. Something needs to change even if I view the problem differently. But before I get to that, just to reiterate, we as researchers know very well that it’s absolutely critical to identify the problem, and to identify the correct problem at the right level. So before I get to what we should do, I just want to highlight that, in my view, some of us have misidentified the problem. And we need to be tackling the actual problem.

    And just to get to that, and to pivot to the second topic that we covered last time and that I want to cover again today: in our original conversation we talked about research methods and how we do research. And even though I believe we’ve brought a lot of value to the organizations we work in as insight-gathering functions, I do absolutely believe that given the broad evolution of the landscape we operate in, we need to rethink how we operate. Not because we haven’t delivered value, but because the ways in which we can deliver value are rapidly changing. And I think we can now extend ourselves.
    And I was very influenced by a book titled Multipliers, not sure if you’ve read it. But the basic idea is that there are employees within any company who are multipliers in the sense that they don’t just do great work, they make everyone else’s work even better. They level up everyone around them, they define incredible opportunities, and they liberate people around them to get to those opportunities and to make the most out of them. They create a certain climate, which is a comfortable climate for innovation, but at the same time an intense climate where a lot of incredible things can happen. Where I’m going with this, and this is probably not surprising to the people listening, is that the era of AI is upon us, and I think it’s incredibly important to acknowledge the ways in which we can extend our work and ourselves with AI tools. So I know that my mind has moved a little bit from methods, so to speak, to leveraging AI to use similar methods but at a scale that we’ve never experienced before and were never able to offer before to our partners.

    Yeah, I mean, I think there are certain paradigms in our industry that are changing and perhaps AI is even eradicating those paradigms and rendering them useless. I mean, if it’s okay, one recent example I have is that we had this paradigm that we need to make a tough choice. We’ve talked about this, you and I, a little bit. We have to make a tough choice between gathering qualitative data at small scales, which can often be okay, by the way, unless you’re developing a very complex product or unless you want to make sure that trust and safety is in the center of everything you do and then maybe you need a little bit more scale and you just couldn’t get it because you didn’t have the people to reach that scale of interviews or qualitative research. Or, of course, the other choice you could make was to gather quantitative data at any scale you like as long as you can afford it, namely by sending out surveys to hundreds or thousands of people. The issue is survey data is shallow data or thin data or whatever you want to call it, whereas I believe it was Sam Ladner who coined the term “thick data” for qualitative data. And sometimes you need that thick data and you need it at a scale that we were never able to reach before. And AI enables you to do that.

    I’ve personally witnessed tools, one of them being Genway, which are completely revolutionizing the way we conduct research. I’ve seen existing research tools, Sprig would be a good example, Lookback, there’s so many incredible tools that have incorporated AI into their workflows. And they are making paradigms like the one I mentioned, this choice between thick and thin data, they’re making them irrelevant, absolutely irrelevant. Which is very interesting to me. And it ties to this idea of multipliers, this idea in this book I love. Because AI research tools, like the ones I mentioned and so many more that we could talk about all day, they enable us, in a sense, to be multipliers. They liberate us, in a sense, to do a lot more than we could ever do before. And hopefully that translates into us enabling our cross-functional partners and the teams we work in to deliver their best thinking and their best work as well. So that’s, I think, where our field is going in a nutshell.

    Steve: Can you describe with maybe a little bit of specificity what a work process or set of work tasks that a researcher might go through where AI tools like the ones you’re describing, like how is that, yeah, what are they doing, what’s kind of coming to them and, you know, what does that process look like that’s AI enabled?

    Noam: I can give a couple of examples. The first example, if I think of a tool like Genway, an interview tool, is that interviewing is tough, as you know well. You’ve written what I consider, and many people in our industry consider, kind of the Bible of interviewing people. No offense to the actual Bible. And as someone who’s written one of the primary guides to how to interview people, I think you appreciate more than others how complex being an interviewer can be. It’s something that you can learn over years and years of training and mentorship and still not nail some pretty critical aspects of interviewing. For example, asking the right, the best, the ideal follow-up question, and actually listening actively to what’s being told to you, rather than thinking about that follow-up question all the time, because listening is what enables you to ask a good follow-up question. Systems like these can train on an unlimited number of past interviews and an unlimited number of texts like your book, and learn from all of that how to conduct the best possible interviews, right? And for these types of abilities, to learn and then apply that learning in an interview situation, I believe it’s fair to say it would be technically impossible for any researcher to achieve that level of learning in a matter of hours, days, or weeks, months at the most, rather than years in the case of a human researcher.

    You know, one of my hot takes, I hope the audience doesn’t kill me for this, is that David Letterman interviewed people for many, many years. And I personally think David Letterman is a horrible interviewer. I never understood why he asked the questions that he did, and everything about his interview style is very, very odd to me. But putting that aside, interviewing is a very complex skill. None of us can really ever witness how other people do it. And we all have to spend years of practice learning how to become better interviewers, which is a deceptively difficult skill to build. And these AI tools are coming in and, at least in theory, can learn all of that shockingly quickly. That’s one example, and I’m very curious to see how the research community responds to these types of tools and uses that, and what issues they do find in the quality of these types of interviews and how they can be improved.

    The second example is that I recall from even my undergrad psychology studies, not to mention my graduate studies, that our ability to hold information in our brain is quite limited. And so even synthesizing five interviews, not to mention 500, because sometimes you need 500, is very, very challenging. If you do five 30-minute in-depth interviews with people, organizing your thoughts and synthesizing those interviews has never been a trivial task. And I think there are a large number of biases and other issues and strange heuristics that we use to synthesize information that might not lead to the optimal outcome, an outcome that’s as objective and as accurate a representation of the entirety of those interviews, and how they interact with each other, as we would want it to be. One particular task that generative AI, and AI in general, is very good at is summarizing and synthesizing information. And especially as we collect more information, that becomes a lot more relevant and even critical, I’d suggest.

    When we entered the big data era, we needed to develop a bunch of tools. So many companies came out of that era building tools that enabled us to analyze, and very easily visualize in beautiful dashboards, what those data are telling us. Now, we can also start collecting qualitative data at unimaginable scales. And not just qualitative or quantitative data, because I think that distinction is going to matter less and less as time goes by, but more importantly, we will become so much closer to the people we serve, our users, our customers. We talked about this in the previous podcast, I think, about diary studies and how diary studies used to be physically sent to people’s mailboxes, right? And so you as a researcher had to plan your study, send out an actual diary, have people log their entries into it, and then they would have to send it back, and then you would have to very manually look into those entries. And obviously that takes a very long time. These days, and especially with the support of AI tools, you can be in touch with the people you serve all the time, as much as you want and as much as they want, and you can both collect data and synthesize data and even communicate those insights at a pace that’s hard to even fathom, for me at least. But it’s very immediate. Can you say very immediate, or can you not modify the word immediate? Is something either immediate or not? Okay, I don’t know. I’m just afraid of my mother and what she might say here about my grammatical choices. But anyway, yeah. But I think that’s what matters the most.

    Steve: We’ll ask her to fast forward over this part.

    Noam: People’s schedules don’t really matter anymore because they can choose to interact with AI, for example, whenever they want to. And it can be in context in real time. And then AI can immediately synthesize those learnings. And it can immediately improve the way it collects insights based on that interaction and all previous interactions. I was thinking about this a lot in this framing of multipliers. Like, who is the multiplier in this context?

    Steve: What does this hold for researchers? You’re describing a really audacious vision for what research will be, and I think it speaks to the point about valuation versus value. But researchers, which currently refers to humans, what’s your vision, or your anticipation, for that?

    Noam: Is it the AI? Is it the researchers? Like, who is multiplying whom? But what I do think is that, well, first of all, I’m a techno-optimist or whatever you want to call it. Even with everything that’s happened, even with all of the tragedies and the negative aspects of technology that we’ve discussed in this conversation and others, I am still at heart a techno-optimist. And so my deep belief is that certainly for the foreseeable future, if not beyond, AI will become a valuable extension of ourselves and our work. And I do believe that even if we don’t have to deliver more value necessarily than we already are, even if our value just goes underappreciated and there’s nothing terribly wrong with how we’ve approached things, I still think that AI will augment our work, will amplify our work, will enable us to really invite the teams we work with to do their best work ever.

    Because AI will help us see opportunities for research that we haven’t seen. It will help us settle a bunch of debates that maybe we’ve struggled to settle before. It will help us to connect with more users, more customers, more clients, whatever you call them, from all over the world in a way which vastly improves how equitably and how inclusively we build technology products, which is something that we’ve struggled with traditionally, if we’re being honest here.

    A very clear example of that is that an AI can speak a bunch of different languages and connect with people across all time zones and languages, and even mimic certain characteristics of a person so they feel more comfortable in that context. So for multiple reasons, I feel like if we’re at all concerned about getting buy-in from our partners, if we’re concerned about the value we bring, the impact we have, I definitely think AI tools can really improve our chances of getting to where we want to be. And I think it’s going to be a very long time, if ever, before these tools replace us as researchers. You know, the reason I chose to be a researcher, the reason I chose to be a psychologist is because of the incredible complexity of the human mind.

    You know, the people listening to this can’t see this, but for my friends who are physicists, for example, I’m holding up my phone right now in front of the camera, and if I drop my phone onto my desk or onto my floor, it’s a very easy calculation for physicists to say how quickly the phone will hit the desk, and what energy there will be when the phone hits the desk, and what the chances are that the phone will break given the speed of its fall. Physics is a beautiful thing, but it’s also a fairly reliable scientific practice.

    You know, there are rules in physics, and they’re fairly clear. And even though physicists, on occasion, like to look down upon people like me, with a PhD and a background in psychology, I think that in many ways, the field of psychology and other fields that deal with the human experience, with the human mind, are so much fuzzier and so much more complex. And I’m saying this because I think many, many other professions will be replaced by some form of AI before researchers ever are. And that’s because of this complexity, this fuzziness that’s hard to capture. I think in many ways, our field is incredibly technical, but in other ways, our field is not technical at all.

    There’s a lot of art to it, and there are all sorts of different aspects to it. It’s a lot easier for an AI to generate a piece of code, or to generate a contract, or to read a mammogram or an MRI and identify something, than to talk to another human being and understand them deeply. I think that’s a lot more complicated. So I’m not too concerned about the research field, but we’ll wait and see, I guess.

    Steve: So as we kind of head towards wrapping up, you know, since I’ve known of you and known you, I’ve always seen you doing different things to be involved with the community of user research. What does that look like for you now?

    Noam: I, like many people in our profession, have definitely been on a journey. And if I’m being honest with myself and the audience, it’s been a very challenging few years, certainly for me. And I know for many others out there, whether it’s COVID and layoffs and a bunch of other personal events and things in life that just happen to us. And that’s part of the reason why I decided, and I know quite a few other people in our community decided to do this, to pursue coaching, among other things. I took a certification in coaching. I think even with my background in psychology, I felt like there was so much more to learn in this realm. I think that it’s always been important for me to support people in our community, and I wanted to do that even better.

    And so one of the things I’m doing these days, to a limited extent, is some coaching, not just for UX researchers or UX professionals, but for people in tech in general. And then I have some thoughts for the near future around sharing some of that coaching and those ideas in other ways. In addition, I’m continuing to teach at a bunch of different institutions and planning to restart some of my teaching on Maven, which is a wonderful platform for learning all sorts of things. I think the general trend in my life and career right now, and maybe this will resonate with people, is that after a definitely challenging few years, it feels like maybe I’m coming out of it a little bit, ready to take on certain other challenges in addition to my role at Upwork, etc. And I know that I really want to be there to support our community in particular as we all go through rather challenging times.

    So I just invite anyone who wants to get in touch to message me on LinkedIn or email me or get in touch in any way that works for you, and I’d be happy to chat and help. And then I also thought, and we’ll see where this goes, but as you can tell, these topics of AI and where our field is going and how it’s evolving mean a lot to me. I’m thinking about them constantly and want to be part of this evolution, if not revolution, in how we work. And so I’d love to have conversations similar to this, whether out in public or privately, around these topics to continue to understand them. And I’m just looking forward to seeing what is next for our industry. I feel like when we spoke a few years ago, we had a solid sense of what was to come. And I think in many ways we discussed things that did end up manifesting in some way or another. But in this conversation, Steve, I don’t know what we’ll be talking about in four years, if you give me the opportunity to talk to you again. And I can’t decide if that’s exciting or incredibly anxiety-provoking. So I don’t know. Why not both? Or if I’m going to be an optimist, or say I’m an optimist, then I’ll choose to be an optimist and say, maybe that’s exciting.

    Maybe it’s exciting that I really don’t know what’s coming down the line. But I do know that I want to thank you again so much for taking the time to do this. It’s always such a pleasure to talk to you. So thank you for giving me the opportunity.

    Steve: That’s my line, man. I’m saying thank you.

    Noam: Well, for me, it’s a special treat. Maybe I can also share with whoever’s listening to this that we did get a chance to meet in person finally, not that long ago. And that was even more of a treat. And I really do hope that our community of researchers can get together more often moving forward and meet up and discuss all of these issues.

    Steve: These are some really encouraging, I think, provocative things to think about and some really positive and encouraging sentiments for everyone. And there will be show notes that go with this podcast as always, and so the stuff that you’ve mentioned, Noam, and ways to get in touch with you, we’ll put that all in there so people can connect with you if by some chance they aren’t already connected with you. So yeah, I’ll flip it back as well and say thank you for taking the time and for thinking so deeply about this and sharing with everybody. It was lovely to have the chance to revisit with you and kind of catch up on some of these topics after a few years. And I look forward to four years from now doing this again, if not sooner.

    Noam: Can’t wait. Adding it to my calendar right now.

    Steve: All right.

    Noam: Cheers, Steve. Have a good rest of the day.

    Steve: Cheers.

    Well, that’s another episode in the can. Thanks for listening. Tell everyone about Dollars to Donuts and give us a review on Apple Podcasts. You can find Dollars to Donuts in most of the places that you find podcasts. Or visit portigal.com/podcast to get all the episodes with show notes and transcripts. Our theme music is by Bruce Todd.

    The post 36. Noam Segal returns first appeared on Portigal Consulting.
    15 February 2024, 3:29 am
  • 37 minutes 26 seconds
    New episodes! New book!

    Today we’ve got a quick program note about new episodes of Dollars to Donuts, an announcement about my new book, and an interview with Steve Portigal.

    Show Links

    Introduction

    Steve Portigal: Hi, everyone, it’s Steve Portigal. The podcast has been sitting quietly for a little while now, I guess, but I’m back today with two quick announcements and a bonus.

    First of all, new episodes are coming. I don’t know exactly when but the wheels are in motion. I’m looking forward to some great conversations about leading user research, and I’m excited to share those with you.

    Second, I have a new book just about to come out. I wrote Interviewing Users in 2013, and it’s become a classic text about user research. Now it’s 2023, 10 years later, and lots of things have changed. So I’ve updated it! The book – Interviewing Users, second edition, will be published on October 17, 2023, and is available NOW for pre-order from Rosenfeld Media, at a 15% discount.

    The bonus is my conversation with Lou Rosenfeld, where we talked broadly about user research, as well as the second edition of Interviewing Users. This interview originally appeared on the Rosenfeld Review Podcast. So welcomed by the sound of the trumpeting elephant, we’ll go there now, and I’ll see you back here before too long with more episodes of Dollars to Donuts.

    The post New episodes! New book! first appeared on Portigal Consulting.
    10 October 2023, 4:10 pm
  • 55 minutes 27 seconds
    35. Danielle Smith of Express Scripts

    This episode of Dollars to Donuts features my interview with Danielle Smith, the Senior Director of Experience Research & Accessibility at Express Scripts.

    Something that I’ve really changed the way I thought about since I’ve been at Express Scripts — we are in the healthcare ecosystem. So the experiences we deliver, if they are not of quality, do have serious repercussions on people’s lives and people’s time. We are ethically bound to measure the user experience from different perspectives before something launches. When we have prototypes or concepts or ideas, we do our due diligence in terms of user experience research, to make sure that the thing that we’re putting out in the world doesn’t just happen to people. – Danielle Smith

    Show Links

    Follow Dollars to Donuts on Twitter and help other people find the podcast by leaving a review on Apple Podcasts.

    Transcript

    Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization.

    Many years after we moved into our house, we finally hung up our art. Sure, we had hung up individual pieces, something that was already in a frame, say. But it was always piecemeal, a nail here, a hook there. And we continued to accumulate meaningful pieces from travels or family events. And we continued to occasionally pull out the box (a moving box, made for a mirror, I believe) and spread out all the various bits and pieces and just generally fantasize about having them up.

    My goal, however, was to have a plan, an intentional way of placing the different posters, prints, photos throughout our home. Every time we would try that, I would get overwhelmed and give up. I tried taking small bites: after seeing homes with big frames and medium photographs, we chose a few photographs from our travels, blew them up, ordered specially cut mat boards, and frames. We mapped out where in our living room these would go; we essentially carved off part of the home and planned the photographs that would go there. I hoped that this would simplify the challenge of where to put the remaining posters, but we found ourselves stuck, still.

    Eventually, we opened up that box and made some hard decisions about what to hang and what to set aside, and then – before too much time could go by – arranged to get everything framed. We were inching closer, but sitting with our posters and prints, all framed, we still couldn’t figure out what to hang where and so (this shouldn’t be surprising) I got overwhelmed and gave up.

    But buried in that frustration and surrender was a recognition of what skills I’m missing – an ability to reorganize visual information spatially in a few different ways, a set of starting principles for grouping, placing, and so on. Surely someone must have this expertise and be offering it as a service?

    It turns out that, yes, there are professional picture-hangers. We called one and booked an appointment. A few weeks later, on the scheduled day, at the expected time, the doorbell rang. We answered the door and before we could finish greeting them, two men were in our foyer, one of them having magically unfolded and gently placed a table (please assume it transformed from a flat briefcase shape to a table with a soft whoop noise).

    The guy with the table handled production. The other one handled creative (and the clients). We showed him all of our framed pieces and talked a very little about what they meant, and we showed him around our house, pointing out areas we were interested in and the few pieces we had already hung. Meanwhile he was riffing constantly, throwing out ideas, getting energized, and delegating to the production guy, who began attaching extra mounting hooks and hanging wire to all of our store bought frames. After a short time, we backed off, and we watched them work.

    The “creative” began moving frames into different rooms, laying them on the floor and trying different combinations. Like a problem-solving algorithm, the solution began to appear, bit by bit. The floors throughout our house began to fill with clusters and arrangements of different prints, both thematic and visual groupings. We were called in for frequent consultations as the plan emerged. Eventually there was a plan for where everything was going to go. This was the piece we could not have done ourselves and in a short time, they had done it.

    Then came the rest of the production. They began hanging up everything. That meant figuring out exactly where each item went, putting in nails in exactly the right spot, using a level, all the details. Especially because so many pictures were clustered, something being slightly off would really show, so perfect execution was key. This was also something we could not have pulled off ourselves.

    The final results were astonishing. Rather than hanging things in a grid, with the top edges aligned and a consistent space between each, they put together a number of clusters where the posters emanated from a central point in an almost-spiral flow. And they chose how to place different prints within that cluster in order to create a kinetic sense, such as having a poster with a bird along the left hand side, with the bird facing to the right, so that the content of the images supported the physical placement on the wall. This was not something we could have even imagined, let alone executed.

    It is immensely gratifying to be in the presence of a highly-skilled individual. When those skills are being deployed for your benefit, it adds another layer of delight. I believe that delight is further enhanced when we ourselves have tried and failed. This story is a reminder to me to seek out these magically-talented individuals and take advantage of what they have to offer, whenever I can.

    And it’s my goal to be a magically talented individual for the people I work with, someone who brings in a level of skill that isn’t achievable without my help. I strive to conduct user research with that kind of finesse. I hope that when I coach and train teams in doing research, I’m helping them see what that magical level of skill looks like and moving them forward on their own path towards achieving it.

    I would love to hear from you about what you are working on and how my expertise can support you in moving your effort forward. Please keep me in mind.

    Now let’s go to my interview with Danielle Smith. She’s the Senior Director of Experience Research & Accessibility at Express Scripts.

    Steve: Well, Danielle, welcome to Dollars to Donuts. Thanks so much for being on the podcast.

    Danielle Smith: Thanks for having me.

    Steve: Why don’t we start with an introduction from you? Say a little bit about what you do and where you work.

    Danielle: My name is Danielle Smith. I am Senior Director of Experience Research and Accessibility at Express Scripts. Express Scripts is actually a Cigna company, but we are a pharmacy benefit provider, so we help with getting your prescription drugs. If you have insurance coverage, we are the company that helps with that. My team measures the quality of the user experience from a few different perspectives. One includes user experience research. The others are our NPS program for our digitally engaged customers, and our data science team and some digital metric reporting also reside with my team.

    Steve: So I want to ask about your team, but I want to just back up slightly. Who are the kinds of people that are having this digital experience with your company?

    Danielle: So our digital experience is primarily geared toward people who have prescription drug coverage through their health insurance provider, or directly if you’re on Medicare Part D, and who are interested in getting home delivery of their maintenance medications. Our website does have kind of general information. If you do have a health plan, you can go in there and look up a medicine, see how much it costs and at which pharmacy. But most of the functionality in our digital tools for consumers is really about the home delivery of medicine.

    Steve: And then you used a phrase to kind of describe what your team does. It’s about measuring quality. Is that right?

    Danielle: Yeah, that’s sort of the way I think of it.

    Steve: Can you pick apart that phrase a little bit? It’s an interesting one, and I’m curious how you think about it.

    Danielle: It’s something that I’ve really thought about, or changed the way I thought about, since I’ve been at Express Scripts. We are in the healthcare ecosystem, so if the experiences we deliver are poor, or are not of quality, they have serious repercussions on people’s lives and people’s time. So when I think about my team, we are almost ethically bound to measure the user experience from different perspectives. Before something launches, while we have prototypes or concepts or ideas, we do our due diligence in terms of user experience research, to make sure that the thing we’re putting out into the world doesn’t just happen to people. We look at its usability, its appropriateness. We’ve really tried to “test,” in air quotes, to do research on everything before it hits the site. In some cases, of course, we can’t do that, and I’ll talk about that in a second. But I would say far above 80% of what hits the site goes through some level of user research. Another aspect of what we do to ensure quality is to make sure it’s accessible. Most of our accessibility work has really been geared toward visual impairments or different visual abilities, because we are the digital team. So my team serves as subject matter experts to help our developers and marketers deliver experiences that are usable, right out of the gate, by people who use screen readers or have low vision. We don’t have a separate experience for folks who may have any sort of different visual abilities, and that was a big point of focus for the first few years of my role here, to really get that going.
    But it’s part of our user experience practice, and it’s in our research team, because we specifically do studies with blind and low-vision users throughout the year, to make sure that even though something’s technically compliant, it is actually usable by folks who use a screen reader. But following good design practice and rolling in those things before you launch still does not guarantee good experiences. So the other parts of my team work to measure how what we build interacts with reality, to see if it does actually generate a good experience and has the high quality we want. There’s the NPS program. I know NPS itself is a problematic metric and has its detractors in the industry, and that’s fine, I’m not here to defend it. But what it does is let us speak in a voice that executives understand, and it gives us the leeway to have an open channel with our users. We send out monthly surveys for one part of the program, and another part of the program allows users to leave feedback directly in our mobile app and on our website and tell us what’s going on. So yeah, it gives us a score, we report the score, executives love the score. But the words people use when they tell us if we’re on the mark or off the mark have been priceless in making sure that things actually work as intended, given reality. And then the other piece is behavioral analytics. We run A/B tests, we instrument the site, and we make sure things are working the way we think they should in terms of people flowing through different funnels or parts of the site, and monitor the experience that way. We work really closely with our product teams to help them understand their metrics, make sure they’re gathering metrics, and help them use and interpret them in the right ways.

    Steve: So when you use the word quality, what does that mean for you?

    Danielle: For me, I’m using quality to be synonymous with a good user experience. And healthcare is so complicated.
    I’m not going to be so arrogant as to say we’re going to delight people. I just want them to be able to get their task accomplished, which is getting their prescriptions delivered or checking the status of their delivery. I want them to get their tasks accomplished with ease. That’s what I mean by quality. Are we not getting in their way? Do we make them feel like they’re part of the process? And do they understand what’s going on? Because it’s complicated enough.

    Steve: You also used the phrase, and I hope I get this right, “working as intended, given reality.”

    Danielle: Oh, yeah. Reality, when you’re talking about healthcare, is something that we cannot ignore. We can’t even formulate all of the scenarios a person might be in. At least, you can’t do that yet. And people’s realities may be, like right now: okay, my doctor’s office is closed, and I need a refill. How do I do that? It’s a pandemic; what’s going on? So we have to make sure that we are listening to user feedback, so we update the website and even our communications, and talk to our friends in the call center, to make sure that we are ready to respond to our users’ reality. Smaller, everyday situations happen too. As you can imagine, if you’re sick, there are some conditions that give you a kind of transient low-vision situation. So if you used the website yesterday, my team wants to make sure you can use the website tomorrow. Just because you’re on this medication or you have this condition should not take away your ability to use certain tools. We also have lots of families, and family situations are fluid. Whoever has access to your account today, you might not want them to have access to your account tomorrow, and this is your health information. So we have to be always ready to listen and react when people’s contexts change, or when their actual reality is way more complicated than we even thought of when we were designing the user interface.

    Steve: Can you say a little more now about the word “measure”? It’s kind of the key verb that you used to describe, overall, what you all are doing.

    Danielle: Sure, and I’m glad you picked up on that, because I don’t want to give people the wrong impression that we don’t do discovery or, you know, the qual. But I am of the belief that the work we do is still measurement. We may not have a tidy metric, we might not have a number around it, but it is still the collecting of data to get at some underlying construct as best we can. I was recently reminded that I came from a grad program that was more quant-leaning, and it’s probably why I grew my team the way I did, including analytics as well as user research and having them live side by side. So it’s not just about things we can put a number on; it is about understanding scenarios and understanding people and listening for how we can resonate with our users. So I don’t want to mischaracterize what we do, but I think of “measure” in the broad sense of the word.

    Steve: I mean, it’s a really powerful frame for what research does, and there are lots of different frames for, if you have to say one word, what is it that we do. “Measure” isn’t one that I have heard all that often, but it’s very compelling, especially when you explain, hey, there are lots of different kinds of measurements. And even when you were talking before about NPS, and part of its value as a way to open up a conversation with other people in the organization, when you say “measure” overall, I imagine that is similar: by framing this around measurement, you are positioning yourself relative to your stakeholders and colleagues, showing the kind of guidance and information that you can bring.

    Danielle: Yeah, and I talk about it that way. I talk about what my team does as “X data,” which sounds very superhero-like, but it’s data about the user experience. And just because I say data does not mean it’s all quantitative. It’s taken a while for our partners, and even some folks on my team, to get on board with this idea. But we need to be able to have some sort of convergence on the user experience. Convergent validity is a goal of most research teams that I’ve been exposed to, where you want analytics to show problems and user research to give you the other side of it, maybe some survey data to give you some sense of the scale of those opinions. And by all being on the same team, we have the ability to be fluid in our methods and speak to the business in a way that allows them to hear us. Sometimes that way is to use NPS or a web analytics metric to get our foot in the door and expand how they view the users by layering on the qualitative. And sometimes it’s vice versa: we have stakeholders who are more interested in qualitative data, and then we overlay some quant to help them understand scale and focus, if we’re even able to get to that level of maturity given where we are in the process.

    Steve: Accessibility is part of the group, and for me it’s new to hear accessibility specifically called out alongside the other roles. How did accessibility end up being part of what you’re focused on?

    Danielle: Oh man, it’s hilarious. It was just a line in my job description. This was literally my first week on the job. I thought it would be similar to the accessibility part of other jobs I’d had before, where, as the usability person, I needed to be aware that accessibility exists, and I’d roll some tests into my normal course of evaluation that helped the accessibility team check their boxes about compliance. At previous jobs, at Dell and larger companies, there were specific hardware and interface requirements that came from the accessibility team and kind of got handed to the usability team, and we just made sure as we went through development that they got rolled in. But I had not done my due diligence looking at the Express Scripts website to understand what that line in my job description actually meant. Back when I joined the company in 2016, they had two different websites. There was a link at the top of the site that said “accessible view,” and it took you to a text-only view of the site. During my first week on the job, my boss explained this to me, and how nobody really wanted that to be the strategy going forward. I was like, oh, so we need to learn how to build accessible websites. She’s like, yeah. Okay, well, let me figure out how to do that. So it was a whole different perspective on the same problem, but it gave us the opportunity to say that we’re going to build a website that works for folks of different abilities, without having a separate site. That was personally offensive to me; as a Black person, having something separate for people who are different really annoyed me. So I wanted to make that go away and have a more inclusive web experience.
    So that became part of our world, because at the time Express Scripts was really open to having a much more modern digital experience. We came up with a plan to spread that awareness and that skillset across the organization. One person on my team really picked up the mantle and dove into accessibility best practices and compliance. She’s not a lawyer, but she built relationships across the organization to support that compliance network, and then dove into doing research with people of different abilities, specifically people who use screen readers, because we wanted to make sure that not only did it pass all the checks, but that it, like I said, was usable. She brought that video and those usability results into the organization, so that people could understand what it meant to be inclusive in that way. And that just really fit with the way we do user research, because a lot of the point of user research is to bring the voice of the user from outside our walls into our walls, to help people inside the company understand the operating environment of people outside the company. Having a focus on accessibility within user research allows us to apply the same principles and practices that folks were kind of used to, but with a different audience, and it just really worked well. I can’t remember the date, but it was at least two years ago that we sunset that separate site. We have one experience, and we have a group of folks within the organization who are really passionate about this; developers help each other out. Under normal circumstances we would actually run usability testing on site at the American Council of the Blind.
    To get more engagement from the community, it turned into a natural extension of the usability team, or, that’s showing my age, the user experience team. Because as designers, as researchers, we are responsible for inclusivity, and accessibility is one part of that. In healthcare especially, you cannot, in all good conscience, exclude that important a population. So we do have a focus on that, and we have it in the name of the group. We want to remind people that this is who we are and what we do, and now we have this culture in my work where I almost don’t have to remind people, but I probably still do.

    Steve: Are there designers who are specifically focused on designing the accessible experiences?

    Danielle: Now, every designer is responsible for it. Like I said, my team and I help make sure that new designers have gone through whatever training modules we have available. Designing for accessibility is rolled into our design language, and it is part of the core competency of every designer, the content team, and the development team, to just build it into everything we do.

    Steve: Can you talk a little bit about some of the ways that your team, which I know covers a lot of different functions, works with other parts of the organization?

    Danielle: So Express Scripts, being a healthcare org, is a little different from a lot of the technology companies I’ve worked with before: the design team lives in IT. There is one big group that handles all of the external- and internal-facing technologies. That’s a little different from what I was used to, but it works out really well here. We work very closely with our technical product owners and partners within the business.

    Steve: How do you plan for what it is that people on your team are going to be working on?

    Danielle: So we are aligned to the technology org. I’d say that once a year we align on our big buckets of work. We know we are going to work on the website, on the member experience side; we’re going to work on the mobile application; and we’re going to work on maybe one or two other properties or experiences. Once we’ve aligned on that across the product team, the design team, and the front-end engineering team, then we each kind of go off to our own teams and look at how much work we can actually do. This is an ongoing area of maturity for us. As a leader of people, it’s something I’ve had to learn to take different approaches to, and I know this wasn’t really your question, but I’m going to talk about it anyway.

    Steve: Go for it.

    Danielle: In research, design, content, the big umbrella of design, we’re all in this field because we have this level of empathy for folks, and we believe we want to design things that make the world better. What I’ve learned as a leader of people, who has to manage resources and organizational requests and put together budgets and plans and stuff, is that it’s hard to figure out how much work we can actually do when no one on the team wants to say no to anything. As a leader, I’ve had to figure out different ways of coaching the team. It’s been very illuminating, because I’m sure I was this person when I was fresh out of school too: I can do it, I can do all this stuff. And then later my work quality suffers, it turns out I’m working on the weekends and at night, and I might start to resent the work I’m doing. Whereas if I had just raised my hand and said, hey, I know I said I could do all these things, but it turns out I can only do one of them, my life would be good.
    So, as a leader of people, I’ve been working really closely with the other leaders on my team to help them understand how much work can be done in a given situation, to evangelize a message of how much more impact we can have if we just focus on one or two things that we know we can achieve, and to say no politely, to start to build the muscle of no. That’s been my own personal growth area as a leader, and as an individual I’m guilty of it too, like I said. But planning the work and where to focus becomes easier as you learn, as a UX professional, to accept the things that you can do, be cognizant and aware of what you can’t do, and communicate that to your leadership and your peers, so that everyone knows what’s possible.

    Steve: You reframed it for me when you describe “no” as “I can’t do it,” not “no” as “I refuse to do it.” Right? There’s a human limitation that we are always taking into account.

    Danielle: Right, there’s a human limitation to everything. And we know that our work takes time. Everyone’s work takes time, if you’re going to do a good job. You know, as a consultant, it’s hard to tell a client no. But you’re not really telling them no; you’re telling them, well, I could do that, but it would suck.

    If you want to do it well, it’s going to take this long. Or we can do fewer things in a shorter amount of time. Having that conversation is a skill that needs to be developed, and it will make you much more successful in the long run.

    Steve: It makes me think that there never needs to be a flat no; it can be, as you just modeled, coming back with options, tradeoffs, consequences. So you’re facilitating something.

    Danielle: Yes, that’s what I mean when I say my personal growth, because the growth is not just saying no. It’s saying, here’s why I don’t think it’s a good idea, or I won’t be able to do these things and here’s why, and here’s how much time I think it would take; let’s engage other options to get this done. It’s not a flat no, and it’s not the “yeah, I’ll do it” while I’m secretly working nights and weekends.

    Steve: When I asked you about accessibility, you went all the way back to your first week on the job and what that was like, and we’re talking a little more now about where things are at right now. I wonder, maybe as a stepping-back question, if you could describe a little more of that arc: what you came into, how your job has evolved over the years that you’ve been there, up to where we’re at now.

    Danielle: When I came in, it was at the beginning of what we called our technology transformation, and part of that was building a design team. There were already a handful of folks, and I was brought in as a director to build a user research team. That meant a lot of executive conversations about the value of user research and building an understanding of how we work. It was a process with lots of quick wins, “quick” in air quotes, because nothing is really quick in healthcare: doing some initial studies that demonstrated the utility of what we do, creating lots of new templates, and getting feedback on presentation style so that we could communicate with value and consistency. And that didn’t take as long as I would have expected, and it wasn’t as hard as I expected. By that I mean, I come from other organizations that had more or less established UX disciplines, and I had been a consultant for a couple of years. What I was expecting coming in was to have to do a lot of selling, honestly, talking about how this is important and we need it. I had to do very little of that, and that was a real surprise. Not to say it was all super easy; it’s just that doing that work did not take up as much of my life as I thought it would, because it was mostly an awareness problem. People just didn’t know that this existed, and once they heard about it and understood the work that we did, they became avid consumers. We got a lot of fans. I did a couple of presentations to senior leadership, up to the Express Scripts president; back in those days it was a lot of conversations about what users are doing. And I also presented to clients a lot. The way that our business works, like I mentioned at the beginning, is that we help with your prescriptions if you have coverage with health insurance.
    So, our clients are really the health insurance plans, or different businesses if they self-fund their health insurance. And because this was a new function within the company, we wanted to make sure our clients understood that we were doing this, so that if I sent out a survey and people reached out to their health plan administrator, it wasn’t news. That was a different thing for me, having all these client conversations. I still do that to an extent, but now our sales team kind of knows about what we do, so they can speak to it, and I get pulled into client conversations on a less frequent basis. But back then I was probably going to St. Louis a couple of times a month to show clients our usability lab and talk to them about what it is that we did. The first couple of years were about usability practice maturity. Then we got a couple of folks on my team to start on analytics, and a data science lead who has taken over the NPS program to clean it up and systematize it. Now I would say we have a healthy analytics practice, where we can put things in place like A/B testing, talk about how that is used in the organization, and become consultants for different analysis questions. So we’ve gone from having little pieces of UX scattered across the organization, a couple of UX people, a couple of folks doing usability testing, and they had started building a website before I got hired, honestly, so they knew they wanted to do this and they had executive buy-in, to really rolling it into product and having it not be an optional check-the-box thing. That really started to happen over the last couple of years, and now it’s almost gotten to be management of demand, like an umbrella shield to keep my team from getting “sideways projects,” as we call them. And my job itself: I started off with a small team as a director.
    Now I’m a senior director, pretty well removed from doing research. Back then, I think I managed a vendor on one project, I was pretty involved with a couple of others, and I might have done a survey or two myself. Now, I joke all the time that I barely even make the PowerPoint slides. I listen to what my team tells me, and I make a path for them to have impact on the organization. So my job has changed from building the competency, demonstrating that we have this competency, and getting buy-in, to making sure that the work we do is impactful. And I do that by clearing barriers, helping identify stakeholders, fixing weird problems under my own cover, and helping my team understand how the business works, and vice versa: helping the business, and new parts of the business, understand how my team works and how they may or may not work with us.

    Steve: What are sideways projects?

    Danielle: Sideways projects are, say, somebody attended a lunch and learn or something, and someone in a different part of the business, let’s say our traditional IT department, makes dashboards for infrastructure monitoring. They’ve heard of my team, they’ve heard about this user research thing, they love the idea, and they contact a person on my team and ask if they can help by running a usability test on their infrastructure monitoring thing. That is a sideways project. I have to make sure that we don’t say yes to those kinds of things. We don’t necessarily say no, because there are resources we can point people to, but we don’t get involved too deeply, just to preserve our bandwidth and focus. And it’s one of those tough ones. It’s not that those things are unworthy; it’s just the human limitation, the team’s ability to do the work. If you’re doing that, then you’re not doing something else, because everybody is busy. If we do have spare cycles, spare bandwidth, then we do consider those kinds of projects, but we can’t let the team get pulled sideways.

    Steve: So you’ve talked about the healthcare ecosystem, and I assume, when you call it an ecosystem, that means you’re somewhat outside. You’re not a care provider, for example. But I’m wondering if you’re impacted by regulatory concerns as part of your role in that ecosystem.

    Danielle: Yes, we are. And that was another thing that surprised me during my first week on the job. I have an academic background doing user research, or research like this, and I’ve also been in industry for a good 10, 15 years. So I understood, and expected, that we would have NDAs to put in place with our research participants, that we would have to get informed consent, and that we’d have certain ethical practices about letting people participate and back out, things that I’ve come to expect as second nature. What I did not understand was that healthcare is healthcare. There are quite a few laws that come into play about how you can talk about sensitive health information, and I had to get very friendly with our attorneys. I won’t say all of them, but quite a few. In my first week, maybe, I got introduced to one of the attorneys, and that kicked off a series of meetings of me speaking with our attorneys; there are several different legal departments inside a healthcare organization. I also found I had to educate a few of them about what user research is, how we can use the data, and who we can and cannot talk to. What’s very important as a user researcher in healthcare is that you are not soliciting information about somebody’s health condition; it’s the context in which they’re using our service.
    I learned this in my first couple of months on the job, and I now have to teach it to new researchers on the team: we do have to do very specific things to protect the data that people share with us. As you can imagine, if you are in a usability session about using a prototype to refill a prescription, we have to make sure that none of that data is real. It’s a prototype, so we’re not pulling your health records to build it. But as you give feedback, you might start talking about what happened last time, and start talking about your specific prescription drugs, and my researchers have to redirect you. The reason why is that that is protected health information, and we have to be sensitive to it. And it’s just something I never really thought about, coming from academia and coming from industry. A common thing we do, and I’m sure people still do, is guerrilla hallway research, where we just grab somebody and ask them to give us feedback or do an interview about how they might manage a certain situation at home. Well, you do that in healthcare, and you’re talking to an employee, and they might, as part of the usability evaluation or interview, reveal some of their health conditions, and we have observers present. That might be a breach of their privacy. So, again, we have to be careful about who observes sessions, who has access to recordings, and how we anonymize research participant grids even when we send them out to observers. All of these things I just didn’t even think of before I got here, and I spent a lot of time with legal, I’ll use that broad term, with legal, for a couple of years, to work out the boundaries.
    So what we can and can’t do is worked out now, so that we can do research more quickly. But in the beginning, it was more ad hoc: let’s make sure we can recruit these people; let’s make sure that we are compensating them within fair market guidelines and not beyond; and let’s have it all documented in the appropriate place, because there are laws around that too. It’s just an awareness of the regulatory environment we’re in and how it impacts research that was such a surprise. The same thing applies for analytics. There are certain ways you cannot look at the data, because it is not directly relevant to design or because it uses PHI, protected health information.

    Steve: Maybe we can switch gears slightly here. It would be great to hear you talk about, as you grow the team, what kinds of things you look for in user researchers.

    Danielle: I’ve grown the team in different ways, and I look for a few different things; some things are the same. We do look for people who have some sort of demonstrated expertise, either in user research or in analytics and data science, where you can talk about your craft in a cohesive, coherent way. And communication, again, is consistent no matter what: we do want to know that you can present data. One of the more intangible things we look for, and I can’t say that I have a perfect way of doing this, is patience. One of the things I think I’ve alluded to is that in healthcare we have a lot of rules to follow. Some things we cannot change very quickly, but we can work toward changing them, and it may take a long time. That can be challenging for some folks, especially if they come from an environment that was very fast-moving. I’ve worked at places where I was in charge of changing the UI, so I could literally open a source file, change the words in the application myself, and just publish it and be done with it. Here we have several rounds of review, to make sure that what we’re saying is legal, for lack of a better term. So that’s something I do look for: we don’t want to bring in folks who will just become really frustrated by how slow things move. On the other side, we also don’t want folks who are okay with things being slow. It’s a delicate balance. We look for people who have the ability to kind of see the long view and make steps toward it, to push boundaries where you can but not burn out doing it. Other skills we look for: honestly, it helps if you are methodologically agnostic. From a user research perspective, that means you’re comfortable doing interviews, some ethnographic methods, and so forth.
Like, you can do a few different methods; you may not be great at all of them. I have met people who have specialized: we only do contextual interviews, that’s what we do. And this team would not be the place for someone like that. The other thing that, again, is more difficult to select for is some level of empowerment. I mentioned it a little earlier: knowing the bounds of what you can do, knowing what we’re trying to achieve, and suggesting studies or suggesting data sets that may help with making the decisions to get us there, instead of waiting to be asked to do a thing, instead of being an order taker. Of course there’s collaboration, where you sometimes get requests to do a study, but also having some initiative to be able to see something happening, or see a problem, and suggest a way of getting more information to address the problem.

    Steve: How do you assess whether someone demonstrates patience, but not too much patience, or empowerment?

    Danielle: It’s all been in conversations and interviews, and in descriptions of your past work. I know it’s very subjective, and as I said, I’m not really saying I’m great at it, and I would love a better way. But we do try to ask typical questions like: what was your involvement in projects? How did it start? How did it end? What happened with the findings? And we just listen to what people talk about. That has helped a lot in knowing whether you have a demonstrated history of taking initiative or not. And if you don’t, it’s not necessarily a disqualification, but we do try to ask questions to get at whether you’re interested in taking initiative and being empowered to do certain things. I have met people who are not interested; they really like getting requests for a specific thing and executing against it. We try to get at that in conversation, in different interview settings.

    Steve: I want to go back. You talked about some of the kinds of places you’ve worked, and you talked about getting a PhD. I’m wondering if you could talk about whether there were some decision points. How did you decide to pursue your graduate degree, and when did you learn about user research?

    Danielle: From a very young age, I wanted to be a doctor, so bear with me. When I was a kid I wanted to be a doctor, so I went to a high school for health professions; I started down that path at a very early age. When I was in high school, I met a psychologist. It was a difficult time, actually; one of my classmates committed suicide, and so the school district brought in psychologists. At that time I had been in hospitals doing rotations, as a teenager, so take that for what it is, but I found the interactions with this psychologist much more interesting and helpful than the other practices I’d been exposed to. So I shifted focus to becoming a psychologist, in high school. That makes it sound like I had it all figured out. But anyway, I did that in high school, and in undergrad I worked as a clinical psychologist’s research assistant. What I didn’t like about clinical psychology was that, even though they had interesting problems, that experience was not very interesting; it was kind of boring, I can’t think of a better way to say it. Then, in my research methods lab class, the TA asked if I wanted to be a research assistant for an applied cognitive psychology professor. I said yes and started down the path of doing research on eyewitness testimony, and that was super interesting; I spent my last two, two and a half years of my degree doing that.
My last year, my senior year of college, I finally got to ask the grad students I was working with how much cognitive psychologists make. This was in the late 90s, and one of them told me something like $35,000, as a professor. I kind of freaked out, because my mom did not work all these jobs to put me through college for me to come out and make about as much money as she did. I think it might have taken me a week to go back to my professor and tell him I was probably going to have to stop doing my research assistantship, because I was switching my major back to pre-med; I was just going to become a doctor, because I needed to make more money than that. And he was like, whoa, let me set up some meetings for you with other professors in the department, to tell you about other branches of psychology that are more profitable than becoming a professor of applied cognitive psychology. I was like, okay, well, you’ve got a couple of weeks until I have to go to my counselor and change my information. And that’s how I found out about human factors and industrial-organizational psych. I met with a couple of professors who told me about these other branches, took a couple of courses, and applied to I/O programs and human-computer interaction and human factors programs for grad school; I was on the fence. I ended up going to Rice, which had a combined I/O and human factors department, and really loved it. I had taken a couple of classes in undergrad, like I said, but they didn’t really have a human factors class; I had just read about it, so I was kind of on the fence about I/O psych. But Rice had a combined department, so it was like, okay, I’ve figured out grad school. I took my first human factors class and it was like, oh, this is my jam. This is my thing: technology applied to humans, helping design technology to be more suitable for humans. Super interesting; I feel like I can help people, and be able to pay my student loans.
So, that’s how I ended up in the field, or sticking with the field. When I was in grad school I did a couple of internships. The one that really sealed the deal for me was working in the habitability and human factors department at Johnson Space Center in Houston, as a contractor with Lockheed, but working in the human factors department at NASA. I helped with habitability evaluations of physical products, instructions, websites, Space Station habitat layouts, all kinds of different stuff, and I did a little bit of design too. That was the thing that convinced me I had made the right choice, because it was so interesting, and I could nerd out with all these people, other human factors folks, and it was meaningful work. I ended up leaving NASA because I was taking too long to graduate, so I had to finish my dissertation and graduate. Around that time, it was 2006, 2007, you started seeing user experience jobs show up, called that instead of human factors engineer or usability engineer. I ended up taking my first post-graduation job at Dell; actually, I think I was a usability engineer for a while there. My job title changed later to be a UX title. So that’s how I ended up here: a series of lucky events, and unlucky events, but it’s not something I knew about early on in undergrad.

    Steve: What was the topic of your dissertation?

    Danielle: Oh man, warning label design. I was a risk perception and safety person, and my dissertation was about how to design warning labels for over-the-counter products: how do you warn somebody away from something that they really, really want to do, and make them want to read the warning?

    Steve: So, from this period that you’re describing, the different programs and the different disciplines that you were discovering, and maybe these internships, do you feel like any of that, maybe not in the literal sense, is present in the work that you’re doing now?

    Danielle: Definitely. With my experience in terms of different methods, I had a lot of exposure early on to very complicated statistics; I took a lot of stats classes, and my dissertation was a mix of ethnography, interviews, and lots of regressions. So where I am now is a joining of all those things. I’m familiar with offline experiences and familiar with online experiences. At NASA we had very extreme environments, where you need to design interactions that are flexible for different scenarios and different physical abilities; those are some things I learned there that come into play now. My experience doing usability testing, different styles of interviews, scale construction, all that stuff I did in undergrad, the stuff we did at NASA, at PayPal, all the different environments I’ve been exposed to, all come into play here, where I’m trying to help different types of researchers understand working in an agile environment, even though it might not feel natural. Understanding those principles, and how we can consistently work to get better, is all relevant. And one thing that I consistently kick myself for is dropping the academic side, the things that I learned in school that are still relevant today, that I dropped and tried to distance myself from early in my career. Mostly for appearances, because there is this idea that, oh, you don’t want to be too academic, you don’t want folks to think that research takes too much time or that you need to do all this math, so you just drop all that. And I kick myself for that, because I had to relearn it in this role.
I relearned a lot of the stats that I had dropped, and the research design principles that I used a lot earlier, at NASA and in grad school, because now we have so much quantitative data at our disposal. A lot of us do come from a background like I do, where you did learn how to do a lot of statistics and to think about the problem from a measurement perspective, in terms of chunking up a problem: okay, this part should be qualitative, these are parts to answer quantitatively, these parts we can probably ignore for a while until we get the first two chunks down. That level of thinking has been invaluable in managing a team like this, where we have to chunk up problems and try to answer things as much as we can. It’s a useful skill, but I do wish I had kept and protected it more early in my career. Now I get to bring it back, and it’s really, really valuable.

    Steve: Maybe just building on that a little bit: if you look at the profession, maybe your own organization or maybe the collective profession of user research, are there things that you would hope to see in terms of evolution or growth in the practice of user research, maybe over the next few years?

    Danielle: Yes, there are a lot of things that I would like to see, and I hadn’t really thought about this before, so my answers are not in any sort of priority order, just top of mind. I know that everything is Software as a Service, but some of us work in a world that’s highly regulated. So for me, from a tools and methods perspective, I really miss being able to run, or having my team be able to run, usability sessions and record things locally, and to do analysis using local tools. I know it’s a crazy desire to tap into, but as a profession we need a wide variety of tools, and I feel us gravitating towards the same solutions, which don’t really work very well for some of the hairier parts of human existence, and I am going to go ahead and say that healthcare is one of them. If you think about the things that we have to design for, there are parts of our lives where people are just trying: trying to figure out your healthcare situation, or trying to navigate some aspect of the government. It’s just really hard. And so those are opportunities for user experience professionals to make a difference. Another area I’d like to see, and I believe your other guests have mentioned this too, is a maturation in methods. We don’t see a lot of time spent on coming up with different approaches, and I don’t see a lot of meta-analysis, and that term tends to be loaded too, but whether you look at it in breadth or even in qualitative usability, I would really like to see us advance the ways that we look across time and across multiple iterations to pull out different insights and patterns that we should be looking into. Another one is on the other side: I’d really like to see more of a user focus when it comes to data science, and this is not an Express Scripts statement, just in general. You see a lot of automation put in place, like bots to help with,
whatever you want, fill in the blank: you think you’re in an IM with a person, and it’s a bot; you call in and think you’re going to talk to a person, and it’s a bot. There are lots of applications of data science, like user-facing automation, that we put in place to solve a problem, but these have been put in place to solve business problems. So, in terms of user research, I feel like we can add a tremendous amount of value in that space, to help bring that kind of work along in terms of solving real problems and real tasks. And to do that, you need to learn some of that world. I will say: learn the methods of that world. Learn how it works at a high enough level to speak intelligently to user-centered design in that context.

    Steve: Those are great. Is there anything that we should have talked about today that we didn’t get to? I want to make sure we cover everything.

    Danielle: I think we covered it, but I’m always up for talking about the broad umbrella of experience research, the way that my team is made up, and the way that I think research is part of our ethical responsibilities. I think that has come across in this conversation, but like I said earlier, we really want to make sure that it’s not just happening to people, that they are part of defining it, with their voices considered as part of a solution, and that it’s continually refined. And I do feel like in some scenarios user experience professionals can be laser focused on a few screens within the UI: what does this mean in the context of your users’ reality? I know that can seem kind of big and overblown, but you can have an approach that chunks it into reasonable, solvable problems that you can then work on with other folks in the organization.

    Steve: I really like how you brought us full circle from where we started at the beginning to where we ended up. Thank you for the conversation today and for being on the podcast. It’s been really excellent to get to speak with you.

    Danielle: Oh, great! And I just realized that my husband decided to go chop some wood while we were talking. But, okay. I think we’re having barbecue this weekend.

    Steve: Great, well, thanks again!

    Danielle: Thanks, Steve.

    The post 35. Danielle Smith of Express Scripts first appeared on Portigal Consulting.
    11 May 2020, 1:53 am
  • 59 minutes 41 seconds
    34. Amber Lindholm of Duo Security

    This episode of Dollars to Donuts features my interview with Amber Lindholm, the Head of User Research at Duo Security.

    That’s the sign of a really good researcher – it can never be just about research for research’s sake, like this is a cool project, this is a neat thing, I really wanna go in-depth and understand perceptions of XYZ with these people, if you don’t have that ability to understand the organizational and business contexts and the types of decisions that are having to be made every day by the rest of the folks in your organization, your research isn’t going to have an impact. – Amber Lindholm

    Show Links

    Follow Dollars to Donuts on Twitter and help other people find the podcast by leaving a review on Apple Podcasts.

    Transcript

    Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with the people who lead user research in their organization.

    I read recently about a new genre of TikTok videos that featured people applying makeup, while lip syncing to standup comedy routines by John Mulaney. I believe that TikTok has its roots as a platform for lip-sync performances, and of course makeup tutorials and demonstrations are their own thing on the Internet. But how do we end up with the combination, and not just one but a whole series? How did this come about, why are people doing it, and are there other niche sub-genres or patterns that this relates to? Even if we are tempted to dismiss these behaviors as just a bunch of people being weird, we need to be doing the research to understand how, and why, this is happening.

    Sure, people at TikTok should know how people are using their service. But insight about this also is valuable to other platforms like YouTube and Twitter and Instagram. How could this information give you a new perspective on user behaviors if you work at Dropbox? Or if you work for Michelin, either their travel department or their tire division? How could Nationwide Insurance make use of this?

    Our culture swerves and leaps and when these emergent behaviors poke their head up through into the mainstream, it’s an invitation to take note, and to be curious.

    This is what clients hire me to do, whether it’s to be the one that leads the investigation, to unlock the motivations and desires of current and prospective customers, or to be someone who helps their team as they themselves dig into the hidden behaviors and objectives of users. I help companies look at their own organization, their culture, and their processes to help set the stage for this kind of discovery work to flourish and have impact.

    If you’re a fan of this podcast, then remember you can support the podcast by supporting me and my practice. Please reach out to me at Portigal dot com and let’s find a way to work together.

    Okay it’s time for my interview with Amber Lindholm, she’s the head of Design Research at Duo Security.

    Steve: Well Amber, thank you for being on dollars to donuts. It’s really great to get to speak with you.

    Amber Lindholm: Hey, Steve, it’s great to be here.

    Steve: Hey, I guess I should have said, hey, hey, Amber. So I’m gonna ask you to introduce yourself.

    Amber: So I’m Amber Lindholm. I’m currently the head of design research at Duo Security, which is part of Cisco. And my background is in design, both kind of traditional print design, moving into interaction design, and then design research, which I practiced before, you know, moving into more leadership positions.

    Steve: What does Duo Security do?

    Amber: At Duo Security, we provide various security products that help protect our customers’ access to data. Our core product is a multi-factor authentication product. So when you’re going to, let’s say, log into a tool that you use at work, after you enter your password, you get some sort of, maybe a push notification or something, that verifies that it’s you. So we help protect organizations and their data, as well as individuals.

    Steve: So if I am a consumer and I go to my bank and they said, oh, we’re going to send you a text message to prove that you’re you before we can let you access your account, is that the kind of tools that you’re providing?

    Amber: Yeah, that’s a great example; a lot of folks are familiar with us through financial institutions. We’ve basically created a product that’s super easy for organizations to roll out and get folks enrolled in, and it does that by verifying through a second method that you are actually who you say you are.

    Steve: Eventually let’s get to talking more about the work that you’re doing at Duo Security, but since you said a little bit about your background, it would be great to hear more. You mentioned print design. What’s the arc or the history for you, from when you first figured out your profession to how you moved to where you are now?

    Amber: So I started out, as I was mentioning, in graphic design. I went to the University of Illinois, where we did what I would call a more classical, Bauhaus educational style, learning all about formal typography and form and color. After school, I followed that path: I found work with a PR agency at first, and then in-house at a nonprofit called Rotary International, up in Chicago. I spent my time designing brochures, reports, invitations, billboards, all kinds of print materials. During that time, I remember this particular project I was working on where we translated most of our materials into nine languages, so it was going out around the world. I was creating these little packets with a CD that had files for the different Rotary clubs to produce their own marketing materials. I worked with the translators to translate the copy in the files and the file names, but after those things were sent out into the world, I didn’t know if they were going to be used properly, or if they were going to meet people’s needs or expectations. At that time, I started searching around and trying to think, you know, I feel like something’s missing. And I came across, the terminology then was Human Centered Design, and it was just a revelation for me. It seemed like, you know, you can create these things, but if you don’t understand how people are going to use them and what they need, you’re really gonna miss the mark. So that was a catalyst for me to continue my education, and I found a grad program at the Institute of Design in Chicago, where I went and learned how to do research.
They taught all about human factors, all kinds of approaches pulled from the social sciences, very qualitative in nature. I learned about design strategy, I learned more about interaction design, and I really felt like I’d found my sweet spot. I loved the research side of it. I loved going out and talking to people, and trying to understand who they are and how they behave and why. Right before I graduated, I did a couple of things. One was that I wanted to do a little bit of living abroad, so I went for a year to New Zealand and worked inside the government there, on what was more like a service design, customer experience project. That was really great, because it was neat to get that government experience. I came back, and my husband and I moved to Austin, just sight unseen. I’ve had a couple of different jobs here in Austin. I started out at frog, and I was still mostly focused on design, doing interaction design, like mobile and tablet design, but then I started adding in research projects, and that really flourished there; I loved doing that work. I got to do some research in China and other places, and this was more contextual work, going into people’s homes, spending a lot of time with them to understand their world. A couple of other things I did after that: I went and worked in healthcare for a year, doing similar kinds of research; for instance, I did a project about the new parent experience. And then I went back to consulting, to a consultancy called Project 202. So I spent quite a few years in more of a consultant role, working across many industries, which was fun, and I started leading a team there, the research and insights team. We would work on tons of projects, looking at really large customer journeys.
And then, after consulting for so long, I decided that I wanted to really understand the other side, so I went more in-house, to work at a product company. I worked at Atlassian and led a design and research team for a product called Stride, which is a video and chat tool. And then I moved to Duo, where I’m at now. Again, I lead the research team there. I love helping our team grow, and I love the side of research that really has an impact and provides value. So that’s kind of my story.

    Steve: There’s lots to talk about today. There are two sort of vectors that I hear: one is in-house and consulting as different perspectives, but you’ve also talked about research and design. It sounds like over the course of your career your title, at least, has shifted, and some of the work you’re doing might lean more heavily towards one or the other. It’s hard to ask the question in as open-ended a way as I want to, so I’ll just try the leading version. We think of research and design like they’re different words; we call them different things; sometimes they’re different career paths. But one of the threads of your narrative that is compelling to me is that you’ve worked with both of those pieces, research and design. I’m wondering, when you think about, for yourself at least, where you’re contributing, where you’re adding value, or how you put all these pieces together: what’s the relationship between the R word, research, and the D word, design, in your makeup and your skill set?

    Amber: I think they really complement each other. Having that research mind made me a stronger designer, and as a researcher, it made me stronger because I can really make sure that the research I’m doing is framed around, and presented in, a way that provides information relevant to the folks having to make decisions. So I always think about that. When we’re trying to figure out what to research and what to prioritize, there’s always a whole set of questions, and there are decisions that are having to be made. Some are more ambiguous questions where you really don’t have a lot of information; some are really high risk questions. So you want to make sure you’re understanding those and providing that information, because people are having to make big decisions and tactical decisions, and thinking about how you frame that research and present it back to folks, or include them in that process, is really important. So I think understanding how designers work, and also, you know, now knowing how product companies work, understanding the questions that product managers and engineers are trying to answer, and helping them actually articulate those questions, is something that I strive to do. So I think they just really complement each other.

    Steve: In the way you describe how you want to make sure that the research you do connects and supports, I hear an echo of your brochure project, where you wanted that context to make sure that you knew the work you were doing was going to have the outcomes and the results. It sounds very similar, that kind of questioning of, hey, who’s doing what, and how do we make sure that we help them do what they’re going to do? That applies to your quote-unquote users as well as your quote-unquote stakeholders or colleagues?

    Amber: Yeah, absolutely. We spend a lot of time, when I’m working with the researchers on my team and we’re looking at a new project, or looking across the projects that we could support over a quarter, whatever it is, looking exactly at that: where can we have the most impact? Where’s the greatest need? Sometimes we’re looking at: is there a new project spinning up, is there a new product manager coming in, is there a team that’s just starting out that’s never gone through this process before? We really like to prioritize those kinds of things as well, so that we help get them off on the right foot, so that they do frame those questions really well, and we help show them a good process to follow. Those are really, really fruitful projects.

    Steve: What’s the split between proactive you identifying what research you want to do versus something that’s more reactive, where there’s requests coming in? Hey, can you help us do this?

    Amber: That’s interesting to try to suss out. It’s a little bit hard, because at Duo we’re part of the product design team; there are about 30 folks on that team. On the research team, I have four researchers and a research coordinator who helps with all of our recruiting and scheduling and all kinds of stuff. None of the researchers are dedicated to a particular product, because Duo does have multiple products. We stay as a little bit of a hub, but we start to embed, like I was saying: okay, let’s embed over here on this team, let’s embed over there on that team, for this particular project or this time span. So the researchers are communicating with the designers, and I’m communicating with the design managers and product managers to learn what their roadmap looks like or what kinds of questions they have coming up. I end up really driving the prioritization of the work we’re doing based on all those conversations, looking at the priorities of the business and the larger trajectory of where the business is going, and making the decisions about where we can actually prioritize our time and where we’re going to have the most impact. But we do also get requests. We work with folks outside of the R&D process as well: we’ve been working with our creative team on website design, and we’ve been working with technical communications folks in our knowledge community, so those people will come to us with requests. We hold office hours every Monday, so sometimes we’ll just provide some support or advice or review things, and other times we’ll decide to take on a full project.

    Steve: Can you maybe rewind a little bit and talk about how you came to Duo? What was research like there, what was the context that was established when you joined, and what has some of the progress and evolution been since you’ve been leading the team?

    Amber: So do I guess the design team was formed about five years ago. And so Sally Carson’s, our head of design, so she was hired to come in and, you know, build up this team. And really early on, she wanted to make sure that research was part of it. So one of the maybe the second or third hire was a researcher, and His name’s Mark Thompson Kohler. He’s still with the team. And he came in and at that point, because design was so new and research with some new, the focus was, you know, on really rapid, like usability testing, right. So like, every two weeks, we’re going to do testing. We’ve got people, you know, what do you want to test and so the company was a startup, it was a lot smaller. And the idea was just sort of this like rapid, more evaluative type research. And that went on for a while. And then The team, Mark developed a set of personas that got embedded across the entire dual organization. You know, as a team continue to grow, we’re still kind of like a one person research shop, do the way the design team has grown though designers are hired that also have an interest and skillset and research. So designers also do research. When I came in, another researcher had been hired. So there were two researchers, and there wasn’t what you would call like a research team yet. So when I came in, we decided, Okay, let’s sit down and really talk about you know, who we are, what’s our mission? What’s our value? What are the strengths that we have going for us and the gaps and how do we want to move this team forward? And we did you know, some kind of off site exercises and that kind of thing. And we started at basics. So there was a ton of knowledge. There was a ton of things had been done, and we’re like, let’s make sure like we have it right. So that first quarter was a lot of poetry. 
together in a place where people can find all of the research that's been done, documenting some of the tools and templates we have so that other people, you know, on the team can use this stuff. And then we wanted to make sure that instead of just focusing on the evaluative work, you know, we were allowing the designers to really pick up a lot of that and that we could move into more foundational, generative type research projects. And so that's what we did. After that first quarter, getting all those foundations in place and doing some additional trainings with the design team, the researchers were then able to focus on these larger, more strategic type research projects. And then we added in research ops; we hired a research coordinator to help us, so that the researchers weren't spending, you know, half their time coordinating, recruiting, maintaining our database of participants and all that good stuff. So she came in last summer. Her name's Annie Fan; she's been just a fantastic value to the whole team. And then I hired two more researchers this spring, and now everyone, you know, sort of has their own area that they're working on right now. And it's been great. It's just been fantastic.

    Steve: In what you're describing, I think there's a pattern in a lot of organizations: give the designers the tools to do some of the evaluative work, and, you know, shift what the research team itself specializes in. What's been the outcome for the designers, or the design work that they're doing, as you've empowered them to do that evaluative work?

    Amber: I think it's great. I mean, rather than, you know, us being like, no, we can't help you, and knowing that those questions aren't going to get answered, the designers do have the toolset available. And depending on somebody's comfort level, or, you know, how much they've done this in the past, we'll provide more or less support, right. So, like I said, it's kind of like reviewing the plans, reviewing the script, doing test runs. Maybe we'll moderate the session, and then they pick it up after that. Or some folks, you know, they've done this many times, and they can run it more fully. But they are the experts in sort of this area that they're working on. And they have the relationship with their product manager and their engineers, and they can take all of that, you know, directly back into the work that they're doing. So I think it's great. You know, I always have some hesitancy about the subjectivity that might come out of somebody who's testing, you know, their own work. So we do talk about that, keep an eye out for that. But so far it's been great, and it's allowed our researchers to, like I said, focus on those larger research projects, and not just that: a researcher on my team, Liz, right now is helping kind of consult across six or seven teams for this larger project. And so she has really good line of sight around the different questions being asked and is helping people, you know, make sure that they're all sharing the insights they're learning across those teams. So it's allowed us to really level up our scope of influence.

    Steve: That's a great explanation, and I feel like there's an implicit value judgment around what kind of research is cool and fun and worth our time: if we can offload some of these lower-value tasks to other people, then we can work on the cool stuff. Not that anyone has ever said that. But it feels like that's lurking underneath this move to, you know, empower people to do the evaluative work. But I think what I'm hearing from you is, in a lot of ways, those people are better positioned to do that work, because they're very close to the problem. And they can kind of zoom in on how they can get the most actionable results, and do so maybe more quickly. And that speaks to your point about having influence. So the work that you're doing is more around the organization and more strategic, and that's kind of what you're positioned to do.

    Amber: Yeah. So besides, you know, helping bring those designers together, Liz is helping to align PMs as well, kind of sharing across the teams. And she's running a set of metrics workshops across those teams. She created a standard sort of workshop to help us define what UX metrics look like, look for signals, and define those metrics, and she's bringing those to each of those teams. So now, again, it's a more kind of strategic view across those, but then we're still able to weigh in on and influence the type of research, the type of questions, and even the synthesis of the research that's happening across those teams.

    Steve: What are some other things that you're doing, whether it's around influence or, you know, kind of bringing the customers into the organization?

    Amber: When we set out on our kind of mission and the things that we do, you know, obviously sort of executing and running this kind of product-focused or foundational-focused research is really important. But the other half of it that we wanted to make clear was our role in helping to get folks closer to the customer, to bring that context to them, to keep the customer top of mind. And frankly, it's pretty easy, because the organization across the board is extremely customer-centric. We're not, like, begging for people to listen to us. The value of research is really understood. It's asked for, actually; people are really interested in hearing more about the context. Our PMs, you know, they're fantastic. They're out there, they do tons of customer calls. But still, as we've continued to grow, there's a risk that we identified. A couple of things: early on, I mentioned that there was a set of personas that were created, and these get referred to all the time across the board, in big meetings, you know, big all hands and all of that. Salespeople reference them, our customer success people reference them. But as people join the company, as we were kind of in hyper-growth mode, the real deep understanding of those personas may not have persisted, right. So we wanted to take a step back and say, you know, how do we, as a research team, help other folks have the type of context that we have? So Mark, who developed the personas, runs a program we have set up where, as new hires come in, he presents something called Personas 101, to make sure that those are really top of mind and understood, and we do a deep dive session on one of the more key personas. He also does something called fireside chats, where we just chat with different folks. So for instance, with different IT administrators or help desk managers or CSOs, we'll set up, you know, an hour-long chat to just learn about their day.
And that's an open invitation for folks to listen in on. And then also, the whole of our research sessions and things, they're open for folks to come and listen to. So it's very much accessible. We have a research calendar everyone can see, and they can pop in on different things. And lots of folks take advantage of that. And then I think another thing that we wanted to do was make sure that we had created additional sort of artifacts and frameworks for people to think about when they're making decisions. So we took all this data that had been collected across, you know, about four years of tests and interviews, ran these large workshops, and Mark, you know, synthesized all that information into a set of design principles that are persona-specific. So they're not these really super high-level principles that are hard to make decisions with. They're very applicable to making, you know, decisions on UI, or decisions on copy you might be creating. And so those principles are referenced by, you know, designers; they're used in critique sessions. They're referenced by product managers as they're talking about their roadmaps, and by folks like technical communications and things like that. So the personas and the principles, they're not just artifacts that kind of sit on the table; we bring them to the forefront in a lot of different ways.

    Steve: Could you genericize one of the principles, just to give a sense of the granularity of it?

    Amber: Yeah, so for our IT administrators, for instance, one of the principles is around the ability to take action. And so anytime we're making a decision about something in our UI or a flow, if there's a dead end where that person cannot actually act, let's say they're seeing something that may alert them to a security problem, and they cannot immediately take action on that item, then we're not meeting that principle. It's very important for folks to be able to move from information that they're seeing directly into action. And then for our end users, one of our principles is reduce friction to the extreme. So we're not the kind of product where, you know, you're trying to create a sticky situation where someone's in your product. You're actually trying to minimize their interaction with your product, because what they're actually trying to do is, you know, get into their work application. So we think a lot about that: how do we reduce friction to that extreme?

    Steve: So those are specific, as you said, and meant to be actionable. And in a project like this, how many principles should one be trying to come up with?

    Amber: What a question, Steve! No, I think we narrowed it down to five principles for our two main personas, and that I think is a good number: three is too few, eight is too many, kind of a thing. But five has been like the perfect number. And when we first rolled these out, we did some fun things too, where we created some really neat posters that we put up in all the offices, you know, we socialized them with people. We brought them around to meetings; we personified the principles even, which was really fun. But yeah, I think you can't have too many.

    Steve: How do you personify a principle?

    Amber: So one of the designers on our team, named Andrea, actually came up with this idea to, you know, get designers more familiar with all of these. Each of us was given one of the principles, and you had to basically just act like that principle. So we had to create a character, and in this sort of mock critique, we had to just care about that particular principle. It actually ended up being quite fun to do.

    Steve: Sounds like a good improv game; that would be very fun. Just going back to one of the points you made: this is an organizational culture that believes in research and, as you said, asks for research. Do you have any perspective on how, for Duo, that came to be?

    Amber: I imagine it's a couple of different factors. I mean, I think that for the folks, Doug and Johnno, who co-founded the company, that was like a big thing. They really cared about the end user experience. They cared about creating a company that was different than the other security companies out there at the time. Their whole mission is about democratizing security. So making it easy and effective was there from the beginning. And it wasn't just, you know, something they were saying; it was something they really believed in. They saw how complicated so many security products were, and they wanted to make sure that when these products are out there, and it's, you know, an IT administrator, not somebody with a cybersecurity background or anything, these more generalist folks can protect their organization. So I do think a lot of it came from the founders, and then the folks that they hired. Like our head of engineering, Chester (product design actually reports up through engineering), he's completely supportive of design. You know, we hire engineers that have that kind of mindset too, around the customer experience. So I think it just kind of permeates through the culture, because, as you said, it starts at the beginning and starts at the top. I think so. I mean, that's just my impression. I wasn't there at the beginning, but I see that legacy, and I see it through the people that were hired on, that continue to be in leadership.

    Steve: I have so many interactions, in so many different venues, where somebody who maybe is at an individual contributor level sees that their culture is not like the one that you're describing, and they're being blocked in a lot of different ways. And the question is often, like, well, what can I do to succeed when these things are kind of arrayed against me? I don't know if you've encountered that or heard that question. Personally, I find it challenging to give them super actionable advice when, as you're describing, hey, it starts at the top; that's how this company was founded. You know, they've grown and hired in support of that. And if you're in a situation where that hasn't happened, it's an interesting challenge for people that aren't embedded in those kinds of organizations to make those kinds of changes and advocate for that.

    Amber: Yeah, I think especially, to your point, if you're not at a leadership level, if your scope of influence is just with your immediate project team, it's going to be really hard to sort of level that up through the organization.

    Steve: You've named some of the people on your team and the kinds of contributions they've made. And I think you said you hired two people not that long ago to join the team. Can you describe a little bit about your approach to finding people, what that looks like in terms of saying, hey, this is someone that would fit with our organization?

    Amber: We have probably a fairly standard interview process, but I'll describe that and then sort of what kind of questions I'm asking and looking for. I typically, you know, reach out to folks and have an initial conversation with them, just to understand what they're looking for in an organization, or what their kind of career goals are, just to make sure it aligns with, you know, the kind of role we have open. Then we do a longer interview. And we use, again, pretty standard behavioral interview questions to ask people to describe real scenarios. Behavioral interviewing is kind of like doing user research, or like doing design research, where you're really trying to get somebody to give you very specific real examples, to tell a story, because you want to understand what the situation was, how they thought about it, and how they reacted to it, versus a generalized answer. So we follow that; I put together a set of questions. And then once folks go through that, we do a bit of a portfolio review, like a two-hour thing. And then a final round interview is more like a four-hour thing. And we have designers in there, we have PMs come in, sometimes engineering managers, so different folks who they might, you know, interact with in their day to day participate in that longer kind of final interview. But some of the things that I'm always looking out for: obviously, you know, that they can do the work, but the way that they present the work is really important. So, coming from a consulting background and a design background, I put a lot of emphasis on how people share their work, and how clear it is and how compelling it is. How do they tie that story together? How do they tell high-level insights, but get into enough detail to really support and paint the picture of why that insight matters? So that's a big part of it.
And then with the behavioral interview questions, I'm always looking at, can someone learn from a situation? I don't want to say, like, admit they're wrong, but if they encountered something where it went wrong, how did they go about learning, growing, and kind of improving for the next time? I'm also looking at people who are willing to ask questions. I really love hearing the questions; I always leave a lot of time for folks to ask me questions. So do they have really good questions? What are the kinds of topics they're thinking about? Are they people who are going to come into an organization and be really curious and willing to ask people questions, and not just kind of sit and be scared if they don't know the answer? Trying to think what else. We ask questions, you know, around working in diverse teams, which is really important to us, and just how they collaborate and work with others.

    Steve: Yeah, I want to ask about one of the things you mentioned: telling a story about an insight and, you know, helping it come to life. And you've worked in consulting, so you've experienced some version of this. In some ways, I think I've grappled with this for much of my career: there are things that happen over the course of a project that are galvanizing, that just change everything, but to tell somebody outside that organization, that's a harder story to tell. So I'm wondering, what are the best practices for that? How do I make a story about an experience I had in the past come to life for an audience that didn't have the assumptions, the biases, the worries, you know, all the sort of things that made it so impactful? How do I help somebody else understand that part of the insight?

    Amber: Well, so what you just said is exactly what I would expect: folks that know how to tell a story, up front they give me enough context about the organization or the problem, or the assumptions that were there in the beginning, and what kind of impact their research had, setting the stage of that context. It's not just about the project, right, exactly what you're saying. Every single organization is unique. Every single set of people has different assumptions about things; the challenges that someone might be up against within that organization are going to be different. So a really good researcher is going to help you understand the context of why the thing that they're telling you matters. I think that's the sign of a really good researcher: it can never just be about, you know, research for research's sake, like, this is a cool project, this is a neat thing, I really want to go in depth and understand perceptions of XYZ with these people. If you don't have that ability to understand the organizational and business context and the types of decisions that are having to be made every day by the rest of the folks in your organization, your research isn't going to have an impact. And so I really listen for that in interviews: are people able to help me understand how they made an impact and how they understand that larger ecosystem that they're working in?

    Steve: So back to the archetype I was describing, the person that approaches me, and I think all of us, with: here's why I'm unable to have impact, here's how my culture doesn't support me. If that person comes in to speak to your team and tell stories about the research they've done, how do they highlight the impact of their research when there are forces arrayed against them having that impact?

    Amber: Yeah, I mean, I think that is really challenging, and I have had interviews where folks have revealed some of those more negative challenges. I would say it can sometimes feel like a person puts up all the barriers and is just like, I couldn't really do anything, versus somebody saying, this was the situation, and this is how I tried to take my work and have influence at the level I could influence, right. So, you know, maybe it's getting the one person to shift their perspective and getting them to listen in on the interview. Whatever that small measure of change or influence is, I think focusing on that is better than just saying, you know, there were all these constraints and I couldn't do anything.

    Steve: That's good. I'm going to use all of this, because we're always doing storytelling, and it's good to reflect on how we can most effectively communicate overcoming challenges, having success at different levels, all that context. It's a really good way of putting it. What are you seeing in terms of where people are coming from into research?

    Amber: Maybe it's helpful to describe the folks on my team, just because that's, you know, who I'm interacting with. So one of them was actually in journalism for most of his career before transitioning into user experience research. One of them was working in academia for a while, doing all kinds of different unrelated, you know, research around philosophy and other things, and then transitioned into technology, actually being a technology director for, again, a publication, before transitioning into user research. One was a designer before making a transition. And one of them was actually doing market research for consumer packaged goods companies. So all of them had career changes before coming into the research side.

    Steve: That's interesting. Would you include yourself under that umbrella?

    Amber: Yeah, absolutely.

    Steve: It’s just a fascinating artifact of what makes up our field. Mm hmm.

    Amber: Yeah. And I have an intern coming this summer. She, again, has been in the writing field; she's been a professor, like an English professor, and now she's working on her master's to transition into research. So yeah, a lot of folks, I think, come from that writing, journalism, storytelling background. It's a really neat transition to see.

    Steve: Do you have any theories as to what it is about research that pulls people in from these different backgrounds?

    Amber: I think in that case there's an easy connection, where journalism folks are trying to uncover information, they're trying to understand people, they're trying to tell stories. So that makes sense. I think for other folks, like the woman who was working more in technology, and the woman who was working more in design, and same with me, it's that you find yourself drawn to the beginning of the process, where you're really trying to understand the problem. Or you come into a project where you're delivering something where you didn't understand the people or the context, and having gone through that yourself, you're really drawn to, how do I make sure this doesn't happen again? Or how do I learn more about this? And then people kind of stumble into it sometimes.

    Steve: I've been identifying as a researcher for a long time. And I always felt like, yeah, I stumbled into this. But it felt like that was more common because the profession wasn't defined. We didn't have, let's say, the program in design that you went through; I don't think that existed, and if it did, no one knew about it. So I feel like, oh, I came from this sort of prehistoric era where the field was nascent, and so we had to kind of stumble into it; that's how it was being formed. And it's just fascinating to be many years later, decades later, with programs like yours and so many more, and yet you're characterizing a bunch of people with amazing skills, oriented towards different work, kind of transitioning into this. That's still what's going on. I don't know, maybe that's how careers happen, regardless of what discipline or business you're in. But I guess I always thought it was going to change, and it's just so curious that we still have this happening now in this field.

    Amber: at least from what I can see, like from the, you know, folks on my team and other folks I’ve interviewed recently, I think having that other kind of background or perspective, as gives them an upper hand in some ways, because they bring a whole nother set of skills and perspective into the work they’re doing. I’ve just found, yeah, just the way that people can craft those stories or the experiences that they can share, help build like a bigger picture of just different perspectives, kind of bring them to this work and help them have kind of a point of view around certain things that is very interesting,

    Steve: I don't know if you're saying this explicitly, but I'm thinking, as part of a team, when you have a range of other contexts or other skill sets that people bring in, even better, because you have some real diversity of frameworks and superpowers.

    Amber: Yeah, absolutely. It's awesome, because the team can build on each other; they have these different experiences. And right now, actually, the next kind of skill set I'm hoping to bring in is somebody who has a pretty solid background in mixed methods. So bringing in more of the quantitative: somebody who's done a lot more quantitative work but still really understands the qualitative approach as well, because quantitative research is an area I don't have a lot of expertise in. And so I'm hoping to continue to add to the team in that way.

    Steve: So, a slight topic shift here, pulling in some of the things that you talked about when you were giving the story of the different roles that you'd had. You described at one point, I think it was at Project 202, moving into leading a team, and that being, I think, a defining characteristic of the work you'd been doing then and since. Research leadership is newish, relatively speaking; it was kind of the impetus for me in this podcast, like, hey, here's a new role that we just didn't have if you go back a number of years. Do you have, you know, a perspective on what research leadership looks like? Is it different than other kinds of leadership?

    Amber: Yeah, I don't know that it's different than other kinds of leadership. I think there are all the similar aspects, right? You're trying to set a vision of, you know, who you are and what value you provide. You're working with folks; I love being a manager. I love understanding people's goals and their career paths and the things that they're trying to do, and figuring out, how do we match that with the needs of the organization? How do we help someone grow and get the opportunities that they want, at the same time that we're providing, you know, the value that we need to provide to a company? That's one of my favorite things. And then, as a leader, you need to really, again, understand the business, keep in contact with a lot of different folks, and know how to grow your team. I guess one of the big decisions with research that might be a little bit different, but is similar to things like data science or QA, is that you can have a centralized team or you can have a distributed team. There's something about how you organize yourself and how you work with parts of the organization that the leader really has to have a vision around. And then, again, I would call it principles: what are the values or principles for how your team is going to operate? So one of the things we talk about a lot is that we don't want anyone to have any barriers to interacting with us. So we don't require, you know, somebody to go through a form or put in a request or anything like that. We do office hours, you can ping us on Slack; there are all these different ways that you can interact with us. So I think setting those values for how your group operates is really important.

    Steve: Given that you're someone who's done research at many points in your career, does that influence or help define the way that you approach being a manager for research?

    Amber: That's a good question. I don't know if this is a good example, but a little over a year ago, when we were figuring out what we wanted our goals for the next year to be, this was actually driven by one of the researchers, Liz. She created a survey and did, you know, one-on-one interviews with everybody on the team. So we actually researched the situation in order to determine, let's say, do we need to provide more education around, you know, writing a discussion guide, or something like that. So we used our own approach of research to understand our own situation, come up with ideas, and prioritize them. I guess that skill of listening and talking to people and being able to analyze a lot of information and synthesize it is a skill set that I find valuable every day. The ability to take that synthesis and make a plan, organize it, and prioritize things has been very handy as a leader.

    Steve: So there's an example of superpowers that researchers have, that you have given your career, that influence or support your management work. Do you have other superpowers, or things in your background that maybe we haven't talked about, that you find yourself drawing from?

    Amber: It's a good question. I don't know if it's from my background, or just my personality, or what exactly, but I'd say being able to simplify complexity into really clear communications, whether that's written or, you know, a slide deck or whatever it is, is something that I've honed over time. It's something that, especially being a consultant and then in research, you have to do: how do you take all this information and put something in front of somebody that they're going to be able to remember or respond to, or, you know, take in? So thinking about that. And then I think, too, researchers have to have a plan, right? You're going in to do research; there are a lot of logistical things, and a lot of strategic things too. And I think being able to both think high level and also think, you know, tactically is really important. So, okay, in six months we need to understand the answer to this huge question. How are we going to get there? I think researchers have to figure out how they're going to get to that end goal. Design is the same way, right? You have to break things down into manageable chunks and make a plan. And I think that's a really good skill. Some people have, you know, visions of something they want to get to, but don't have a way to break it down and figure out how to move toward that vision. And I think researchers and designers are skilled in that.

    Steve: It just makes me wonder, you know, if we observed researchers in their off-work lives, what kinds of life skills would we see? What kinds of applications of some of these researcher strengths would we see in the non-work parts of people's lives? Because I think you're right that we've talked about things like storytelling and complexity and planning. Do those things manifest, or do we shut those things off? Or are there other things that come to the fore when we're, quote, on the job?

    Amber: I mean, I guess the other thing I see from a lot of researchers, and love, is just their curiosity in other areas. So, what obscure thing are they off looking at, reading about, learning about, that they then, you know, take something from and bring back into their work? I always find that fascinating. I think that's what I loved about school, you know, grad school, undergrad, where you're learning about all these different subjects, and all of a sudden you find that something about acupuncture and its philosophy, like, applies to a design project that you're doing. So I think, you know, having those curiosities and finding those connections is really important.

    Steve: Is there anything else that you think we should talk about, or that I should ask you about?

    Amber: I can’t think of anything, to be honest. It was really fun to share all those different things. I’m trying to think if there’s anything else.

    Steve: I’ll throw another one at you then. Okay. When you think about whether it’s the field, or you and your team at Duo, what do you look forward to in the future? Where do you think things are heading? Maybe if we say five years, do you have any vision or anticipation for where we might be at that point?

    Amber: Wow, five years is a long time. I find those questions really, really challenging, whether I’m thinking about where my team is at, or just the field in general. You know, there are so many evolutions around different aspects of technology happening right now that I don’t think we quite know how they’re going to play out — around augmented reality and voice-assisted technology and all these things that I don’t think we’ve quite embedded in how we do our work. I don’t know if there are going to be some different ways of us interacting with people. Even thinking about the current climate right now, we’re trying to figure out how to gain contextual information when we cannot be in context with people. So is there a way to get that kind of closeness or realness with people in a way that is different than being on site with someone, or different than, you know, a diary study tool or something like that? I don’t know what that might look like. That’s one thing. I think also, there are a lot of folks who practice mixed methods, but that’s still an area where I feel like we have quite a separation between qualitative and quantitative specialties and haven’t quite figured out how to bridge that gap. So that’s something I want to watch for in the future: how do we really take advantage of lots of data that we have access to that could help us make decisions and learn? Let me think, what else? I don’t know if those are very good answers, but…

    Steve: They’re great answers, because my question was a terrible one. I said, what’s the future going to be? And you said, well, here’s what I’m going to be looking for — which is a good researcher response. Here are the signals I’ll be paying attention to.

    Amber: It’s interesting. I really, I just, I don’t know. I think research is so needed, and I hope that we evolve. There’s kind of, you know, UX research, and then — like my team, we call it design research — there’s research that’s in the service of sort of product UI, versus research that’s looking much more holistically at the customer journey and the customer lifecycle. And there are a lot of areas that have been developed in other countries, like service design in the UK and Australia and New Zealand, that just aren’t quite here in the US. So that’s something I hope for: that researchers are included more, or we get that influence in other areas like public works, like government departments that really impact people’s lives. I think there’s a whole spectrum of things that researchers could have a big impact on, but those areas just aren’t really built up here yet.

    Steve: So it’s a demand issue, not a supply issue.

    Amber: Yeah, for sure. Yeah.

    Steve: This has been very interesting, and lots to think about. I really appreciate you sharing so much about your own path and all the great work that you’ve been doing at Duo Security. Thanks again for being on the podcast.

    Amber: Thank you, Steve. I enjoyed the conversation, as always, and I look forward to next time we get to catch up.

    The post 34. Amber Lindholm of Duo Security first appeared on Portigal Consulting.
    5 May 2020, 3:21 am