OPERATOR: All attendees are in listen-only mode. RENEE WYCKOFF: Hi, good afternoon, I'm Renee Wyckoff, research design engineer at L&E Research. We're so glad you could join us today for the next webinar in our series, Webinars Hosted by L&E in 2019. Our webinar series focuses on various relevant topics in the qualitative research world. Today's webinar will focus specifically on emerging methods in qualitative research technology. Our webinar series ties in with our white paper series as well, so we'll be sending out a copy of the white paper in addition to the recording of this webinar, should you be interested in revisiting it or sharing it with colleagues who couldn't attend today. Before we get started, I just wanted to mention that we'll be taking questions and answering them at the end of the discussion, so feel free to send in any questions as the webinar's going on, but we won't actually be going through the questions until the end. So let's get things kicked off with our two guest speakers. Today we have Charlie Rader, a digital insights designer at Procter & Gamble. And we have Lenny Murphy, executive editor and producer at GreenBook. Charlie, why don't we get started with a little bit about you? CHARLIE RADER: Sure. Good afternoon. What I do for P&G, as I'm approaching 25 years, has changed over the years, but currently this area of what we call products research testing is what we do. And products research is this kind of funny hybrid between science and marketing. What we do, in comparison to some of our friends in Consumer & Market Knowledge, is look at the product consumer experience and the product package experience. So I have a focus on product innovation. And as I've mentioned, I've been doing this for about 25 years and have been working in the online qual technology space since about 2008.
So I've seen a bunch of stuff, I've helped build a bunch of stuff, and so it's fun to actually have a chance to chat with my friend, Lenny Murphy, and get into where things are changing and evolving. So I certainly value Lenny being on the call with me today as well. LENNY MURPHY: Thanks Charlie, the check's in the mail, my friend. Check's in the mail. And I'm Lenny Murphy, and I don't have nearly the focused bio that Charlie does, but for those who don't know me from all the annoying emails you probably receive that have my name on them, I work across lots of different things. I've been in the industry for coming up on 20 years. The first ten of that in running and building research businesses, the last ten of that helping other people run and build research businesses, in a very visible way through my role as editor-in-chief and producer of GreenBook. And I [AUDIO SKIPS] that series and am also a partner in Gen2 Advisors, which is our consulting company. I sit on lots of boards, I help lots of people who think that I know what I'm talking about, so I guess I've done a good job of convincing them that I at least sound smart; you guys will be the judge on whether I do that again today or not. RENEE WYCKOFF: Thanks, Lenny. All right, so thanks guys. Let's just get right into it. So with all of the new emerging technologies that can be utilized for qual now, in each of your personal experiences, which one do you feel is currently making the most impact? Charlie, let's start with you. CHARLIE RADER: Sure. Where I think that we're having the most impact is really some of the – and I'll bring in the buzzwords of AI, and in this case I mean artificial intelligence, not adult incontinence, which is what I normally work with. So. LENNY MURPHY: Sorry. Go ahead, Charlie. CHARLIE RADER: Chuckle chuckle, yes.
So my specific responsibilities are in our FemCare organization, so that's pads, tampons, and adult incontinence for the Always and Tampax brands, and all these other good brands like Elle and True and Just and all those things. But where I see the technology really evolving, or at least where it certainly has promise – and I think we'll be discussing a little bit of that today – is in artificial intelligence helping to surface the insights that can be generated out of online qual. I mean, one of the things that I really see is that – whether that's a bulletin board use of a platform or something like that – you can certainly generate a lot of very rich data in a very short amount of time. And then the researcher is left with this ocean of data that needs really good tools to make it useful. Those are the pieces that I'm looking at and investigating. So when I went to IIeX this past spring, that's where I was looking, with an eye to determine where the toolsets are to improve the speed and insight generation that we need. LENNY MURPHY: Yep, I would tend to agree, with maybe a little more nuance. I think that AI/automation has unlocked massive efficiencies in terms of cost and speed, not just in analytics – that is absolutely probably the most prevalent way to do it – but also in data collection. We're seeing companies like Remesh, for instance, in their usage of AI via an online chat capability, where AI is actually helping to drive the conversation. Or, from that standpoint, companies like Rival that are using AI in a chatbot scenario. And we're also seeing the efficiencies both in terms of sampling and recruiting. So it's kind of across the board; we saw the same basic trend occur in quant, and now we're seeing it applied in qual. And not just online – online is easiest, it's easiest to embed it in.
But I think, particularly, Charlie, as you were talking about the analytics side, those can be applied across any channel of qualitative, whether it's live or virtual, as long as the data is being collected in a way where it can be applied from an analytical standpoint. But it's very exciting stuff. CHARLIE RADER: Absolutely. I mean, the fact is that we do plenty of in-person quals and, for the most part, we record those qualitative interviews, and then we could utilize those analytical tools to make that work even faster. So even if it's not collected from a VoxPopMe or a Discuss.io or a Revelation qual board from 20|20 – all these platforms that are what I'd say more directed-ethnography or self-ethnography types of platforms – you can even take in the more generic one-on-one interview. LENNY MURPHY: Yep. Well, and that's an interesting point, because all of this is really dependent upon video, right? So that's the operative piece here. And that was unlocked because of the ubiquity of cameras in every device we have. We carry them in our pockets 24/7, so that ability to reach scale, kind of mass ethnography, is interesting as well. There's a company out of – well, they're actually a spinoff from AB InBev right now – where they built internal capability to, basically, empanel people to walk around with a GoPro, for all intents and purposes, livestreaming their lives for insights. They're spinning that off now as a research company. And it's ingesting all of this video, but it's powered by all the AI on the back end, for the analytics to actually make it useful. CHARLIE RADER: No, that's very interesting. I mean, maybe not as super applicable to our product categories for privacy purposes and whatnot, but day-in-the-life-of would be a huge kind of thing to get at if you're just wearing a recording device all day long. LENNY MURPHY: Yeah. A little scary though. CHARLIE RADER: And you know – yeah.
Well, I mean, the question then is, how do you search and sift through 24/7 video for maybe a dozen days or so across two dozen panelists? Where do you find the interesting bits? LENNY MURPHY: Yeah, well, and that's, I think, the point of the AI and automation. Within focus groups it's easier – you're tagging a smaller sample set of what's there, and we know it's targeted toward a specific topic, etc. But in this case, that's what they're building: the ability to recognize images in the video. There's a brand, there's a logo, and to correlate that with this massive set of tagging over and over and over again, which we know already happens in social media analytics. I mean, Facebook's been doing this for years with still images; now we're just seeing that applied in a slightly more focused way. And, Renee, you better take over, because Charlie and I can get way down these rabbit holes, so. RENEE WYCKOFF: [CROSSTALK] No, you're good. I want to know though. So let's talk a little bit more about video. We talked about it last year and how video was kind of that, not such a new technology, but it was the one that we talked about making the most impact. So other than, obviously, being able to take a peek into the lives of consumers and how they use products and services and such, what are some of the other ways that video can be applied to qual? LENNY MURPHY: Charlie can tell you about what you guys do, right? CHARLIE RADER: Yeah, sure. Absolutely. And the fact is that beyond what I've called the talking-head videos of focus groups and IDIs – and my preference is IDIs, for those that are interested – video gives us the ability to do behavioral observation. And so I think that's probably the next place of understanding consumer behavior: as the true fly-on-the-wall ethnographer, to have a camera in place to see what the behaviors are so that you can say, that's how they're doing that.
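To make the tagging-and-correlation step concrete: assuming an upstream image-recognition model has already emitted labels for each video frame, turning that "massive set of tagging" into something usable is largely an aggregation problem. A minimal Python sketch, with hard-coded hypothetical detector output standing in for a real model:

```python
from collections import Counter

FPS = 30  # frames per second of the source video

# Hypothetical detector output: one list of recognized labels per frame.
# In a real pipeline these would come from an image-recognition model;
# they are hard-coded here to illustrate only the aggregation step.
frame_labels = [
    ["Tide", "shelf"], ["Tide"], [], ["Pampers"], ["Tide", "Pampers"],
]

def seconds_on_screen(frames, fps):
    """Convert per-frame label lists into seconds of screen time per label."""
    counts = Counter(label for labels in frames for label in labels)
    return {label: n / fps for label, n in counts.items()}

screen_time = seconds_on_screen(frame_labels, FPS)
# "Tide" is detected in 3 of the 5 frames: 3 / 30 fps = 0.1 seconds
```

At scale, the same per-frame tallies are what get correlated across panelists and days to surface which brands and contexts actually appear in the footage.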
And what are the implications for new products and services if that behavior is impacting product performance or something like that? So is it education, or is it new features that need to be added, to try and get at what folks really want? So I'd say that aspect of looking at true behavior, which is the non-verbals, as we kind of head toward – whether you want to call it neuromarketing or behavioral science, all those kinds of things; we call it behavior science at P&G. But these behavior science questions, to see what people are doing, matter a lot more than what people are saying. RENEE WYCKOFF: Charlie, can you kind of walk us through what that might look like for P&G? How would you use that? Walk us through a scenario of what that would look like. CHARLIE RADER: Well, I think we can just talk about, in other aspects, maybe putting a camera at the kitchen sink and recording hand-washing of dishes and whatnot. And so noting, one, how often, maybe, the sink is filled. So getting away from recall-based diaries to, this is the observable truth here. How many times do you fill the sink? Well, do I always fill the sink, do I sometimes fill the sink, do I never fill the sink? And anything between always and never is an estimate. So using video is certainly a way to say, what's the truth in this spot? Maybe use it in ways of checking panelist compliance to instructions for clinical types of testing, which is really important to show the kind of work that our products do, as well as provide the support for the claims that we make in market. So being able to utilize video observation has a wide variety of purposes. How many strokes of a razor go up versus how many strokes of the razor go down, versus how often the back teeth are brushed with a manual toothbrush or a power toothbrush. Those kinds of things, video, I think, has the potential to get at.
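The sink-filling example can be sketched in a few lines: given timestamped events coded from the video (the event log below is hypothetical), the observed fill frequency falls out directly, with no always/sometimes/never recall estimate involved.

```python
from datetime import datetime

# Hypothetical event log coded from kitchen-sink video: each entry is a
# timestamp at which the sink was observed being filled.
events = [
    "2019-06-03 08:12", "2019-06-03 19:05", "2019-06-04 18:50",
    "2019-06-06 19:10", "2019-06-07 08:30",
]

def fills_per_day(timestamps):
    """Group observed sink-fill events by calendar day."""
    per_day = {}
    for ts in timestamps:
        day = datetime.strptime(ts, "%Y-%m-%d %H:%M").date()
        per_day[day] = per_day.get(day, 0) + 1
    return per_day

observed = fills_per_day(events)
# The observed average is a measurement, not a diary recall.
average_fills = sum(observed.values()) / len(observed)
```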
The biggest question is, with the reduction of the amount of people that can be focused on these kinds of things, how can the technology identify those behaviors and tag those behaviors appropriately without having a researcher go through and do a lot of manual work in that space? I think those are the places where video actually kind of spans between qual and quant. At that point you're starting to count and do that in a much more quantitative way, but at its basis, we're looking at human observational behavior. LENNY MURPHY: And to add to that a little bit more, Renee. Just outside of the product in-home stuff, think about the applications in-store or at the point of experience, right? The ability to do a shelf test in the store. You don't have to set up anything virtually or have it in a lab – that happens there – or in-store observations coming from video mining that has cameras set up in, literally, thousands of stores across the U.S. And they record traffic patterns and they look at where people are going within a store, specifically, and where they stopped at the shelf. And how long do they stay in front of the shelf? And what does the assortment look like? So that unlocks – to Charlie's point – that understanding of the real truth of experience as it's happening, so it's powerful. And, also to Charlie's point, I don't know of any video analytics solution that doesn't incorporate some level of, at least, facial coding, if not other types of behavioral science as well. So it's not just the understanding of what is occurring, but the why it's occurring as well – the emotional valence during the point of interaction or occurrence. And we're talking about video, but the same thing applies to images as well; images are more limited, but all of this can also be applied to just a still picture. CHARLIE RADER: Yeah, I mean, the fact is that when it comes to qual, there are three basic data types.
There's text, if you're doing some sort of a blogging thing. There are images, if you're doing a photo scavenger hunt around the house for where things are stored or what your favorite objects are. And then, of course, there's video. And while the low-hanging fruit is text analytics, our ability to grow in artificial intelligence models is starting to really unlock the image and video space. LENNY MURPHY: Yep, absolutely. Just before this, I was on a call with LivingLens, which, Renee, I think is one of your partners. RENEE WYCKOFF: Yeah, they are. LENNY MURPHY: And we were having this conversation, discussing the roadmap for the next iteration of the platform, and it was all about AI and automation and analytics and how they expand on that to make it more user-friendly and to meet the needs of clients like Charlie. Because that's the other thing: when we're talking about video, we're also talking about a ton of data, just from a size standpoint. This dwarfs our traditional kind of quantitative aspects of data. Now, there may be higher volume of some types of data that we would term quantitative, but just the size of each of these elements, from a video standpoint, is massive. So it does require a significant amount of infrastructure and investment just to be able to manage this information, to curate it before you even get to the analytics – just to manage the raw content itself and get it to a place where it's usable, or else it's just this massive library of stuff sitting there. It's really difficult to manage. Charlie, I know you've been working on that for years. I remember, what, gosh, almost ten years ago P&G had an RFP, or was looking at an RFP, to build an internal video curation system, just because it was such a challenge. That was ten years ago.
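Since text is the low-hanging fruit, here is what the most basic layer of that analytics looks like in practice: a term-frequency pass over open-ended responses. The responses and stopword list below are invented for illustration; commercial platforms layer far more sophisticated models on top of this.

```python
import re
from collections import Counter

# Invented open-ended responses from a hypothetical bulletin-board study.
responses = [
    "The pad shifted during exercise and felt bulky.",
    "It felt bulky but stayed in place.",
    "Stayed in place all day, very comfortable.",
]

# A toy stopword list; real text-analytics tools use far larger ones.
STOPWORDS = {"the", "and", "but", "it", "in", "all", "very", "during"}

def top_terms(texts, n=5):
    """Count non-stopword terms across responses - the crudest version of
    the text-analytics layer a commercial platform would provide."""
    words = re.findall(r"[a-z]+", " ".join(texts).lower())
    return Counter(w for w in words if w not in STOPWORDS).most_common(n)
```

Even this crude pass surfaces that "bulky" and "stayed in place" recur, which is the seed of a theme a researcher would then verify in context.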
CHARLIE RADER: Yep, and we built an internal system that doesn't have what I'd call the AI components of some of the offerings that are out there today – we've been looking at things like ViewPlus and LivingLens and QualSight, as well as our friends at FocusVision. But the fact is that our system was based off of just the basics of text analytics and being able to say, all right, how can we have a repository of our video qualitatives that, one, we can mine and learn from the consumers on the specific project? But then, how do we make that even more broadly applicable? So we start having our researchers look at what we've already done, creating the library, the repository, in a useful way to at least answer some, maybe, low-level questions. I mean, everybody has their own specific question that needs to be answered, but at least we may not have to do as much going over of the same old questions that we've covered previously. I mean, if we're going to spend the money on – especially talking with real people and not bots, that's a different seminar. But if we're gonna spend the money to talk with people face-to-face and record them in some way, being able to utilize that data on a longer time frame is certainly useful. And, yeah, it's something of, how do we best utilize that data? Now, I think there's a shelf life to that, but I don't know – maybe, Lenny, what would you say the shelf life for something like this would be? Somewhere around two years, five years, six months? LENNY MURPHY: Well, there are two ways to answer the question. One is, how relevant is the information? And I think that's probably category specific. If it's a – I'm such a creature of habit, when I find a product I like, I buy that product over and over and over again. It really doesn't matter whether it's cheaper or whatever, it's like, yeah, this is the toothpaste I use.
So those things, tied to other data that validates what the frequency of switching is, etc., can probably give it a fairly extensive shelf life. But the other, more interesting question – and I think we touch on this later, so we can go back to it rather than get there for now – is that from a data privacy standpoint, this may become a very different conversation on what that shelf life is. CHARLIE RADER: Yeah. Absolutely. And I know when we were building our internal solution, our privacy folks were saying, so how long – when do you purge data? And the scientist in me says, I don't ever want to do that. And the fact is that, not only with GDPR, but with the new California regulations, we're gonna have to look at what are the ways that we handle video. Because anytime you have a person's face on it, you have PII there, so. LENNY MURPHY: Yep. CHARLIE RADER: There's a lot of work that's going on inside our company about doing the right thing in privacy and doing PII minimization and elimination, but when it comes to qual, it's real people all the time. It's at an individual level, so it has to be fiercely man [AUDIO SKIPS]. LENNY MURPHY: And with video, specifically. So, Renee, I think we have a question about this later on, so I don't know if we want to keep going down this path or you want to circle back around. RENEE WYCKOFF: Yeah, let's move on. No, this is wonderful. Let's move on, let's break that up a little bit so we can get to it later. So with the new technologies that are out there – we've all been to conferences where somebody has a booth set up and you don't really, necessarily, know how their technology is going to apply to qual, or even quant, for that matter. In your opinion, which technologies are out there right now that aren't quite being used for qual yet that you feel, in the future, could come into play and make an impact? LENNY MURPHY: Can I take a shot at that first, Charlie?
CHARLIE RADER: Absolutely. Yeah. Apologies. LENNY MURPHY: This is the – yeah, this is the stuff that gets my geek blood going, right? There are three that I've been paying attention to that I think directly have an application from a methodological standpoint. One is augmented reality, the second is virtual reality, and then the third is 3D printing. I'll run through each one real quick. I mean, augmented reality – think of Pokémon Go, or the Harry Potter game. So we see a lot in games, but effectively it's an overlay over your experience using a camera. So you're in-store and it's video on steroids; there is an overlay that can then ask you questions based upon what you are observing in real time. So that's AR. And it doesn't require goggles or glasses; it's basically just whatever device you're using that has a camera. VR is the fully immersive stuff. Actually, I just saw a press release yesterday from 20|20 – our friends at 20|20, Isaac, they're really good at experimenting with fringy stuff. They're building some adaptive VR simulation technology to adapt in real time to what you're – to a shelf test, for instance, environmentally. That's super exciting. That's pretty cool stuff. And then 3D printing: the ability to iterate with products rapidly, rapid prototyping right within a store or in a qual facility or now even in a home. Those technologies are becoming ubiquitous and, most importantly, they're becoming cheap. So when that happens, then we can get scale. And I think there's really interesting stuff that can happen when we're deploying those types of things. So, all right, geekiness done. Charlie, you can say whether you agree or disagree. CHARLIE RADER: Well, I mean, we use plenty of 3D printing at P&G. I wonder about what the turnaround time is at scale and whatnot; you can create small things pretty fast. The full-scale stuff, I don't know.
I've seen some of these really interesting 3D printers where you see stuff basically emerge out of this resin liquid. I'm like, well, that goes pretty darn fast. That may be something that has some life there. Because we are humans, and in a qual space, humans need to react to physical things in lots of ways. What I do like, especially out of the geeky nature that you mentioned, is the augmented reality piece: having an additional, we'll call it a data layer, on what you're seeing, whether that's gamified, such as the Harry Potter game or Pokémon Go. The fact is that having the ability to – and I've seen prototypes that we've tried, many years ago even, but seeing it fully commercialized in lots of different ways isn't quite there. But being able to pick up a package, maybe scan a QR code, and then have it reveal more about the package and the product that's inside there, I think, has great potential in helping us show our consumers product superiority – why they should choose our products. People are expecting more from our products these days in our digital economy. So what's on the outside of the box doesn't always tell the full story. And people will even scan and go look at Amazon reviews right off the store shelf as well. So it's one of those things where, if we can deliver digital information when people need it at that moment of truth – that first moment of truth especially – I think we have great ways of winning with our products. LENNY MURPHY: And what's really gonna unlock all this as well is, now, the introduction of 5G. We've had limits – it comes down to speed, cost, and scale – and now we're dealing with the speed issue through 5G. So as that's being implemented, we'll have wireless bandwidth that is just as good as broadband. And that opens the door to so much of what you're describing. RENEE WYCKOFF: Got it guys, thanks.
So one more thing, kind of a little spinoff to that. With new technologies, and especially for researchers, what one technology, or even two technologies, that are out there do you not see researchers using enough? What should they be doing? LENNY MURPHY: Not using? RENEE WYCKOFF: Right. What should they be using more of to – yeah. LENNY MURPHY: Got it. Charlie, what's yours? CHARLIE RADER: Yeah, I mean, so I'll get back on the video analytics bandwagon again. We talk with so many consumers, and we often record them for the just-in-case moment where they have the gem of a quote or something like that that can really help the report. But the fact is that we leave so much of the value of those interviews behind if we don't take that and transform it with video analytics to look beyond where they say that one word, to really creating a whole body of evidence about that research question. And I think video analytics does something – I mean, it's not perfect in this way – in that it can provide objectivity to a very subjective process that we've had in the past. Because we as humans live in a timestream. We live from morning to noon to night to bedtime, so we'll set up a series of qualitative interviews and we have these cognitive biases where we'll remember the first one, we'll remember the last one, we'll remember the weird one – and the boss usually takes the weird one. So being able to use video analytics really can provide objectivity across that. Because for most intents and purposes, unless it's a very widely accepted belief, if there's any controversy to it, having multiple consumers spit back what that insight was, time after time, is really what can drive action on projects and shows the heart of the consumer – and gets it past the head of the manager, sometimes.
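That "time after time" evidence is, at bottom, a tally of how many distinct respondents voiced each coded theme. A minimal sketch, assuming the themes have already been coded (by a researcher or a video-analytics platform; the data below is hypothetical):

```python
# Hypothetical coded interviews: respondent id -> theme codes assigned by
# a researcher or a video-analytics platform.
coded = {
    "R01": {"leak_worry", "bulkiness"},
    "R02": {"leak_worry"},
    "R03": {"bulkiness", "scent"},
    "R04": {"leak_worry", "scent"},
}

def respondents_per_theme(codes):
    """Count how many distinct respondents voiced each theme - the
    'time after time' evidence that drives action on a project."""
    tally = {}
    for themes in codes.values():
        for theme in themes:
            tally[theme] = tally.get(theme, 0) + 1
    return tally

# Here leak_worry is voiced by three of the four respondents.
theme_counts = respondents_per_theme(coded)
```

Counting respondents rather than mentions guards against one talkative participant dominating the tally, which is part of the objectivity Charlie describes.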
Because sometimes we don't like to hear that things aren't working well all the time, especially as I live in R&D land; we are doing prototypes all the time. So we may be hitting one of the five things we were shooting for, but we need to be able to tell the truth to our project teams so that we get superior products. LENNY MURPHY: Yeah, and I totally agree with everything you said, and would add – although in most cases with video, some level of behavioral analytics is embedded in as well – that there are distinct use cases for non-conscious measurement in qualitative and quantitative across the board, and it's growing. And, again, it's an issue of scale and cost, and technology is solving those problems now. So one of the companies that I work with – and full disclosure, I'm on their board – CoolTool, they've created effectively kind of a zappy type of solution for non-conscious measurement. So it's cheap and fast and good. And we're seeing more and more of those technologies emerge so that they can be deployed across the board to not just understand the who, what, when, where, how, but also get to the why. And, Charlie, I'm sure that you would agree that that's a massive piece of information that's necessary, especially when you're trying to understand past experience or just really get to the truth of the interaction with the brand or the product, to get that emotional valence. CHARLIE RADER: Absolutely. And I think the piece that I'd reinforce isn't, maybe, necessarily emotion, but it is to find out what lies beyond the words that people say, and looking toward these behavioral metrics to see when things line up and when things don't line up – being able to utilize these more behavioral metrics to really get at the heart of that. Because there is a lot that – people, because we're social creatures, we don't want to hurt other people.
We certainly want to appear competent and smart and don't want to show that there's confusion or anything like that. So sometimes the words in the qualitative interview are, it's great. But the eyes behind them are like, my gosh, what did I do here? So looking beyond the words is, I think, what those technologies really can bring to us. RENEE WYCKOFF: Got it, thanks. So last time we spoke, guys, we talked just a little bit about video analytics and how it's not really being used for qual, as far as the end result. Can we talk a little bit about that, as far as using video analytics to analyze the qual recordings that come out of sessions or focus groups? LENNY MURPHY: Yeah, I mean, we're there. So, actually, it's a great reminder. I think last year we weren't seeing quite that uptake, and I think it's grown pretty massively in the past year. I mean, look at one of the leaders in our space, VoxPopMe; they disclosed, what, an $8 million round. That speaks highly of the bullishness of the market on the application and the use cases. And we've always had FocusVision around to record and stream video in traditional groups. And there are just 100 different platforms out there now that leverage video from a collection and from an analytics standpoint, and probably 50 of them weren't here two years ago. So I think we're seeing that explosion across the board, although what's interesting is that when we look at methodological usage, the in-person focus group is still by far the dominant qualitative method; nothing else even comes close. The virtual qual – and I was doing online focus groups using Adobe Connect 15 years ago – hasn't really gotten there at the level I expected it to. It's getting there now, finally.
But there is still a massive gap between online focus groups and traditional focus groups, and even in that environment, where we're seeing the uptake isn't really in the focus group paradigm; it's with IDIs, ethnography, those types of things. The focus group paradigm still kind of stands supreme. RENEE WYCKOFF: So you don't think it's going away? Because there are some people that will say that it is, that it's not [CROSSTALK] LENNY MURPHY: I do not. And ten years. CHARLIE RADER: Well, I'd like to say. [CROSSTALK] LENNY MURPHY: Ago I was one of them. I was saying, it's got to go away. Why the hell is it not going away? Come on. But, no, it shows no signs. I was wrong and I continue to be wrong; it is not going to go away. And I'm sorry, Charlie, go ahead. CHARLIE RADER: No, no, I mean, absolutely right. The fact is that there are a lot of people doing focus groups. I would say that as a qual researcher the question is, should you be doing a focus group? What are all the reasons for doing that methodology versus other types of qual methodologies, because of all the other issues that go along with it? I mean, the focus group has been around since, I think, the '40s or something like that. But the thing is, I think a lot of folks use the raise-your-hands-around-the-table kind of thing, where they're now looking at how many votes they're getting out of this particular group. And, frankly, you're just doing underpowered quant at that point in time. So if you're not really using a group to build and grow something – that's why I have been a proponent of going to the IDI. But at least you could use some level of interaction with duos or trios in a much better fashion than, I think, with six to eight to ten people around a table, where you just have more social interactions that a moderator has to work at defeating. So I guess I'll get off my soapbox on that one, but you're absolutely right.
There's a ton of it being done and, frankly, it will continue to stymie these automated solutions simply because of the amount of talking-over that occurs in a lot of these sessions. LENNY MURPHY: Yeah, and I think that video and bandwidth are a big part of this conversation. Because what you've always heard for years is, well, I don't want to do an online group because I need to be in the room, or at least observing the room, to catch the understanding, the nuance, that gut sense of what people are thinking and feeling. And I totally get it, I understand that. And there were limits at that point in terms of bandwidth for streaming; it was clunky and glitchy, etc. I think those things have gone a long way toward being addressed, so that it is a more seamless experience. And with the addition of AI and analytics and behavioral science – it's still not a replacement for human interaction. Me looking you in the eye gives me something very distinct and instant as a human that I'm not gonna get anywhere else, but I can get pretty damn close now with technology that can tell me – I can look at it: Charlie looks frustrated, and the analytics say, yes, Charlie looks frustrated. So that's a big part of it. Yeah, technology has just needed to catch up. And so we have that now, where the speed and cost efficiencies are there, and the seamlessness is there, so I think we'll see more uptake. But there's always going to be – plus, people just like eating M&Ms in the back room. There's that piece of things as well. RENEE WYCKOFF: I love the catering that Melanie does for us. So, thank you very much. LENNY MURPHY: I miss those groups. I haven't done one in a long time. Now it makes me – I need to go do one. Just to get out of the office. RENEE WYCKOFF: So, it sounds to me like we all agree that in-person focus groups aren't going away.
That the technologies that are out now can just be used to augment it. LENNY MURPHY: I think so. And Charlie made a good point about the right use of a focus group versus another methodology – an IDI, or some type of ethnographic approach. And I think that's really where technology has driven greater scale and adoption of those things. But the group still performs a very distinct and important function in the research process. RENEE WYCKOFF: Truth. Moving on. We just touched on using video analytics or analytical tools to process what comes out of a focus group. What are some technologies that are changing the way we analyze the data on the backend? Can either one of you speak to that? LENNY MURPHY: I think we did, kind of. I'm sorry, that sounds snarky, I didn't mean it that way, Renee. RENEE WYCKOFF: It's not snarky. LENNY MURPHY: Yeah, but I think the answer is the same. It's the analysis of unstructured data. Inherently, we're talking about unstructured data here. So, text analytics, video analytics, emotional analytics, image recognition. These are technologies that have existed for a while. They're deployed massively within government, probably more than any of us would even really want to recognize. And now they're being deployed within the business context. I think those are the big buckets. And content curation, and those types of things that help manage all of that. But it all effectively drills down to the analysis of unstructured data. It's just a question of what the data type, or the focus of the analysis, is. Charlie? Is that an oversimplification? CHARLIE RADER: Yeah, I'm kind of giving you the backroom amen corner on that. The data types of qual are unstructured, generally.
And so, where we are maybe looking in the future – and certainly if it's not available commercially in any of the number of platforms that we've name-dropped today – we are working as an organization, and I know that Kirti Singh, chief of our CMK function, spoke at IIeX in Austin, and we're looking at what we call eyes to the consumer, hands on the keyboard. We're looking to be more hands-on with our data. And so that also means that we are working to become much more proficient in understanding the technologies of how we create these datasets, and then how we analyze those datasets. There's a massive move to learn how to do greater data science with things like Python, or KNIME, or some of these nontraditional data analytics tools. So, we're getting out of the SPSS world, so to speak. But how are we looking at these larger and larger datasets – whether that's sensors, or video, or Internet of Things kinds of things – to be able to look at behavior? We have these giant datasets that are collecting data at 30 frames per second if we're talking about video, or 60 frames per second if we're talking about really good video. LENNY MURPHY: Yeah, my oldest daughter just started grad school at Georgia Tech, so I've got to brag on her for a second. She had a degree in anthropology as an undergrad, and now she's going into public policy and urban planning, and she's taking a statistics class, and she has to learn R. Not SPSS. It's R. That's what they're using within the statistics program, and looking at lots of different data. Since she's looking at urban planning, a lot of it is sensor data, and a lot of it is video data from observation cameras, etc. And she's deploying those technologies herself; it's fun to hear my daughter talking about trying to learn R. But that's, to your point, Charlie, a trend across the board.
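Charlie's mention of 30- and 60-frames-per-second video hints at why these datasets outgrow traditional tools. A rough back-of-the-envelope sketch in Python (session length, resolution, and color depth here are illustrative assumptions, not figures from the discussion):

```python
# Back-of-the-envelope sketch of why qual video datasets get big fast.
# All parameters (session length, resolution) are illustrative assumptions.

def video_frames(minutes: float, fps: int) -> int:
    """Total frames captured in one session."""
    return int(minutes * 60 * fps)

def raw_bytes(frames: int, width: int, height: int, bytes_per_pixel: int = 3) -> int:
    """Uncompressed size of those frames, assuming 24-bit RGB."""
    return frames * width * height * bytes_per_pixel

# A one-hour IDI recorded at 30 fps in 720p:
frames = video_frames(60, 30)                     # 108,000 frames
size_gb = raw_bytes(frames, 1280, 720) / 1e9
print(frames, round(size_gb, 1))                  # 108000 frames, ~298.6 GB raw
```

Compression shrinks the stored footprint dramatically, but any frame-level analysis still has to touch every one of those frames, which is why this work pushes researchers toward data-science tooling.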
RENEE WYCKOFF: Yeah, that question might have come out the wrong way, but – so in terms of deliverables, things that we can give our client on the backend, what are some things that we can show them now that we weren't able to, say, ten years ago? LENNY MURPHY: Snips within reports. I think that's something that's pretty standard for you guys now, right, Charlie? CHARLIE RADER: Yeah. I mean, I wish that folks would make better use of it, but yeah, absolutely. Like I said, having the voice of the consumer in the actual room where decisions are being made. The standard PowerPoint deck isn't doing it, so getting to the heart of the matter with video clips is key. Being able to show how people are actually working is part of that. And so whether that's the words themselves, with the emotion that follows behind them, or the – oh, we hadn't seen them do that thing before – that is, I think, super powerful in a reporting fashion. And how we construct easy-to-use, easy-to-share playlists of these qualitative moments is really powerful in getting that voice of the consumer into the decision-making. LENNY MURPHY: Renee, I know we're getting close to the Q&A, but I think it's important to give some background on that thing we mentioned around data privacy. RENEE WYCKOFF: Definitely. LENNY MURPHY: Real quick, before we dive into that. Because that is – so before the call, Charlie, I mentioned that you and I had talked to a bunch of folks this week, based on our conversation before this. You had said something about P&G's concerns around data privacy in video. Are you comfortable repeating that? CHARLIE RADER: Sure. And I mean, I wouldn't say it's proprietary information to say that, with the new California regulation about to go live in January, and certainly a lot of pain having been managed with the GDPR regulations in Europe.
The fact is that we've been talking about how video is now personal data, so how do we then manage the video data? Certainly, where the raw video is being kept is one piece of it. But then, as we go to reporting, how are all those snippets being tracked? And I think it's a question of when, not if: somebody will say, well, I want my data back now that this law is in place – for whatever reason the consumer wants the data back. How do we have systems in place to track that, to follow it all the way down the line, and then say, all right, we know where this person is across all the interviews that they've done, and be able to extract that? I think that's a huge challenge that we're certainly working on. But, Lenny, I think you have a proposed solution to help us do that. LENNY MURPHY: Another check in the mail, Charlie. Yeah, I won't press that too much, but anybody who knows me, and pays any attention, knows that GreenBook invested in a start-up called Bearcliff [ph] that is focused on this idea of a compliance layer and a data marketplace. And that compliance layer is probably the biggest part. It doesn't matter whether it's Bearcliff or anything else. I totally agree with you that this issue around managing consumer data in a transparent, ethical, and traceable way, that's permissioned, is a big deal. Period. End of story. And especially with California, because, no offense, but California is very – I can't even say the word. RENEE WYCKOFF: Litigious. LENNY MURPHY: Yes. They're going to teach people to do this. RENEE WYCKOFF: You have a big target. LENNY MURPHY: Right. Any brand is a big target. So I think we, as an industry, have always been cognizant of data privacy in some ways. But this is a whole new ballgame. This isn't about anonymity anymore.
This is about a massive compliance framework to create full transparency for the consumer, and full access to the who, what, when, where, how, and why of their data being used. And the ability, to your point, to say, I want it back. And you've got to give it back. And that's really hard to do today, quant or qual. It's really hard to have that traceability. So that's one of those things that, as an industry, I think is going to be a great big deal. Now we have all this technology that allows us to capture all this data and analyze all this data, and it's all over the place, and it's ubiquitous. And now we have this whole new thing coming in that says, you'd better be managing this correctly, and you have to put in an additional system, a compliance layer, to manage this in a way that we've never had to do. I don't think any industry has really had to do this before. And that is coming down the pike really fast – months. CCPA goes into effect in January. All right, Renee. Sorry. RENEE WYCKOFF: All right. This was great. Guys, thanks so much for coming back once again and sharing all this really good information. We're going to take some questions for the next few minutes, but first we'd like to thank Focus Forward, our transcription partner, who so graciously volunteers to transcribe this webinar for free. Focus Forward is just one of the technology partners that we partner with to bring a complete qualitative toolkit to our clients. We'll be emailing out the recording of this webinar, and you can view this webinar, or download a copy of the free transcript, at our website, as well as learn about all of our exciting technologies offered through L&E Solutions, at www.LEresearch.com. So, let's get started with some questions. ELIZABETH WOLLENBERG: Elizabeth here with the questions. So, how does AI work with qual? LENNY MURPHY: Charlie, want me to tackle that?
CHARLIE RADER: Yeah, I mean, AI is a big buzzword, so let's get really clear on what we're talking about. Beyond the transcript of these video pieces – and Focus Forward, and all these firms working at that – there is oftentimes artificial intelligence in understanding how language is put together. You can see that in your phone today: when you start typing a text, it suggests the next word. That's because it's looked at thousands and millions of texts previously, and if you say I'm on my [INAUDIBLE] in the same way. So there's AI at that level. But the transcript itself is just the raw material for these next-level analytics. It's not what I'd call the automated piece of just counting words, but really looking at what humans aren't naturally going to be able to do. Counting words isn't what I'd consider artificial intelligence. It's: what are the topics? Can the computer look through the transcript and say, well, here are the topics that you are talking about; these are the keywords, the entities, that were present in this discussion? That's the artificial intelligence piece, because the other bits – frequency of keywords, or word clouds – I built that ten years ago. And I say I, with the help of some great partners, like Three Play Technology [ph]. So what I'd call that first level of artificial intelligence is really using natural language processing to understand the discussions, the topics, the key entities that are there in the body. LENNY MURPHY: But there are two more elements to think about. One is the predictive ability, and you kind of mentioned that with the example of autocorrect. So where it would truly be interesting, between business rules and AI, is that predictive ability. But business rules play a role here as well.
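Charlie's line between mere word counting and something smarter can be made concrete. The sketch below (standard-library Python only; the transcript snippets are invented examples, and real topic or entity extraction would use a proper NLP library) contrasts raw frequency with a tiny TF-IDF score that surfaces the terms distinctive to one group:

```python
# Contrast raw word counting (the word-cloud level) with a small TF-IDF
# score that surfaces terms distinctive to one transcript. This only
# illustrates the first step beyond frequency; it is not entity extraction.
import math
from collections import Counter

# Invented stand-ins for two session transcripts.
docs = {
    "groupA": "the pad leaked and the pad shifted during the run",
    "groupB": "the brush felt soft and the handle felt slippery",
}
tokenized = {name: text.split() for name, text in docs.items()}

def raw_counts(name, top=3):
    """Word-cloud level: common but uninformative words dominate."""
    return Counter(tokenized[name]).most_common(top)

def tfidf(name, top=3):
    """Weight each word's count by how rare it is across documents."""
    tf = Counter(tokenized[name])
    n_docs = len(tokenized)
    scores = {}
    for word, count in tf.items():
        df = sum(word in words for words in tokenized.values())
        scores[word] = count * math.log(n_docs / df)  # 0 for words in every doc
    return sorted(scores, key=scores.get, reverse=True)[:top]

print(raw_counts("groupA"))  # [('the', 3), ('pad', 2), ('leaked', 1)]
print(tfidf("groupA"))       # ['pad', 'leaked', 'shifted'] – 'the' drops out
```

The point mirrors Charlie's: frequency alone promotes filler words, while even a simple cross-document weighting starts to recover what a discussion was actually about.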
We usually call that automation. But that business automation framework is kind of embedded in AI: the AI learns and adapts based upon the information being given to it, and it can predict outcomes, and those predicted outcomes can then follow a specific series of pre-defined rules on how to deal with those situations. So it's everything from analytics to curation to visualization, or even foresight when you get into some of the really sophisticated stuff. Look up a company called Palantir, named after the seeing stones from The Lord of the Rings. Palantir is one of the largest defense and national security contractors; it's worth billions of dollars, and Peter Thiel is a major investor there. And it's all around using unstructured data, and structured data, to predict outcomes, utilizing AI. They use it to find bad guys, for instance, or to predict crime, which is really kind of weird. And we're able to do those same things at smaller scale; we're getting to that same place within qualitative or quantitative data. ELIZABETH WOLLENBERG: And going back to a few video questions here. Aren't some topics so sensitive and some products so private that video is not feasible? CHARLIE RADER: So, I work in feminine care – pads and tampons and adult incontinence. We're close to that line, but if you put the control of video, especially on sensitive subjects, in the hands of the panelists, the consumers themselves, and they decide what they're willing to share, it would surprise a lot of people what folks are willing to share. Even back in the early days when I was starting this, we were getting toothbrushing videos, just out of the shower, at six-thirty AM. That's a situation where you can't otherwise get a real person into somebody's bathroom at that level. We're getting leg-shaving videos for Venus. So, I would say you probably aren't thinking big enough. There are ways to structure the study right.
If you're worried about it, then take it to your IRB and have them go through it. But I think there are lots of ways to learn from the consumer through video. LENNY MURPHY: Yeah, I'm trying to remember – your colleague Sion [ph] did one of the first presentations at IIeX, and the title was, They Showed You What? And he works in fem care as well. So, yeah, I totally agree with you. If you don't ask, no one's ever going to say yes or no. ELIZABETH WOLLENBERG: Have you ever had trouble integrating video qual with text qual? For instance, a situation where consumers might say something that conflicts with their behavior on video? CHARLIE RADER: That's when things get interesting. I don't find that a conflict; I find that's a place to probe and understand more. ELIZABETH WOLLENBERG: Cool. Do you guys have a favorite tool for AI analysis of high-res ethnographic video? CHARLIE RADER: There are a number of platforms that have AI, and actually that's something I've been investigating – it's always great to see what the state of the art is. Of the platforms that are available, there are a variety of niche applications, so some do this better than others. And so I hate to say it, but it depends. There are a number of platforms out there that are me-too, and maybe not cutting edge; for those, it depends on what you need, and there's a cost-benefit analysis to be done. As I look at your slide of all the technology partners, including all the ones that we use as well, I see reasons for the variety, and how we would use each one of them in a different manner. There's not one on there that I would say, no, I would never use. LENNY MURPHY: Yeah, fit for purpose. CHARLIE RADER: Fit for purpose, yeah. I go on and on, and Lenny succinctly nails it. OK, I'll let you start the next one. ELIZABETH WOLLENBERG: We have time for just one more quick question.
This one was actually from earlier in the discussion. Someone is asking about the three future technologies you mentioned: augmented reality, virtual reality, and a third one that she missed. LENNY MURPHY: 3-D printing. ELIZABETH WOLLENBERG: Very good. Thank you. LENNY MURPHY: For rapid prototyping and iteration. Yeah. CHARLIE RADER: And virtual is probably the most rapid prototyping there is, because it's all pixels on a screen. LENNY MURPHY: Yeah. ELIZABETH WOLLENBERG: Well, we still have a number of questions here, but we want to respect everyone's time. So, I have the questions, and I know who asked them. I will get in touch with you and get your questions answered, so you won't be left hanging. And I will hand it back to you, Renee. RENEE WYCKOFF: Yes, so as mentioned, we will be emailing out the recording of the webinar, so you can view this webinar as well as learn about all of our other technologies offered through L&E Solutions at www.LEresearch.com. Thank you so much for joining us today, and be on the lookout in your email for an invite to the next webinar we're hosting in the series. Have a great afternoon, everyone. Thanks again for joining us. LENNY MURPHY: Thanks, everybody. All right, bye-bye.