Amazon AI Review Summaries: How to Predict What They Say and What it Means for Your Business
Sep 7, 2023 1:30 PM - 2:30 PM EDT
When it comes to the shopping experience, reviews and ratings are one of the top drivers of consumer purchasing decisions. Recently, Amazon launched AI-generated review summaries, an aggregate of the most prevalent product feedback. As this tool becomes available to brands, how can you predict and analyze key themes to propel growth for your business?
With detailed descriptions, AI-powered review summaries provide qualitative information for consumers to assess and compare products, allowing for informed decision-making. Additionally, these summaries provide valuable data points brands can leverage to refine PDPs, update marketing messages and claims, improve item features, and launch new products. When performing a review analysis, separate your attributes and themes by positive, negative, and neutral to develop average percentages and frequency rates.
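The review analysis described above, separating attribute and theme mentions by sentiment and computing frequency rates, can be sketched in a few lines of Python. The themes, sentiment labels, and data shape here are hypothetical stand-ins for whatever your own review-tagging process produces.

```python
from collections import Counter

# Hypothetical review-theme mentions: (theme, sentiment) pairs extracted
# from individual reviews by whatever tagging process you use.
mentions = [
    ("picture quality", "positive"), ("picture quality", "positive"),
    ("sound quality", "negative"), ("price", "neutral"),
    ("picture quality", "negative"), ("price", "positive"),
]

def frequency_rates(mentions):
    """Return per-theme counts split by sentiment, plus each theme's
    share of all mentions, expressed as a percentage."""
    total = len(mentions)
    by_theme = Counter(theme for theme, _ in mentions)
    by_theme_sentiment = Counter(mentions)
    report = {}
    for theme, count in by_theme.items():
        report[theme] = {
            "positive": by_theme_sentiment[(theme, "positive")],
            "negative": by_theme_sentiment[(theme, "negative")],
            "neutral": by_theme_sentiment[(theme, "neutral")],
            "frequency_pct": round(100 * count / total, 1),
        }
    return report

rates = frequency_rates(mentions)
print(rates["picture quality"])
# → {'positive': 2, 'negative': 1, 'neutral': 0, 'frequency_pct': 50.0}
```

The resulting percentages are what let you say "picture quality shows up in half of all theme mentions, mostly positively," which is the kind of claim that feeds PDP and messaging decisions.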
In this virtual event, Gautam Kanumuru and Spencer Kelty of Yogi return to chat with Tiffany Serbus-Gustaveson about optimizing Amazon’s AI-powered review summaries. Together, they address the differences between mentioned reviews and AI review summaries, how to harness AI tools for review analytics, and how to predict attributes in review summaries.
Yogi is a product sentiment platform that enables brands to gain deeper visibility into customer feedback and voice-of-customer. We are the only tool that uses proprietary AI & NLP technology with reviews & ratings as the main data source. This enables faster and more granular analyses to uncover issues, opportunities, and trends. Brands like Tylenol, Colgate, and Nestlé use Yogi to increase conversion rates on PDPs, prioritize product improvements, and find opportunities for innovation.
Connect with Yogi
Senior Digital Strategist at BWG Connect
Tiffany Serbus-Gustaveson is a Digital Strategist at BWG Connect, a network and knowledge sharing group of thousands of brands who collectively grow their digital knowledge base and collaborate on partner selection. With over 13 years of experience in the digital space, she has built a strong reputation for driving growth, innovation, and customer engagement across a variety of online platforms. She is passionate about keeping up with the latest industry trends and emerging technologies by speaking with hundreds of brands a year through the BWG Network.
CEO at Yogi
Gautam Kanumuru is the Co-founder and CEO of Yogi, a product sentiment analysis platform that enables brands to gain deeper visibility into customer feedback and voice-of-customer. With a background in AI and natural language processing, he played a crucial role in developing Microsoft products, including Cortana and Xbox. Before co-founding Yogi, Gautam was the Vice President of Engineering at Clarke.ai and a Program Manager at Microsoft.
Head of Marketing at Yogi
Spencer Kelty is the Head of Marketing at Yogi, a product sentiment platform that provides brands with deep shopper sentiment insights from reviews and ratings. With a rich background in leading startup marketing teams and agency consulting, Spencer focuses on creating insight-based content for Yogi. His expertise in working with eCommerce brands and technology solutions has been instrumental in modernizing customer experiences and contributing to Yogi's growth, which serves major clients like Tylenol, Microsoft, and Nestlé.
BWG Connect provides executive strategy & networking sessions that help brands from any industry with their overall business planning and execution.
Senior Digital Strategist Tiffany Serbus-Gustaveson runs the group & connects with dozens of brand executives every week, always for free.
Tiffany Serbus-Gustaveson 0:18
Thank you all for joining. Happy Thursday, everybody. I am Tiffany Serbus-Gustaveson, a digital strategist with BWG Connect, and we are a network and knowledge sharing group. We stay on top of the latest trends and challenges; whatever is shaping the digital landscape, we want to know and talk about it. We're aiming to do at least 500 of these virtual events this year due to the increase in demand to better understand the digital space. We'll also be doing at least 100 in-person, small-format dinners, so if you happen to live in a tier one city in the US, feel free to send us an email; we'd love to send you an invite. The dinners are typically 15 to 20 people having a discussion around a digital topic, and we always have a fantastic time. We spend the majority of our time talking to brands; it's how we stay on top of the latest trends, and we would love to have a conversation with you. Feel free to drop me a line at Tiffany@bwgconnect.com and we can get some time on the calendar. It's from these conversations that we generate the topic ideas we know people want to learn about, and it's also where we gain resident experts such as Yogi, who's with us today. Anybody we ask to teach the collective community has come highly recommended from multiple brands. So if you're ever in need of any recommendations within the digital space, please don't hesitate to reach out; we have a shortlist of the best of the best, and we'd love to provide that information to you. Also note, if you have any hiring needs, we do partner with the talent agency Hawkeye Search, formerly BWG Talent, that I can put you in contact with as well. A few housekeeping items: first and foremost, we want this to be fun, educational, and conversational, so drop as many questions and comments as you have into the chat or the Q&A, and we'll be sure to get to them. If you feel more comfortable, you can always email me at Tiffany@bwgconnect.com, and I'll be sure that we get to them as well.
We started about three or four minutes after the hour, so rest assured, we're going to wrap up about five to 10 minutes before the end of the hour to give you time to get to your next destination. So with that, let's rock and roll and start to learn about Amazon AI review summaries: how to predict what they say and what they mean for your business. The team at Yogi have been awesome partners in the network. I'm going to kick it over to our panelists, and then we can dive into the information. If you could each introduce yourselves, that'd be lovely. Thank you so much. Awesome.
Gautam Kanumuru 2:23
Thanks so much, Tiffany, for that introduction. And thanks, everyone, for joining on what I'm sure is a busy Thursday coming out of Labor Day and the summer break. Nice to meet everyone. My name is Gautam; I'm co-founder and CEO of Yogi. Prior to working at Yogi, I've been in the AI space for a very long time, maybe before it was cool. I used to work at Microsoft on Cortana, both Cortana on Windows and Cortana on Xbox. After Microsoft, I was VP of Engineering at a company called Clarke.ai that was focused on essentially summarizing meetings and various conversations, so again, in the natural language processing and AI space. Obviously, a lot of the work that's been happening around ChatGPT and AI in general has been very exciting, just to be part of that wave. And yeah, I was also lucky enough to be named on the Forbes 30 Under 30 list for enterprise software and AI. But I can pass it to Spencer to introduce himself.
Tiffany Serbus-Gustaveson 3:36
So congrats on that. Very cool. All right, Spencer, away we go. I love this photo.
Spencer Kelty 3:45
It's definitely a good conversation starter. Well, I'm Spencer. I'm head of marketing at Yogi; I've been with the team for about a year now. I've been working in software serving the eCommerce space for a while: I was head of marketing over at Constructor before I came here, and before that I was an agency strategist working with quite a few software and eCommerce companies. So I'm really focused on finding new ways to use tech to improve the customer journey and experience. I think as marketers, that's really what we should be focused on: understanding how customers interact with brands and products, and figuring out how to make that a smoother experience. So with that, let's dive in.
Tiffany Serbus-Gustaveson 4:30
Let's do it. This is the hot topic of the year. I mean, we do a lot of events, and AI is top of mind for everybody. That's awesome.
Spencer Kelty 4:39
Yeah, for sure. And you know, it's funny, we actually did a webinar on AI summaries with BWG a few months ago, but back in July when we had that, it was a topic that nobody was really talking about. There wasn't much buzz; it was something that was coming up rather than something that was on people's minds. So much has changed in those few months: we've gotten so much more in the way of development and features rolling out, and we're actually seeing people talk about it and consider it now. So really, what we're here to talk about today is what this new AI-first Amazon shopping experience looks like, how to predict what the contents of an AI review summary might be using review analysis, and then preparing in general for how these new AI-first shopping experiences are going to change eCommerce, how they might impact your brand, and what you can do to get ahead of that. So let's dive right in with AI-first Amazon and beyond.
Gautam Kanumuru 5:46
Let's do it. So just to set the stage, if you will: obviously, Amazon is a very dynamic company, and selling on Amazon is a very dynamic space. It feels like there's a new page probably every day, every hour, with A/B tests all over the place. But one thing that Amazon has been testing since closer to the beginning of the year, around the Q1 timeframe, is this concept called AI review summaries, or AI-generated review summaries. Essentially, these are generative AI summaries that take a handful of the most common positive themes and a handful of the most common negative themes about a product, based on what's being talked about in the reviews, and create a succinct summary of what is being said. These summaries appear directly above the review section: if you scroll down to the bottom of the PDP, right above where you get a summary of what the average rating is, it'll give you this breakdown. The early tests really revolved around electronics; we started to see them a lot on electronics or small-appliance-based products. But since then, they've started to be rolled out across hundreds of products across a variety of categories. Right now, these are really showing up on mobile, but Amazon has already put a blog post out about this, and the intention is definitely for this to be rolled out on the website experience as well in the near term, and probably across all products available on Amazon. So this is a very drastic change, maybe monumental isn't the right word, from a shopping experience standpoint. When it comes to reviews and ratings, I'm sure everybody here is familiar with just how important they are to consumer decision-making, and how big of an impact just a 0.1 increase or decrease in your star rating can have on sales for your product.
But I think it's also worth taking a step back and really understanding why these reviews and ratings are so influential at the point of purchase, and why Amazon is making a move like this. At the end of the day, as purchasers, putting aside the fact that we work with consumer brands, your goal is to buy the best product possible, or get the best value possible. So as much information as you can gather, in a very clear way, about whether a product is good or not, worth the money or not, or will suffice for your needs, is very important. And nothing beats word-of-mouth feedback at the end of the day, right? If my neighbor or my friend tells me, hey, this is the product I was using to get the weeds out of my lawn and it worked really well for me, I'm going to lean into that for sure, and probably buy the same product. Reviews and ratings are a good replacement for that when you don't have a friend giving you the recommendation, but you still get some core information. So here's where the review summaries come into play: what shoppers used to have access to was, at best, the quantitative value. In deciding between two toothpastes, this one has a 4.4, the other one has a 4.2; I would guess the 4.4 one is a little bit better. But truly understanding why some are rated better than others, and what they're good at versus not good at, would be a semi-manual process for a lot of shoppers; they would have to go in and probably read 10, 15, 20 reviews just to get a sense of what's going on. So what Amazon is really trying to do is remove that tension from the shopping experience: give you an even faster way to actually understand what's going on with a product before you purchase, or even just compare two products and decide which one you're going with. So it's going beyond that simple star rating.
And really helping you get that understanding of the good and bad elements of a product. At the end of the day, you're quickly getting a high-level but relatively thorough understanding of what other consumers think of these key features. If you look at the breakdown on the left, I think one of the things that's very interesting or powerful about this is that it's not just the summary, but also a breakdown on a theme level. So in this case, for this television, if color quality is really what's important to me, I'm looking for that green checkmark, and I only want to focus on products that have a green checkmark there, for example. It really allows me to get more personalized from a shopping perspective. And I think that leads to where we see the world going from an overall shopping experience with these new AI-powered experiences. If you can just jump to the next slide, Spencer, thank you. Really, where we see this going is that AI review summaries are just the beginning; AI is going to become a much more central part of the buying process. A really interesting example of this is Google's new Bard system, which is their competitor to ChatGPT. In the beta version of Bard, if you play around with it, they've really honed in on the shopping experience side of things, on shopping-like queries that go into Google. So this is a search for a good bike for a five-mile commute with hills. What Bard is able to do, again using generative AI and understanding what customers are talking about from a feedback perspective, is essentially build what we like to call buyer guides.
So this is no longer just, hey, here's a list of all the mountain bikes you can shop from. It's, hey, since you're focused on hills, here are a few themes you should really be focusing on: what is the design? If you're going with an electric bike, what kind of motor and battery is it running? How's the suspension, since you're dealing with hills? And then a breakdown of, hey, if you're looking for the best one for hill climbing, here's the best option and here's why; if you're looking for one that's more casual for commuting, here's the best option and here's why. So you can start to see how this becomes a really hyper-personalized experience from a shopping perspective. This is not only going to affect the PDP and what consumers see there, but it's also starting to get into the search experience and how users discover products, as well as their ability to compare products. If you think back to what the AI review summaries look like, it's very easy for me to say, hey, I only care about TVs that have an amazing experience from a color quality perspective; now, within those, show me the ones I should care about, and what's the best price and value for me there. So it's a very dynamic, fast-moving space. But I think the thing to always keep in mind is that this is removing a core tension for shoppers, and that's why this is going to be a sticky experience. I'm sure we've all had that experience where, hey, I need to figure out the next TV I'm going to buy, for example, and there are a thousand different options: is it 1080p? Do I like a small bezel? OLED, QLED?
What do all of these mean? There's that analysis paralysis that you go into, which is definitely something shoppers experience, and definitely something that this experience really removes for people.
Spencer Kelty 13:55
Yeah, I want to jump in really quickly and just say that this isn't something that's just being A/B tested with a few people right now; this stuff's rolling out relatively quickly. In the case of the Amazon review summaries, there seems to be account-based A/B testing. For example, no matter what platform or device I use, it's just not coming up on my account. But we took an internal poll, and roughly 30% of accounts, it seems, have it activated. So there's definitely internal A/B testing where they're trying to figure out what products and what users it works on and how it impacts things. This Google example is relatively recent, but I've actually started to see these types of results, these shopper guides, come up, when just a few weeks ago they seemed somewhat theoretical. If you're part of the Google Labs tests, I can't remember off the top of my head what they're calling it, there's basically a little beaker icon that'll pop up next to your search bar that you can turn on or off. If you have that on, in some searches these buyer guides appear today. For example, the other day I was researching some accessories for my espresso machine, and one of the searches resulted in a shorter version of this buying guide, which showed different options with different pros and cons, exactly like the example on screen right now with the commuter bike. I think what's interesting is that what we're seeing here are options that cost thousands or hundreds of dollars, but what I got was something that cost well under $100 and already had a simplified buying guide. So this is definitely something that is going to start impacting most every purchase-oriented search you're doing at some point; it's just a matter of time.
Tiffany Serbus-Gustaveson 16:01
Oh, interesting. Questions, comments: put them into the chat, and we will get to them. So we know what the consumer is getting out of this, obviously: saving a lot of time, less friction. What else? For example, I believe on the last earnings call, Amazon CEO Andy Jassy was saying every team within Amazon right now is working on AI projects. Just that statement alone is like, whoa; like you said, they're coming in hot with this, and this isn't going anywhere. What is Amazon trying to achieve with review summaries in particular, beyond less friction for the consumer?
Spencer Kelty 16:41
I can take this one, Gautam. So as I mentioned earlier, I worked at a company that handled AI search for eCommerce companies, and having those conversations with a ton of different retailers over the time I was there, it really comes down to a couple of metrics for literally every eCommerce site: you're looking at cart size, you're looking at time to purchase, and you're looking at repeat customers, as well as certain secondary metrics, like lowering your abandoned-cart ratio. Whenever you talk about something like this that creates more dynamic experiences, they're hoping it's going to lead to people making faster decisions and better decisions. Faster, obviously: if you have something that's a high-research topic, where people would normally be spending a ton of time looking through reviews, the hope is that there'll be enough trust in these review summaries that people don't spend all that time searching through reviews anymore; they just look at the summary. I would compare that a lot to how, on certain higher-priced items, electronics, things like that, Amazon already has comparison grids on the products for similar items that people purchased. It's just part of that evolution of trying to give people more customized feedback on what they're potentially going to like or dislike about the product. And the other part that I see, because I've heard retailers talking about the other side of it, is matching people to experiences that are going to work well for them. Gautam a little bit ago highlighted those boxes underneath the review summary that either have a checkmark, or in some cases an X if it's a bad experience, or a minus if it's a mixed experience. Those are very clearly attribute-based models that they're working with.
And you'd better believe they're working on the other side of that, looking at the attributes that individual consumers care about in the reviews they leave and the experiences they have. So I think part of that, too, is looking to match people with experiences that are going to lead to positive outcomes.
Tiffany Serbus-Gustaveson 18:56
Super interesting. A question from the chat: can you please explain how this differs from "reviews that mention"?
Gautam Kanumuru 19:04
Yeah, I can jump into that. This is almost the difference, I would say, between high-level analysis and granular analysis. There's the technical side and then the implication side. On the technical side, "reviews that mention" is very much a keyword-based feature: hey, there are certain keywords that we're seeing at a high frequency, or at a high frequency in five-star reviews, and so Amazon would just bubble those up. That's not necessarily AI; it's more just counting words, but it gave directional information. The idea was to solve the same problem, but in a rawer, more unpolished way. These AI review summaries, on the other hand, use generative AI, the same stuff behind ChatGPT and Bard, and that gives an almost human level of visibility into the perspective. So when you look at "reviews that mention," or reviews with certain keywords, those sit at the word level; AI review summaries sit at the theme level. Let's just take picture quality, for example. People could be using the word "vivid," they could be using the word "dull," they could be talking about pixels or how clear things are; all of that can generally fall under the umbrella of color quality, or picture quality as a whole. These AI review summaries really sit at that theme level, which is much easier for humans to interpret. It's much easier for us to just go, oh hey, this is great from a picture quality perspective, maybe not that great from a sound quality perspective, versus having to pick out the individual keywords that fall under each bucket.
Now, the goal of both of these, at the end of the day, is to solve the same problem, which is to give shoppers the fastest and easiest way to understand what is good and bad about a product based on people's actual experience, versus what the manufacturer is telling them. In a lot of ways, we expect, and we haven't gotten the full quantitative data on it, we're still doing some analysis, that "reviews that mention" is going to start to get sunsetted behind this experience, because it definitely is a level up as a whole.
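The word-level versus theme-level distinction Gautam describes can be illustrated with a toy sketch. The keyword lexicon below is entirely hypothetical; real systems like the ones discussed use trained NLP models rather than hand-built word lists, so treat this purely as an illustration of the rollup from individual keywords to themes.

```python
# Hypothetical keyword-to-theme lexicon. A production system would learn
# these associations from data instead of enumerating them by hand.
THEME_KEYWORDS = {
    "picture quality": {"vivid", "dull", "pixelated", "crisp", "clear"},
    "sound quality": {"loud", "tinny", "bass", "muffled"},
}

def themes_in_review(text):
    """Roll the individual words of a review up to the theme level."""
    words = set(text.lower().split())
    return {theme for theme, kws in THEME_KEYWORDS.items() if words & kws}

print(sorted(themes_in_review(
    "The picture is vivid and crisp but the speakers sound tinny"
)))
# → ['picture quality', 'sound quality']
```

A "reviews that mention" style feature stops at counting the raw keywords; the theme rollup is what lets a summary say "picture quality" once instead of listing "vivid," "dull," and "crisp" separately.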
Tiffany Serbus-Gustaveson 21:42
Makes sense.
Spencer Kelty 21:45
Yeah, I would just quickly add on to that: "reviews that mention" is essentially just replacing somebody using Ctrl+F; that's basically what it's doing. And that's not really a big value add to consumers; it's not changing their experience dramatically. If they wanted to do that, they could already do it with two button presses. It's also important to remember that it still has people looking through actual reviews, left by people who have complex things to say; it doesn't distill anything down to data, which is ultimately what review summaries do. Summaries take into account how often things happen, whether something is a consistent or common theme. If you're just looking at "reviews that mention," you get no context: you don't know if 1% or 10% of reviews are mentioning something; you just know it's showing you those reviews. So there's a lot of contextual information going on in the background of review summaries versus "reviews that mention."
Tiffany Serbus-Gustaveson 22:50
Super helpful, thank you.
Spencer Kelty 22:55
All right, so that takes us to the next part of this presentation, and that is predicting Amazon review summaries. I think that for brands right now, there are two categories: there are those who have already seen review summaries on some of their products or their competitors' products, and they kind of have an idea of what's going to happen; and then there are the ones that have no idea, because it hasn't come to their category or their products yet, and they're trying to figure out what this is going to look like for them. So how I want to start is by looking at review summaries and understanding the patterns and the general form and structure they hold, so we can start to understand what they're going to look like and how to predict that. The first thing we're going to do is look at the anatomy of review summaries. These are generative, but they use some form of structure; they're not completely freeform. They're very consistent in the way they're structured, to a certain extent at least. First, they generally all open with a key positive attribute. This is going to be the thing that stands out the most, with very high sentiment, something people talk about a lot that comes up as a key attribute they really care about. In this case, these are some Sony headphones, and it's appearance: people are talking a lot about how good these look. You'll see that there's a lot of real estate given to this: it's three full sentences about the appearance and the design, so that's basically a third of the entire review summary about this one thing. The next thing is some secondary positive attributes. Sometimes you'll see these literally just as the attributes mentioned, attribute comma attribute comma in a list; sometimes you'll see a sentence where they're worked in.
In this case, it's a sentence: the sound quality is better than expected, and the headphones are comfortable. These are things that appear frequently with positive sentiment in reviews, but maybe aren't getting quite as much attention or frequency as that key leading attribute. Then we get down to the last third of the review summary, which is going to start with your key negative attribute. This, again, is pretty consistent: most of these review summaries have a key negative attribute they focus on. It's something that clearly appears very frequently in negative reviews and seems to correlate with the thing carrying the most negative sentiment in those reviews. In this case, we're seeing stability of the earbuds, which is interesting, because "stability" sounds a little bit like an AI interpreting something; it's not so much a word people would use. What it seems to refer to is the earbuds coming out, not fitting right, not being very stable in the ear. It's an interesting example, because "stability" is probably not the word most people would use in their natural reviews, but it's the way the AI has interpreted them. And lastly, we're going to see mixed or negative sentiment attributes in a list, similar to how we saw the secondary positive ones listed. In this one, you'll see it's presenting these as mixed opinions around battery life, sound quality, performance, and fit and comfort; they actually list a ton of attributes, which matches pretty well with this being a 3.6-out-of-five-rated product, where there are a lot of mixed opinions. So they list quite a few. On some products, you're not going to see any mixed attributes; you'll just see a couple of negatives listed here.
Basically, this is going to be the tier of attributes that have better sentiment than the highlighted negative one, but are definitely not positive. The last thing I want to call out on the overall anatomy is that, generally speaking, twice as much space is spent on positive attributes as on negative or neutral ones. This is very consistent across all of these: they're not spending a whole lot of time on the negative. They want to highlight a couple of the really good things about the product and then quickly give you an overview of what's not so good. Obviously, the rating of the product, and what's actually going on in its reviews, is going to determine whether most of the non-positive space is neutral or negative; that's a variable that can go either way. But you're always going to see that top 60 to 70% be very positive.
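The anatomy Spencer walks through (one key positive attribute, secondary positives, one key negative, then a trailing list of mixed attributes) can be expressed as a simple template. The sentiment scores, thresholds, and phrasing below are illustrative assumptions for predicting what a summary might contain, not anything Amazon has published about its actual generation process.

```python
def build_summary(themes):
    """Assemble a review-summary skeleton from scored themes, following the
    observed anatomy: one key positive, secondary positives, one key
    negative, then a list of mixed attributes.
    `themes` maps attribute name -> average sentiment in [-1, 1]."""
    ranked = sorted(themes.items(), key=lambda kv: kv[1], reverse=True)
    positives = [t for t, s in ranked if s > 0.3]
    negatives = [t for t, s in ranked if s < -0.3]
    mixed = [t for t, s in ranked if -0.3 <= s <= 0.3]
    parts = []
    if positives:
        # Key positive attribute leads and gets the most space.
        parts.append(f"Customers praise the {positives[0]}.")
        if positives[1:]:
            parts.append("They also like the " + ", ".join(positives[1:]) + ".")
    if negatives:
        # Lowest-scored theme becomes the highlighted negative.
        parts.append(f"However, some report issues with {negatives[-1]}.")
    if mixed:
        parts.append("Opinions are mixed on " + ", ".join(mixed) + ".")
    return " ".join(parts)

print(build_summary({
    "appearance": 0.9, "sound quality": 0.5,
    "battery life": 0.0, "stability": -0.8,
}))
```

Running your own theme-level sentiment scores through a template like this is one way to anticipate which attributes are likely to lead, and which will land in the negative slot, once summaries reach your products.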
Tiffany Serbus-Gustaveson 27:58
and the negative. Are you seeing that it really is geared towards the product only, and able to filter out anything that has to do with delivery or damages?
Spencer Kelty 28:08
That's a great question. Yes. I have not yet seen any that mention anything not to do with the product. I've seen one that mentioned packaging, and I think that is probably the furthest away from actual product use I've ever seen one go. Gautam, if you have anything to add there, I'd love to hear it. But in my experience, I've seen nothing to do with shipping, nothing to do with fulfillment, nothing to do with customer support, anything like that.
Gautam Kanumuru 28:41
Yeah, at the end of the day, they want the review summary to be focused on the actual product, so there's definitely purposeful filtering from the Amazon experience side of things in these review summaries. We've definitely seen that. And before you jump into the predicting piece, I think we got a question from Sam about the organic reviews side of things. We haven't seen a reduction in volume of organic reviews, at least on the ones that we've seen with review summaries. I think there are two reasons why we don't think this will be a big thing, even though it actually makes sense. The first one is that the threads of consuming reviews and writing reviews are somewhat separate. Just because you see review summaries, it doesn't necessarily discourage you, if you're already a person that would write reviews, from continuing to write reviews. I think the second thing to keep in mind is that it's an aligning of incentives. Amazon continues to need a free flow of reviews coming in. Yes, Vine may increase in importance, and that's definitely a revenue stream for them, but organic reviews are still the massive majority of this, and they still need a flow of these organic reviews in order for these review summaries to stay fresh and not essentially become stale. So there will be processes that Amazon is definitely looking at to make sure that flow of organic reviews continues to come in, and they'll continue to make sure that's the top priority from that perspective. And actually, one thing worth noting, and we're still gathering full data on this, but with these AI review summaries, it somewhat changes product to product, but they're updating anywhere from a daily basis to a weekly or bi-weekly basis. So these are meant to move very fast; there's definitely a weighting toward more recent reviews here.
And so that is also something to keep in mind: they don't want this to be a thing that only gets updated once or twice a year. This is definitely a free-flowing thing, and again, because it's AI-generated, it's just a scale problem for them more than a human effort problem. And then, on the theme-level recommendations about avoiding the trap of getting pigeonholed in sort of an echo chamber of summary themes, there is some information that we'll dive into on future slides, but just to give you a very quick preview. At the end of the day, what our experience has shown in this space, and we've been in the space of analyzing reviews for a very long time, is that what you put and emphasize in the PDP shows up in reviews. So if you're making body wash, for example, and you're really focused on the scent, and that's really what you're emphasizing, a large majority of your reviews are going to talk about scent. If you're talking about the texture, or how much foam or bubbles come up, you're going to see a disproportionate number of reviews talking about that as well. So there is sort of a self-fulfilling thing where what you emphasize in your marketing messaging will show up in reviews downstream, which, if you're doing well at those things, will show up positively in reviews, and then it becomes a cycle. So yeah, if you don't control the narrative, there is a chance that you just get pigeonholed into something and continue down that path if you don't change. Or you can try to take control of the narrative and really build it around not only what you focus on, but what your product is honestly good at. Obviously, we would all believe or hope that our product is the best at everything and the best for everyone, but just being realistic about that view definitely makes a difference downstream and will show up in these review summaries.
Tiffany Serbus-Gustaveson 32:43
Oh, fantastic. Very cool.
Spencer Kelty 32:46
And that's a great point, Gautam, because review summaries are a lot more a part of the PDP than reviews are. As something that's going to be relatively static in its placement, it's going to appear ahead of reviews. So that feedback loop between what you put in your PDP and what appears in the reviews, and obviously what appears in the reviews is what appears in your review summaries, I think that's going to become even more clear for a lot of brands.
Tiffany Serbus-Gustaveson 33:14
So it's really becoming that bridge between the product detail page and the review process, where in the past it was just, oh, post-purchase, here's a review. Exactly. Very, very cool. All right,
Spencer Kelty 33:28
well, let's look at taking those building blocks that we looked at in the last slide and understanding how to find them. So the first thing to do is to figure out what your product's core attributes or themes are that are appearing inside your reviews. This is really going to be about sorting through your reviews by most recent, going through them by hand, pulling out the attributes or themes that are mentioned, and recording the frequency of the mentions and whether they're positive or negative. Basically, what we're trying to do here is create kind of a map of what things are coming up in reviews the most often, and whether they are contributing positively or pulling you down negatively. Once you have that, you're going to start to build a kind of rough sentiment score around each of these and determine what percentage of the mentions of that attribute are positive. This is really going to let you understand which ones are contributing positively to your reviews and which ones are pulling you down negatively. And then from there, you're going to want to sort these into buckets. So remember from our example in the last slide, there are basically four categories of attributes that appear in these review summaries. There's your standout positive: we want to see if there is a theme or attribute with significantly higher sentiment than anything else, something that stands out. I would say there's no hard rule of thumb here, but you're probably looking for something that's at least 10% more positive than the other things. If there's something that really stands out, that's definitely likely to be your standout positive attribute that gets that two-or-three-sentence feature at the start.
On the same level, if there's a negative attribute or theme that has significantly lower sentiment, or, if not significantly lower sentiment, significantly higher frequency with very low sentiment, that is likely to be your standout negative attribute. Then you're going to pull the ones that are kind of in the middle: the runners-up are going to appear in that list section, both the positive and then the mixed or negative. So from there, you relatively quickly are able to find those four building blocks that really make up pretty much every Amazon review summary you're going to find. Generally speaking, the summaries will have between seven and ten attributes mentioned. I think the lowest I've seen is six, but it's usually somewhere around that eight or nine range. Generally, you're going to see in lower-rated products more of an even split between positive and mixed or negative attributes. If it's a well-rated product, you're generally going to see about a two-thirds/one-third split leaning towards the positive. So you should just naturally find this out as you start looking through these attributes and figuring out which ones come up the most frequently. You'll probably be able to look through, you know, 50 or 60 reviews and find, wow, there are eight attributes that are coming up, and you'll probably see that the ratio is relatively correlated to your star rating. If you have a four-and-a-half-star rated product, in the top 10 you might find two that are mostly negative. But if you have a two-and-a-half-star rated product, you might find that of your top 10, six or seven are negative or mixed. So it's really going to be somewhat eye-opening to see what those attributes are. And I'm sure that most of them aren't going to be a surprise, but what might be a surprise is putting that numerical value on it, you know, what percentage of them are negative or positive, and mapping it out to that.
That anatomy of the summary that we talked about before helps you understand what's really going to get highlighted here and where the attention is going to be put. And one thing to note, back to Gautam's point about the PDP feeding back into reviews: this can really quickly tell you if your PDP is working or not. If you see that your most frequent mentions are negative, and they're things that are mentioned in your PDP, well, maybe it's time to rethink how your product is positioned on the PDP. So all of that to say, that's pretty basic review analysis that you just conducted. We talk a lot about getting into your reviews and building some strategy out of that, finding how to do quick manual review analysis to get some basic answers and kind of dip your toes into the review analysis world. Obviously, anytime you're pulling those reviews in and trying to find data, you're conducting that analysis. So I'll kick it back over to Gautam, as I feel like I've been talking for a long time now.
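The manual bucketing walked through above can be sketched in a few lines of code. This is only an illustration of the frequency-and-sentiment tallying and the four buckets described in the talk, using hypothetical hand-tagged mentions; it is not Amazon's actual summarization logic, and the 10-point standout lead and 60% positive cutoff are the talk's rough rules of thumb, not fixed thresholds:

```python
from collections import Counter

def bucket_attributes(mentions):
    """Sort hand-tagged review mentions into the four summary buckets.

    mentions: list of (attribute, sentiment) pairs, with sentiment in
    {"positive", "negative", "neutral"}, tagged manually from recent reviews.
    Returns (standout_positive, secondary_positives, mixed_or_negative,
    standout_negative).
    """
    freq = Counter(attr for attr, _ in mentions)
    pos = Counter(attr for attr, s in mentions if s == "positive")
    # Percent of each attribute's mentions that are positive.
    pct_pos = {attr: 100 * pos[attr] / freq[attr] for attr in freq}

    ranked = sorted(pct_pos, key=pct_pos.get, reverse=True)
    # Rule of thumb from the talk: a lead of roughly 10 points over the
    # runner-up marks the standout positive that gets the featured sentences.
    standout_pos = None
    if len(ranked) >= 2 and pct_pos[ranked[0]] - pct_pos[ranked[1]] >= 10:
        standout_pos = ranked[0]
    standout_neg = ranked[-1]  # lowest-sentiment attribute

    middle = [a for a in ranked if a not in (standout_pos, standout_neg)]
    secondary_pos = [a for a in middle if pct_pos[a] >= 60]
    mixed_or_neg = [a for a in middle if pct_pos[a] < 60]
    return standout_pos, secondary_pos, mixed_or_neg, standout_neg
```

Running this over 50 or so tagged reviews gives the same map Spencer describes building by hand: which attribute leads, which ones land in the secondary-positive list, and which fall into the mixed-or-negative tier.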
Gautam Kanumuru 38:37
No problem at all. Yeah, at the end of the day, I think what it really is, is that reviews and ratings are a highly unique data source. And it's because of this uniqueness that they've become so influential, not only at the point of purchase; they've become the thing that has to show up on almost every eCommerce website, whether it's direct-to-consumer, a retailer, or a brand's website, and why Amazon is sort of doubling down on it and painting the experience around it. At the end of the day, there are a lot of attributes to this data source, right? It's on demand and available as needed. It's constantly updating, because products don't stay still, experiences don't stay still, and consumer preferences are always shifting. We've already talked about how there's a high level of decision impact related to them. They're public and brand-defining. And the easier way to think about this, I think, is that they're easily accessible to people, right? It's not hard for folks to find reviews; it's not hard to understand the one-to-five-star rating. So there's a low barrier to entry to making use of this data source, and tied to that, it's a direct line to how real customers feel. And then jumping to the last point, which I think is really interesting for both consumers and brands: it's also easy to benchmark against competing products. The information that's available for your products on Amazon is also available for your competitors' products. So, just to drive the example home, here's a scenario we expect to see more often as review summaries become more popular: hey, maybe your brand's product has a 4.2, your competitor's has a 4.4. And because of that, more often than not, people might be purchasing the 4.4-star rated option.
But when you dive into the reviews, what it actually turns out, say it's a toothpaste product, is that your product may not taste as good, but it's much better at whitening your teeth. Users are going to be able to realize that a lot faster with this new experience, and those that care a little bit more about how white their teeth are will actually come back to purchasing your product versus going for the 4.4-star option. And so you being able to understand that, and make decisions on marketing messaging and PDP changes, and even, down the road, product changes and product innovation, is very useful and can lead to a lot of downstream value. But just looking at it from a consumer perspective as well, their ability to easily benchmark against the various options they're choosing from is going to become much easier and much more personalized to their experience.
Tiffany Serbus-Gustaveson 41:40
Yeah, I like it, it ties into that personalization. Everybody wants more of that personalized journey for their customer, and that's a great example of how to do it. Yep, for sure. Go for it, Spencer. Oh, you're muted. Helps if I'm not muted.
Spencer Kelty 42:02
The next question here is: how can we apply this review data in practice? You know, it's great to know that it's there and know that it's valuable, but what do we do with it next? I mean, as we covered before, reviews are often a brand's best form, and at the very least most accessible form, of consumer data, but also competitor data. So start thinking about reviews as a strategic resource. And if you need evidence of this, I think the focus on Amazon review summaries and Google buying guides that are driven by reviews is amazing evidence of that. Seeing that these massive companies that produce so many eCommerce sales are that focused on how reviews contribute to the buying experience means that brands should be equally focused on that: not just as something that happens to drive sales, but as something that can create that loop where you get information back from your customers, use it to inform your PDP, your marketing, and your messaging decisions, and then continue that evolution process to drive better reviews and make better changes. So starting to think strategically about that, not as something that ends once you generate the review, is definitely the first step. After that, you really want to get into figuring out how to start doing review analysis. Now, what we talked about a few slides ago is a great first review analysis experiment. It's not going to take you a whole lot of time to go through 50 reviews and pull out the themes; you might be able to do that in, you know, three or four hours, and figure out in general what your product's Amazon review summary might look like. But does that really drive a lot of impact unless you do it for other products? You want to see what your competitors are looking like, too. What are their summaries going to look like? How are the things their summaries are going to highlight different from yours?
What might drive a purchase decision for your product or their products? Understanding that context is how you start building a strategy out of these individual review analysis actions. Manual analysis can answer a ton of questions. You can answer: what types of situations are coming up most commonly in negative reviews? What's driving your one-star reviews? What are the biggest factors in five-star reviews compared to middling reviews? You can answer those types of questions very quickly with manual review analysis, and if you're looking at recent reviews, you can do it without taking, you know, entire weeks of research and data analysis. But that leads me to the next stage of growing and applying your review data. Oh, we have an error, I am sorry. This is what I get for making last-minute changes to the deck. What this is supposed to say is that the next stage is AI analysis. So just like Amazon is using AI to create data and drive consumer decisions with reviews, you as a brand can do that as well. You can pull all of your data in using an AI analysis tool like Yogi and understand exactly what consumers are saying and how they feel about every single product, attribute, and theme. This is really that next level, because manual analysis is very time-intensive. It's going to take you hours to dig through data, you're ultimately going to have to establish the connections yourself, and you're probably only going to be able to do one layer of analysis. You'll be able to find, in your last 100 reviews, what percentage of them mentioned shipping problems, but you're not going to be able, without spending a ridiculous amount of time, to understand how that's changed over time. How is that different from the previous 200 reviews? How is that different from the competitive landscape? Is our product's packaging or taste performing better or worse than the primary competitors'?
Has that changed in the last two years? So by adding in an AI analysis tool, you're able to really turn this into something that's deep, can give you great answers to complex questions, and can show you how things are changing based on factors like time, the retailer the purchase was made at, even geographical location, things like that.
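The windowed comparison described above, how often a theme comes up in recent reviews versus an earlier period, can be sketched with made-up data and a naive keyword match. This is purely illustrative: the review records, dates, and the substring matching are assumptions for the sketch, whereas real tools like the ones discussed use NLP rather than keyword search:

```python
from datetime import date

def mention_rate(reviews, theme, start, end):
    """Share of reviews dated in [start, end) whose text mentions the theme keyword."""
    window = [r for r in reviews if start <= r["date"] < end]
    if not window:
        return 0.0
    return sum(theme in r["text"].lower() for r in window) / len(window)

# Hypothetical reviews, just to show the shape of the comparison.
reviews = [
    {"date": date(2023, 1, 5), "text": "Packaging arrived crushed"},
    {"date": date(2023, 1, 20), "text": "Great sound, very comfortable"},
    {"date": date(2023, 6, 1), "text": "Sounds great, packaging was fine"},
    {"date": date(2023, 6, 2), "text": "Love these, battery lasts all day"},
    {"date": date(2023, 6, 10), "text": "Decent, though the packaging felt cheap"},
]
january = mention_rate(reviews, "packaging", date(2023, 1, 1), date(2023, 2, 1))
june = mention_rate(reviews, "packaging", date(2023, 6, 1), date(2023, 7, 1))
```

Doing this by hand for one theme and one pair of windows is feasible; doing it across every theme, competitor, retailer, and time slice is exactly the scale problem an AI analysis tool takes over.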
Gautam Kanumuru 46:37
Awesome. Now, yeah, I know we had one question from Jose on the number of reviews before an Amazon summary kicks in. One important thing to note is that review summaries are obviously based on the written reviews, right? If you're familiar with Amazon, they have star-rating-only reviews, where people just put a one-to-five-star rating without any comments, versus the actual written reviews. At the end of the day, the generative AI algorithm requires text in order to be able to summarize things. We've seen review summaries show up for products with, honestly, fewer than 50 written reviews. I think the smallest one that we've seen is about 30 written reviews, and a summary was already created for it. So obviously that isn't a ton of data, but I think it shows, one, that the importance of those early reviews has grown even more, if anything, because they're going to establish the baseline for what your review summary is going to look like, and two, that Amazon is putting a low bar on having these kick in because they want them to show up for as many products as possible. So, I definitely want to be cognizant of time and will jump through this quickly. Using review data: I think we've already talked about the PDP changes and marketing messaging changes that can draw from the analysis you can do with review data. And the point to reinforce around that again is that what you emphasize in your PDPs does end up showing in reviews. And because review summaries are based on what people are writing about, especially in the most recent reviews, it will show up in your review summary. So there is an ability for you, maybe not direct, maybe a little bit indirect, to somewhat control the narrative that will show up in your review summaries.
As long as you're being realistic about what your product is good at and who it's for, versus potentially trying to push a claim that may not be true, because that will work against you, showing up as much more of a negative within your review summaries. But it's important to emphasize that this doesn't just have to stop at PDP changes or marketing messaging changes. We have customers that have used it for marketing claims improvements, who started to realize, hey, the trend is that people are caring more about texture and less about smell, for example, so let's see how we can emphasize texture a lot more in what we're talking about, all the way up to actually making product changes or coming up with entirely new, innovative products when you can start to see gaps in the market: you see not only you but also a few competitors maybe falling short on a specific complaint or a specific attribute that a lot of people are talking about, and that they're starting to talk about more. Because at the end of the day, it's always important to take a step back and understand that these reviews, to a strong extent, are unfiltered feedback from actual shoppers and purchasers. And so the closer and faster and better you can get at listening to them, and putting those changes through your org, whether it's eCommerce, whether it's marketing, whether it's product, the more of a difference it will make downstream. We have data from clients that shows star rating boosts, but I would almost say star rating boosts are the end result, highlighting a better product and a better fit for customers rather than an artificial inflation, if you will. At the end of the day, there's always the fundamental piece, right? Better products will result in more sales. If you are the best product on the market for certain people, you will rise to the top at the end of the day.
Spencer Kelty 50:53
Yeah, just really quick, to drive that point home and emphasize it a little bit more: I think the key there is the right product for the right person. Like Gautam mentioned, and I mentioned earlier, with that PDP-to-review loop, if you're using reviews to make changes to your PDP and understand exactly what consumers are really experiencing in that unfiltered feedback, that's where you're going to be creating a better product-to-consumer match. The right people are going to be buying your product, the people that care about the experiences and attributes that your product excels at. And that goes to what Gautam mentioned: this isn't an artificial bump to your reviews. This isn't going and buying reviews, which might also increase your star rating. This is creating happier customers by making sure that the right people are buying your product, people that are going to have a good experience. And that definitely will trickle back down into your reviews and show that they're having better customer experiences: they're happier, and they're more likely to be repeat customers.
Tiffany Serbus-Gustaveson 51:56
What amazing data for product development teams and quality control. No longer need focus groups; here you go, it's right there on a platter for you. Faster to market. Amazing. We've got a few minutes left. Did we want to go through any more questions? Beautiful segue, I like it, Spencer. A couple of questions that we had come in, and then we are done. So: how is Yogi different?
Gautam Kanumuru 52:23
Yeah, I think that's a great question. At the end of the day, there are two sides to the review story. There's review aggregation or review gathering, and then there's review analysis. So Bazaarvoice is obviously the major player in that first bucket: helping you gather reviews, host them on your website, and syndicate them to other retailers, and they have their promotional systems through Influenster to help you generate promotional reviews. They do have a thin layer on top for analyzing the reviews, but that analysis side is where our focus is, 100%. So we're all about ingesting reviews, not just from Bazaarvoice but from other platforms like PowerReviews and Yotpo, pulling in competitor data, pulling in data from Amazon, which Bazaarvoice doesn't have, and then doing a rich AI analysis on top of it to really give you that in-depth understanding. So we have partners we work with to help generate more reviews and promotional reviews and things like that, but what we're focused on is the analysis piece.
Spencer Kelty 53:37
Just to be a marketer here for a second: I think Gautam mentioned something that's extremely important. If you're here today, it's because you care about Amazon data, and Bazaarvoice just doesn't do anything with the Amazon data. So if I had to highlight the big takeaway, it's that if you care about data and using it to make decisions, especially from Amazon, Bazaarvoice just isn't going to help you on that front.
Tiffany Serbus-Gustaveson 54:04
Yeah. Awesome takeaway. And will you be sharing the presentation? Yeah, absolutely.
Spencer Kelty 54:12
You can expect to get the presentation and the recording as soon as possible afterwards. It might take a couple of days to process.
Tiffany Serbus-Gustaveson 54:19
Perfect. Awesome. Gautam, Spencer, you always bring so much awesome intel and expertise. We love it. So thank you so much for the time and for all the takeaways. Thank you to everybody who has joined us; we always appreciate everybody that's in our network and coming to our events. We hope to see you at future events, and we definitely encourage follow-up conversation with the Yogi team. So with that, it's a wrap. Have a lovely Thursday and upcoming weekend, and hope to see you guys again. Take care.
Spencer Kelty 54:48
Thanks, Tiffany. Thank you, everyone.