Tuesday, 27 December 2022

They're Simply the Best: The Top 25 Moz Blogs of 2022

We published 156 posts on the Moz Blog this year, and as is tradition, it's time to look back at the most popular ones! You’ll find blogs on new findings in social media search, tips for e-commerce SEO, trends in technical and local SEO, and so much more.

Have a safe and happy new year, Moz friends! See you in 2023.

*The top 25 written posts published between January 1 and December 26, 2022, ordered by pageviews generated during that timeframe.


1. How to Win Potential Consumers with Customer Journey Mapping on Google

By Christopher Hofman Laursen | April 20, 2022

If your website is like most others, there is likely a mismatch between the content you provide and what your prospective customers search for on Google. This article is about understanding those potential customers and their conversation with Google by using the customer journey mapping method to provide them with the best content.

2. TikTok SEO: Understanding the TikTok Algorithm

By Lidia Infante | October 26, 2022

In the first chapter of this series, we dug into the search behavior on TikTok and why it should matter to SEOs. In this article, we are going to cover the ins and outs of the TikTok algorithm, and how to leverage it to get more users looking at your brand’s content.

3. Shopify SEO: The Guide to Optimizing Shopify [Updated for 2022]

By Christopher Long | January 25, 2022

Shopify is an increasingly popular platform for e-commerce sites, but it's not fully SEO-friendly out of the box. What's the best way to optimize your Shopify experience for SEO?

4. How to Prep Your SEO Strategy for a New Website

By Adriana Stein | March 16, 2022

Your SEO strategy should be one of the primary considerations before you even start your website. Instead of fighting to make your website SEO-ready later on, start with this holistic SEO checklist for new websites and save yourself valuable time and resources.

5. How to Use Chrome to View a Website as Googlebot

By Alex Harford | August 17, 2022

In this article, Alex shows you how and why to use Google Chrome (or Chrome Canary) to view a website as Googlebot. Viewing a website as Googlebot means we can see discrepancies between what a person sees and what a search bot sees – useful for technical SEO and content audits.

6. How to Optimize for Google's Featured Snippets [Updated for 2022]

By Ann Smarty | November 14, 2022

Google’s featured snippets started as an experiment almost a decade ago. They have since become an integral part of Google’s SERPs, showing up for lots of queries. In fact, featured snippets are now considered organic position #1, so making them part of your SEO strategy is essential to build more traffic.

7. Protect the Hours of Operation on Your GBP from Unwanted Google Edits

By Miriam Ellis | May 9, 2022

Google wants to employ machine learning and AI to alter the hours of operation on twenty million Google Business Profiles as part of their project of creating a “self-updating map”. Google has good reason for pursuing accuracy in their local index, but local business owners have even better reason to be on top of this announcement and proactively safeguard the validity of their own data. 

8. The Top Tech SEO Strategies for 2022 and Beyond

By Crystal Carter | January 27, 2022

Last year was an incredible year for core updates, and for how SEOs improve page quality for users. Moving forward, we can expect to see increased diversification of SERPs — led by developments in Google’s algorithms — and new features from tools like Google Lens. These developments will change how we manage our SEO now and in the future.

9. TikTok SEO: Is TikTok Going to Replace Google?

By Lidia Infante | August 31, 2022

User behavior on TikTok has been evolving as its popularity grows. We’ve seen the app go from dancing teenagers to influencing shopping behavior across the world. Now, the next step for TikTok seems to be turning into the next big search engine. What does it mean for SEOs?

10. 5 Things I Learned About E-A-T by Analyzing 647 Search Results

By Molly Ploe | September 13, 2022

How can SEOs possibly prove to Google, amid all the noise and competition and other experts out there, that their clients deserve a place on Page 1? To find out, Molly compared the top results on hundreds of SERPs to determine what actually proves E-A-T.

11. What Are the Best Tools for Storytelling With Data Visualization?

By Jo Cameron | July 27, 2022

Charts and infographics can be pretty, but if they aren’t also properly breaking down data in a way that makes an impact on the audience, they are likely not worth the time and effort. Below, we discuss how storytelling ties into data visualization, and what tools can help you bring more data into your content.

12. Social Media Competitor Analysis: The Complete Guide

By Sally Ofuonyebi | July 26, 2022

Looking for the steps for performing social media competitor analysis? Here’s a detailed breakdown.

13. How to Improve Organic Clickthrough for Your Content

By Ann Smarty | January 13, 2022

Google search result pages are becoming more diverse and even interactive, which makes any click-through study out there much less reliable, because no two sets of search results are ever the same. So how much control do writers and content creators have over how their content is represented in search? As it turns out, quite a bit!

14. How We Increased Organic Traffic by 65% Using Keyword Research Working Sessions

By Daniel Wood | July 20, 2022

In this blog, Daniel shows you why you should do a keyword research working session with your clients to tap into their expert industry knowledge, and how these sessions helped his team deliver organic traffic growth for one of their new clients with low Domain Authority.

15. Understanding the Google Ads Auction & Why Ad Rank Is Important

By Tanuja Mahdavi | February 16, 2022

When there is a search query on Google, Google Ads runs a quick auction to determine which ads will show for that search query, and what the ad positions should be. This ad auction is repeated every time an ad is eligible to appear for a search term, and is an integral part of the SERP landscape. To help understand it, Tanuja covers the what, how, and why behind the Google Ads auction.

16. How to Earn Topical Authority in 2022 and Beyond

By Zoe Ashbridge | June 8, 2022

Zoe takes a deep dive into topical authority: what it is, how to earn it, and, importantly, how to strategically develop topical relevance.

17. How to Use Keywords to Combine the Power of SEO and Google Ads [Case Study]

By Adriana Stein | July 19, 2022

Both SEO and PPC are used for a common goal — search engine marketing (SEM) — and neither would survive without targeted keywords. Since both strategies have user intent and search demand in mind, you can use them to achieve both short-term and long-term business goals. When approached correctly, using SEO and PPC together can unlock significant opportunities for your brand, so let’s dig in!

18. How We Increased a Client’s Leads by 384% in Six Months by Focusing on One Topic Cluster [Case Study]

By Lydia German | September 26, 2022

Lydia shares the content update process she and her team at Tao Digital Marketing used to generate great results for a client in the financial services niche.

19. The 7-Day Quick Start Guide to SEO + Cheat Sheet

By Cyrus Shepard | October 31, 2022

Unless you work for an agency or want to make a lifelong career out of SEO (an excellent choice), most folks don't learn SEO for the sake of learning SEO, but want the same quick wins that professional SEOs look for. That’s why we created the Quick Start Guide to SEO, which includes seven days of actions to check the SEO health of your site while putting you on the path of sustained improvement. If you want something even more compact, we also created the SEO Quick Start Guide Cheat Sheet.

20. Freshness & SEO: An Underrated Concept

By Christopher Long | July 6, 2022

After working with a news organization and testing the learnings from that work on other sites, Chris and his team started to see the immense power that freshness updates could produce in SEO. In this post, he explains why the entire SEO community has underrated the concept of “freshness”, and how to start optimizing for it.

21. 2022 Local SEO Success: The Year of Everywhere

By Miriam Ellis | January 4, 2022

Take a look back at Miriam's seven local SEO precepts for 2022, including some expert commentary.

22. The Top 5 Soft Skills SEOs Should Develop

By Petra Kis-Herczegh | September 6, 2022

When it comes to SEO, especially technical SEO, we often talk about the importance of hard skills. And while there’s no doubt that vlookup and regex can be your best friends, there are some essential soft skills to learn that will help you excel in your role and progress in your career.

23. How to Do Better, Lazier Keyword Research

By Tom Capper | September 5, 2022

In this post, Tom expands on one of the points from his 2022 MozCon talk: that a lot of time spent on keyword research is wasted. He’ll go over the three main ways SEOs turn what should be an involved piece of strategic thinking into an overly time-consuming routine, along with what to focus on instead.

24. Transitioning to GA4: Is this the Right Analytics Move for Your Team?

By Sam Torres | July 25, 2022

As you've likely heard, Google plans to fully retire Universal Analytics for GA4. Here's what you should know before making any moves.

25. Beginner's Guide to Google Business Profiles: What Are They, How To Use Them, and Why

By Miriam Ellis | October 17, 2022

Google Business Profile is both a free tool and a suite of interfaces that encompasses a dashboard, in-SERP editing, local business profiles, and a volunteer-driven support forum with this branding. Google Business Profiles and the associated Google Maps make up the core of Google’s free local search marketing options for eligible local businesses. In this guide, we’re doing foundational learning! Share this simple, comprehensive article with incoming clients and team members to get off on the right foot with this important local business digital asset.

Friday, 23 December 2022

The Best of Whiteboard Friday 2022

We had an amazing year of Whiteboard Friday episodes, ranging in topic from link building and content engines to, yes, even basketball.

In case you missed them, here are the top 10 episodes from the year!

1. Estimating Search Opportunity with Robin Lord

Estimating the opportunities within your various SEO efforts is an important component of your analytics, not only to help determine where to focus your energy, but also to prove the potential value of your work to others. In this episode, Robin walks you through a good strategy for this all-important estimative work.

2. Advanced On-Page SEO Optimizations with Chris Long

Typically, when SEOs think about on-page optimizations, they’re thinking about core places to include their target keywords within their content. But how can you take your on-page optimizations to the next level and get beyond some of those basic tactics? In this Whiteboard Friday, Chris Long shows you how.

3. Moneyball is the Future of SEO with Will Critchlow

In our first episode of 2022, Will Critchlow shows you how, much like the NBA, SEO is undergoing an analytic revolution — and how you can make the most of it.

4. Top 4 Things to Know About GA4 with Dana DiTomaso

Dana brings you some details on the exciting new world of Google Analytics 4. Watch and learn how to talk about it when clients and coworkers are intimidated by the move.

5. A Content Engine that Drives Revenue with Ross Simmonds

In this episode, content marketing expert Ross Simmonds walks you through his method for creating a content marketing engine that will ultimately make you money, rooted in four simple steps: research, creation, distribution, and optimization.

6. How to Find Your Real SEO Competitors with Lidia Infante

Competitive research and analysis is a critical component of your SEO strategy. You may have an idea of who your business competitors are, but your real SEO competitors are the ones who target the same keywords, speak to the same audience, and solve for the same consumer needs. In this Whiteboard Friday, Lidia Infante walks you through two approaches to find out who those competitors are.

7. How to Measure Content Engagement with Dana DiTomaso

When it comes to content engagement, you can (and should) be measuring more than just page views. Analytics expert Dana DiTomaso summarizes her MozCon 2022 presentation by sharing the four things you should focus on to make sure your metrics are giving you the best picture of your content's quality.

8. Metrics for Better Keyword Research with Tom Capper

Many SEOs think of keyword research as a very basic part of SEO, which can actually be a problem. In this episode, Tom explains some of the common mistakes SEOs make when doing keyword research that are easy to fix, many of which come from metrics like search volume, click-through rate, and difficulty.

9. Visual Search Optimization with Crystal Carter

In this episode, Crystal Carter talks you through the different optimizations that you can make for visual search, and the kinds of results that you might see for visual search content.

10. The Authoritative Content Funnel with Amanda Milligan

Finishing the top 10 list, digital marketing expert Amanda Milligan walks you through the three parts that make up a content funnel for building authority, as well as the types of content that fit into each one.

Wednesday, 21 December 2022

Daily SEO Fix: Monitoring Local Markets

Almost every search we make via Google includes some degree of localization. So, how can we keep an eye on our site’s performance within local markets to ensure we’re continuing to show up in relevant searches? 

In this edition of the Daily SEO Fix, we’ll look at how the Moz suite of tools can help you monitor how your site is performing in local markets.

Tracking Market-Based Rankings

The first step in monitoring local market performance is tracking keyword rankings locally. Within Moz Pro you can track market-based rankings by city name or postal code.

In this video, Emilie will walk through two ways to add locally tracked keywords to your Moz Pro Campaign.


Preview Localized SERPs

When actively monitoring a site’s performance in search results, it can be helpful to view the SERP itself to see how many of your pages are ranking, where they are ranking, and how this compares to your competitors’ performance.

In this video, Emilie will illustrate how to use the Analyze a Keyword tool within Moz Pro Campaigns to view the current, local SERP for your tracked keywords. This tool will show you the top 50 organic results for your locally tracked keywords along with their Domain Authority, Page Authority, and Page Optimization score for further analysis.


Segment Keywords By Market

Now you’re tracking keywords locally and you know how to view the current SERP, but what if you want to see and compare Search Visibility and rankings for multiple markets? Or what if you have markets which include multiple cities or postal codes? For example, let’s say you’ve opened up multiple stores in the San Francisco Bay Area and want to see how your site is performing in that region. There are multiple cities and postal codes within the wider San Francisco Bay Area. How can you monitor performance in this market and compare it to other markets?

In this video, Emilie will show you how to label and segment your keyword data by market within Moz Pro along with how to compare performance by market, side-by-side.


Track Competitors

As Lidia Infante recently noted in her SEO Gap Analysis edition of Whiteboard Friday, “ranking on Google is not ranking in a vacuum. Ranking is outranking your competitors.” So it stands to reason that tracking your competitors' rankings on a local level is an important part of any strategy around monitoring local markets. But how do you do that with the Moz tools?

In this video, Arian will walk through how to view competitor rankings on a local level within your Moz Pro Campaign.


STAT: Accessing the Local Pack Report

Local Packs are local-specific SERP features which feature up to 3 local businesses. They are incredibly competitive and are a critical component of any local SEO strategy. So how can you keep an eye on how your business is showing up in a local pack?

In this video, Emilie will show you how the STAT tools can be instrumental in monitoring local pack performance. She will show you how to set up a Local Pack report right within the tool.


Now that you have the tools to start monitoring local markets, it’s time to get out there and try it for yourself! Be sure to check out the Moz Help Hub and STAT Knowledge Base for additional resources and help. And keep an eye out for our next edition of the Daily SEO Fix.

Tuesday, 20 December 2022

SEO Recap: PageRank

Have you ever wondered how Moz employees learn internally? Well, here’s your chance to get a sneak peek into never seen before, internal webinar footage with Tom Capper! Learning is important at Moz, and the sharing of information amongst employees is crucial in making sure we stay true to our core values. Knowledge sharing allows us to stay transparent, work together more easily, find better ways of doing things, and create even better tools and experiences for our customers.

Tom started these sessions when everyone was working remotely in 2020. It allowed us to come together again in a special, collaborative way. So, today, we give to you all the gift of learning! In this exclusive webinar, Tom Capper takes us through the crucial topic of PageRank.

Video Transcription

This is actually a topic that I used to put poor, innocent, new recruits through, particularly if they came from a non-marketing background. Even though this is considered by a lot of people to be an advanced topic, I think it actually makes sense for people who want to learn about SEO to learn it first, because it's foundational. And if you think about a lot of other technical SEO and link building topics from this perspective, they make a lot more sense and are simpler, and you kind of figure out the answers yourself rather than needing to read 10,000-word blog posts and patents and this kind of thing.

Anyway, hold that thought, because it's 1998. I am 6 years old, and this is a glorious state-of-the-art video game, and internet browsing that I do in my computer club at school looks a bit like this. I actually didn't use Yahoo!. I used Excite, which in hindsight was a mistake, but in my defense I was 6.

The one thing you'll notice about this as a starting point for a journey on the internet, compared to something like Google or whatever you use today, maybe even something that's built into your browser these days, is that there are a lot of links on this page, and mostly they are links to pages that themselves have more links on them. It's kind of like a taxonomy directory system. And this is important because if a lot of people browse the web using links, and links are primarily a navigational thing, then we can get some insights out of looking at links.

They're a sort of proxy for popularity. If we assume that everyone starts their journey on the internet on Yahoo! in 1998, then the pages that are linked to from Yahoo! are going to get a lot of traffic. They are, by definition, popular, and the pages that those pages link to will also still get quite a lot and so on and so forth. And through this, we could build up some kind of picture of what websites are popular. And popularity is important because if you show popular websites to users in search results, then they will be more trustworthy and credible and likely to be good and this kind of thing.

This is massive oversimplification, bear with me, but this is kind of why Google won. Google recognized this fact, and they came up with an innovation called PageRank, which made their search engine better than other people's search engines, and which every other search engine subsequently went on to imitate.

However, is anything I said just now relevant 23 years later? We definitely do not primarily navigate the web with links anymore. We use these things called search engines, which Google might know something about. But also we use newsfeeds, which are kind of dynamic and uncrawlable, and all sorts of other non-static, HTML link-based patterns. Links are probably not the majority even of how we navigate our way around the web, except maybe within websites. And Google has better data on popularity anyway. Google runs a mobile operating system. They run ISPs. They run a browser. They run YouTube. There are lots of ways for Google to figure out what is and isn't popular without building some arcane link graph.

However, be that true or not, there still is a core methodology that underpins how Google works on a foundational level. In 1998, it was the case that PageRank was all of how Google worked really. It was just PageRank plus relevance. These days, there's a lot of nuance and layers on top, and even PageRank itself probably isn't even called that and probably has changed and been refined and tweaked around the edges. And it might be that PageRank is not used as a proxy for popularity anymore, but maybe as a proxy for trust or something like that and it has a slightly different role in the algorithm.

But the point is we still know purely through empirical evidence that changing how many and what pages link to a page has a big impact on organic performance. So we still know that something like this is happening. And the way that Google talks about how links work and their algorithms still reflects a broadly PageRank-based understanding as do developments in SEO directives and hreflang and rel and this kind of thing. It still all speaks to a PageRank-based ecosystem, if not a PageRank-only ecosystem.

Also, I'm calling it PageRank because that's what Google calls it, but there are some other terms you should be aware of that SEOs use. Link equity, I think, is a good one, because it kind of explains what you're talking about in a useful way. Link flow is not bad, but it's alluding to a different metaphor that you've probably seen before, where you think of links as being sent through big pipes of liquid that then pour in different amounts into different pages. It's a different metaphor to the popularity one, and as a result it has some different implications if it's overstretched, so use some caution. And then link strength: I don't really know what metaphor that one is trying to use, but it doesn't seem as bad as "link juice," so it's fine, I guess.

More importantly, how does it work? And I don't know if anyone here hates maths. If you do, I'm sorry, but there's going to be maths.

So the initial question, or the foundation of all this, is: imagine that the whole internet is represented in this diagram, and that there's only one web page, which means this is 1970-something, I guess. A, in the red box here, is that web page. What is the probability that a random browser is on this page? We can probably say it's one, or something like that. If you have some other take on that, it kind of doesn't matter, because everything else is just going to be based on whatever number that is. From that, though, we can try to infer some other things.

So whatever probability you thought that was, and let's say we thought that if there's one page on the internet, everyone is on it, what's the probability a random browser is on the one page that A links to? So say that we've pictured the whole internet here: A is a page that links to another page, which links nowhere. And we started by saying that everyone was on page A. Well, what's the probability now, after a cycle, that everyone will be on this second page? We go with the assumption that there's an 85% chance, and the 85% number comes from Google's original 1998 white paper. So there's an 85% chance that they go onto this one page in their cycle, and a 15% chance that they do one of these non-browser-based activities. The reason we assume that there's a chance on every cycle that people exit to do non-browser-based activities is that otherwise we get some kind of infinite cycle later on. We don't need to worry about that. But the point is that if you assume people never leave their computers and just browse through links endlessly, then you end up assuming eventually that every page has infinite traffic, which is not the case.

That's the starting point, where we have this really simple internet: we have a page with a link on it and a page without a link on it, and that's it. Something to bear in mind with these systems is that, obviously, real web pages don't have just one link on them, and web pages with no links on them at all, like the one on the right, are virtually unheard of. This gets really complex really fast. If we tried to make a diagram of just two pages on the Moz website, it would not fit on the screen. So we're talking about really simplified versions here, but it doesn't matter, because the principles are extensible.

So what if the page on the left actually linked to two pages, not one? What is the probability now that we're on one of those two pages? We're taking that 85% chance that they move on at all without exiting, because the house caught fire, they went for a bike ride or whatever, and we're now dividing that by two. So we're saying 42.5% chance that they were on this page, 42.5% chance they were on this page, and then nothing else happens because there are no more links in the world. That's fine.

What about this page? So if this page now links to one more, how does that page's strength relate to page A? So this one was 0.85/2, and this one is 0.85 times that number. Note that we are diluting as we go along, because we've applied that 15% deterioration at every step. This is useful and interesting to us because we can imagine a model in which page A, on the left, is our homepage and the page on the right is some page we want to rank, and we're diluting with every step we have to jump to get there. And this is crawl depth, which is a metric exposed by Moz Pro and most other technical SEO tools. That's why crawl depth is something people are interested in: part of it is discovery, which we won't get into today, but part of it is also this dilution factor.

And then if this page actually linked to three, then again, each of these pages is only one-third as strong as when it only linked to one. So it's being split up and diluted the further down we go.
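To make the dilution arithmetic above concrete, here's a minimal sketch in Python. It's only a toy model of the idea described in this talk, not Google's actual algorithm: each hop keeps roughly 85% of the value, and each page splits what it passes on evenly across its outgoing links.

```python
# Toy model of PageRank dilution: 85% retained per hop, split evenly across outlinks.
DAMPING = 0.85  # the 85% figure from the original 1998 PageRank paper

def value_passed(outlink_counts_along_path):
    """Estimate the fraction of the starting page's strength that reaches
    the end of a path. `outlink_counts_along_path` lists how many links
    each page along the path has, starting with the first page."""
    value = 1.0
    for outlinks in outlink_counts_along_path:
        value *= DAMPING / outlinks
    return value

print(value_passed([1]))     # one page, one link: 0.85
print(value_passed([1, 1]))  # two hops of one link each: 0.85^2 = 0.7225
print(value_passed([3]))     # one page with three links: 0.85 / 3 ≈ 0.283
```

The same arithmetic applies to anything that adds an extra hop, which is why the redirect and canonical examples discussed a little later end up at 0.85 squared.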

So that all got very complicated very quickly on a very simple, fictional website. Don't panic. The lessons we want to take away from this are quite simple, even though the math becomes very arcane very quickly.

So the first lesson we want to take away is that each additional level of link depth dilutes value. We talked about the reasons for that, but obviously it has implications for site structure. It also has implications for some other things, some other common technical SEO issues that I'll cover in a bit.

So if I link to a page indirectly, that is less effective than linking to it directly, even in a world where every page only has one link on it, which is obviously an idealized scenario.

The other takeaway is that more links means each link is less valuable. So with every additional link you add to your homepage, you're reducing the effectiveness of the links that were already there. This is very important, because if you look at a lot of sites right now, you'll find 600-link mega navs at the top of the page and the same at the bottom of the page and all this kind of thing. And that can be an okay choice. I'm not saying it's always wrong, but it is a choice, and it has dramatic implications.

Some of the biggest changes in SEO performance I've ever seen on websites came from cutting back the number of links on the homepage by a factor of 10. If you change a homepage so that it goes from linking to 600 pages to linking to the less than 100 that you actually want to rank, that will almost always have a massive difference, a massive impact, more so than external link building could ever dream of because you're not going to get that 10 times difference through external link building, unless it's a startup or something.

Some real-world scenarios. I want to talk about basically some things that SEO tools often flag, that we're all familiar with talking about as SEO issues or optimizations or whatever, but often we don't think about why and we definitely don't think of them as being things that hark back quite so deep into Google's history.

So a redirect is a link: that fictional idea of a page with exactly one link on it is essentially a redirect, because a redirect is just a page that links to exactly one other page. So in this scenario, the page on the left could have linked directly to the page on the top right, but because it didn't, we've got this 0.85 squared here, which is 0.7225. The only thing you need to know about that number is that it's smaller than 0.85. Because we didn't link directly, and instead went through this page here that redirects, which doesn't feel like a link but is a link in this ecosystem, we've arbitrarily decided to dilute the page at the end of the chain. And this is, obviously, particularly important when we think about chained redirects, which is another thing that's often flagged by SEO tools.

But when you look at an issue report in something like Moz Pro and it gives you a list of redirects as if they're issues, that can be confusing, because a redirect is something we're also told is a good thing. If we have a URL that's no longer in use, it should redirect. But the reason that issue is being flagged is that we shouldn't still be linking to the URL that redirects. We should be linking directly to the thing at the end of the chain. And this is why: it's because of this arbitrary dilution that we're inserting into our own website, which is basically just a dead-weight loss. If you imagine that in reality pages tend to link back to each other, this becomes a big, complex web of cycles, and I think this is where the flow metaphor comes in: people can imagine a flow of buckets that drip round into each other but leak a little bit at every step, so you get less and less water unless there's some external source. If you imagine these loops going back around, then inserting redirects is just a dead-weight loss. We've drilled a hole in the bottom of a bucket.

So, yeah, better is a direct link. Worse is a 302, although that's a controversial subject, who knows. Google sometimes claim that they treat 302s as 301s these days. Let's not get into that.

Canonicals are very similar from a PageRank perspective. The canonical tag is actually a much later addition to search engines, but it's basically equivalent to a 301 redirect. So imagine I have a website that sells live badgers, for some reason, in different colors, and I have two URL variants of my badger e-commerce page filtered to brown: one you can reach by going to badgers?colour=brown, and one without any parameters. I've decided that the one without any parameters is the canonical version, literally and figuratively speaking. If the homepage links to it via the parameter page, which then has a canonical tag pointing at the correct version, then I've arbitrarily weakened the correct version versus what I could have done, which would be the direct link through. Interestingly, if we do have that direct link through, note that the parameter page now has no strength at all. It has no inbound links, and it probably wouldn't get flagged as an error in the tool, because the tool wouldn't find it.

You'll notice I put a tilde before the number zero. We'll come to that.

PageRank sculpting is another thing that I think is interesting because people still try to do it even though it's not worked for a really long time. So this is an imaginary scenario that is not imaginary at all. It's really common, Moz probably has this exact scenario, where your homepage links to some pages you care about and also some pages you don't really care about, certainly from an SEO perspective, such as your privacy policy. Kind of sucks because, in this extreme example here, having a privacy policy has just randomly halved the strength of a page you care about. No one wants that.

So what people used to do was use a link-level nofollow. The idea, and it worked at the time, and by "at the time" I mean like 2002 or something, though people still try this on new websites today, was that the link-level nofollow effectively removed this link, so it was as if your homepage only linked to one page. Great, everyone is a winner.

Side note on something I mentioned before: no page actually has zero PageRank. A page with no inbound links has, in the PageRank model, a PageRank of one over the number of pages on the internet. That's the seeding probability: before everything starts cycling round and settling on the stable equilibrium PageRank, the model assumes there's an equal chance you're on any page on the internet. One divided by the number of pages on the internet is a very small number, so we can think of it as zero.
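Since we've now covered the damping factor, the even split across outlinks, and the 1/N seeding, here's a minimal power-iteration sketch of the classic model on a hand-made toy graph. This is an illustration of the textbook formulation only, not anything Google runs today, and the page names are made up.

```python
# Toy power-iteration PageRank: 1/N seeding, 15% "exit" probability per cycle.
DAMPING = 0.85

def pagerank(links, iterations=50):
    """`links` maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # equal chance of starting anywhere
    for _ in range(iterations):
        new_rank = {page: (1 - DAMPING) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # dead ends simply leak value in this simplified sketch
            share = DAMPING * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

toy_graph = {
    "home": ["products", "privacy-policy"],
    "products": ["home"],
    "privacy-policy": ["home"],
}
print(pagerank(toy_graph))
```

Running it on this tiny graph shows the homepage accumulating most of the strength while the two child pages split what it passes on evenly, which is exactly the situation the privacy policy example above describes.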

This was changed, though: the link-level nofollow hack was itself neutralized a very, very long time ago, such that if you use a link-level nofollow, and by the way this is also true if you use robots.txt to do this, the second link still gets counted when we divide by two, so as far as the math is concerned there's still an equal chance that you go to either of these pages. This page still gets that reduction, because it was one of two links, but the page at the bottom now has no strength at all, because it was only linked through a nofollow. So if you do this now, it's a worst-of-both-worlds scenario. And you might say, "Oh, I don't actually care whether my privacy policy has zero strength." But you do care, because your privacy policy probably links through the top nav to every other page on your website. So you're still doing yourself a disservice.

Second side note: I said link-level nofollow, meaning nofollow in the HTML as an attribute of a link. There is also page-level nofollow, which I struggle to think of a single good use case for. Basically, a page-level nofollow means we are going to treat every single link on this page as nofollow, so we're just going to create a PageRank dead-end. This is a strange thing to do. Sometimes people use robots.txt, which basically does the same thing. If I block this page with robots.txt, that's the same in terms of the PageRank consequences, except there are other good reasons to do that: I might not want Google to ever see this, or I might want to prevent a massive waste of Google's crawlers' time so that they spend more time crawling the rest of my site, or something like this. There are reasons to use robots.txt. Page-level nofollow means we're going to create that dead-end, but also waste Google's time crawling the page anyway.

Some of the extreme scenarios I just talked about, particularly the one with the privacy policy, changed a lot for the better for everyone in 2004 with something called reasonable surfer, which you occasionally still hear people talking about now, but mostly implicitly. It is probably an under-discussed topic, one that too few people keep in mind.

So these days, and by these days I mean for the last 17 years, if one of these links was that massive call to action and another was in the footer, like a privacy policy link often is, then Google will apply some sense and estimate the chance people click on each one. Google was trying to figure out probabilities here, remember. So it will split this, say, 0.9 and 0.1; the two still have to add up to 1, but the split is done in a more reasonable fashion. They were doing that a long time ago, and they've probably got very, very good at it by now.
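A minimal sketch of that adjustment, again purely illustrative: instead of splitting a page's value evenly across its links, we weight each link by an estimated click probability. The 0.9 / 0.1 split is just the example number from the whiteboard, not a known Google value.

```python
# Toy "reasonable surfer" weighting: links get value in proportion to estimated clicks.
DAMPING = 0.85

def weighted_link_value(page_strength, click_weights):
    """`click_weights` maps each outgoing link to an estimated click
    probability; the weights are expected to sum to 1."""
    return {link: DAMPING * page_strength * weight
            for link, weight in click_weights.items()}

print(weighted_link_value(1.0, {"big-call-to-action": 0.9, "privacy-policy": 0.1}))
# {'big-call-to-action': 0.765, 'privacy-policy': 0.085}
```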

Noindex is an interesting one because, traditionally, you would think that has nothing to do with PageRank. So, yeah, a noindex tag just means this should never show up in search results, this page at the bottom, which is fine. There are some valid reasons to do that. Maybe you're worried that it will show up for the wrong query that something else on your site is trying to show up for, or maybe it contains sensitive information or something like this. Okay, fine. However, when you put a noindex tag on something, Google eventually stops crawling it. Everyone sort of intuitively knew all the pieces of this puzzle, but Google only acknowledged that this behavior is what happens a couple of years ago.

So Google eventually stops crawling it, and when Google stops crawling it, it stops passing PageRank. So noindex,follow, which we used to think was quite a good thing to do for something like an HTML sitemap page: clearly you don't want an HTML sitemap page to show up in search results, because it's kind of crap and a poor reflection on your site and not a good UX, but it is, or so we thought, a good way to pass equity through to a bunch of deep pages. It turns out probably not. In the long run, it was equivalent to that worst-case scenario, the page-level nofollow, that we talked about earlier. And again, this is probably why noindex is flagged as an error in tools like Moz Pro, although often it's not well explained or understood.

My pet theory on how links work is that, at this stage, they're no longer a popularity proxy because there's better ways of doing that. But they are a brand proxy for a frequently cited brand. Citation and link are often used synonymously in this industry, so that kind of makes sense. However, once you actually start ranking in the top 5 or 10, my experience is that links become less and less relevant the more and more competitive a position you're in because Google has increasingly better data to figure out whether people want to click on you or not. This is some data from 2009, contrasting ranking correlations in positions 6 to 10, versus positions 1 to 5. Basically, both brand and link become less relevant, or the easily measured versions become less relevant, which again is kind of exploring that theory that the higher up you rank, the more bespoke and user signal-based it might become.

This is some older data, where I basically looked at to what extent you can use Domain Authority to predict rankings, which is this blue bar, to what extent you could use branded search volume to predict rankings, which is this green bar, and to what extent you could use a model containing them both to predict rankings, which is not really any better than just using branded search volume. This is obviously simplified and flawed data, but this is some evidence towards the hypothesis that links are used as a brand proxy.
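For anyone who wants to poke at that kind of relationship in their own keyword data, here's a hedged sketch of how such correlations might be computed, assuming you've exported ranking positions alongside Domain Authority and branded search volume. The numbers below are invented purely for illustration and are not the data behind the charts discussed above.

```python
# Rank-correlation sketch using SciPy; the arrays below are made-up example values.
from scipy.stats import spearmanr

rank_position         = [1, 2, 3, 4, 5, 6, 7, 8]
domain_authority      = [92, 88, 75, 81, 60, 55, 58, 40]
branded_search_volume = [50000, 61000, 22000, 18000, 9000, 4000, 6500, 1200]

# A strong *negative* correlation is the interesting result here: higher DA or
# branded volume associated with a lower (i.e. better) ranking position.
print(spearmanr(rank_position, domain_authority))
print(spearmanr(rank_position, branded_search_volume))
```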

Video transcription by Speechpad.com

Monday, 19 December 2022

12 Local Search Developments You Need to Know About from Q4 2022

Hard to believe but neither Q1, nor Q2, nor Q3 can equal Q4 for the activity we’ve seen in local search, and the quarter isn’t even quite over yet! For all I know, Google could celebrate New Year’s Eve by renaming Google Business Profiles “Google Plus Places My Business Profiles Merchant Experience Listings” and we would just have to roll with that, too.

There has been so much going on, it’s small wonder if you haven’t caught every development, but here’s a list of some of the most interesting ones you should be aware of as we look towards 2023. We’ve got one new interface, two awful bugs, three new GBP features, four review developments, several guideline updates … all that’s missing is the partridge in the pear tree!

The new merchant experience breaks upon us like a thunderclap

Watch this space for a vast post from me on the NMX in the next few weeks, but for now, you need to know that editing via the old Google Business Profile Manager dashboard is a thing of the past, and you'll need to manage your data in the new in-SERP interface that's been dubbed the "New Merchant Experience".

About a year ago, Google warned us that this was coming, but it must be said that they also intimated that this change would only impact single location businesses. In Q4, a hue and cry understandably went up from the local SEO community when everyone – including multi-location listings managers – woke up to find themselves summarily escorted out of the old dashboard and into the SERPs for management.

The good folks at Bright Local, Streetfight, and Online Ownership have done a great job of early reporting on the frustrations and discoveries surrounding the NMX, and for the most part, have concluded that if you click around enough within the new interface, you will relocate most features. A few new surprises have been noted so far. For example, Q&A is now part-and-parcel of the interface instead of being treated as a separate instance:

And check out Khushal Bherwani’s tweet capturing Google tagging the previous location of a business under the “locations” tab of the NMX when the company’s location has been changed.

We can get used to the NMX. We should also expect changes to it in the new year, but at the moment, my most interesting industry takeaway from the deprecation of the historic dashboard is that listings management software just became more appealing. BrightLocal’s informal poll captures how clunky many users will find the act of trying to manage listings amid the clutter of the organic SERPs:

In recent years, there has been some debate about whether local businesses should pay for listings management software. Google's latest move is making experts like Mike Blumenthal and Carrie Hill say "yes" if you've got multiple listings and require the calm and quiet organization of a dedicated listings management dashboard instead of the awkward mess of the NMX.

Google bugs only an entomologist could love

David Mihm captures some of the industry angst many are feeling right now as a result of multiple Google bugs making work needlessly difficult for us in recent months. For the record, I love and appreciate all insects, but Google’s listing suspension spree has been about as fun as finding potato bugs in one’s bathtub. When even the smallest of normal edits to listings (like writing a post or editing a description) results in suspension, it can make local SEOs and local business owners very leery of keeping their listings updated:

Fortunately, about one month after reports of suspensions began flooding fora, Joy Hawkins announced the good news that Google had apparently resolved this bug.

This is a good time and place to mention Amy Toman’s reinstatement request tip:

Also, Colan Nielsen spotted what appears to be a new notification from Google in the NMX showing how long it should take for your edits to be reviewed:

Meanwhile, a second bug began chasing us all around the local picnic table in the form of a big wave of review loss. If you’ve recently lost a ton of reviews, Mike Blumenthal has done outstanding investigative reporting at Near Media on this latest aggravation, including his finding that Google had been auto-updating Google Business Profiles and changing their CID numbers right before reviews were thrown out. As he says,

“Changing the CID and losing reviews with a Suggested Edit update is a new and disturbing bug…You should always capture and store your Google CID and Place ID somewhere safe. Gatherup's Google Review Link Generator Chrome extension helps you get those numbers easily as does Pleper's free Google CID converter.”

I also highly recommend reading Mike’s article examining the difference between a review bug and a review filter and outlining steps you can take in the wake of review loss.

New Google features we don’t dislike

Barry Schwartz captured a new feature test that several people had noticed in which a speaker icon reads out the name and category of the local business. I’m not sure where Google is headed with this, but I am a fan of audio features as an alternative to too much screen time.

Stefan Somborac notes a nifty feature, referenced in this Google help doc, that lets dining establishments select their preferred menu. Also new for restaurants, Abner Li wrote up the “Nearby Dishes” US rollout from Google that can return a carousel of local options to you when you search for something like “pho near me”. I have yet to see this feature in the wild, but Abner’s article has screenshots.

Good review things!

Darren Shaw was jubilant at finding something truly new in local SEO – this time, a notation of the number of times a Google reviewer profile had published reviews in a specific city. This sparked a great discussion between Greg Sterling and Mike Blumenthal as to whether this signal will actually boost the authenticity of Google-based reviews, or whether location is too easy to spoof. I like this feature because it adds some transparency to the fact that Google is tracking your location when you leave a review – which might come as a surprise to some users. Perhaps this might be a minor deterrent to some forms of review spam?

Next up, an amazing find from Christina LeVasseur Brodzky for the hospitality industry: Google quantifying the positive and negative sentiment associated with place topics within reviews. In her example, 72% of reviewers favorably mentioned the bar at a hotel, while 17% were not so favorable. This rollout showcases the deepening levels Google is reaching in sentiment analysis.

And to round up review developments, Q4 saw the publication of two major review studies. Moz's own, The Impact of Local Business Reviews on Consumer Behavior, will take you through three chapters of insights into the habits of review readers, review writers, and successful owner responses based on a large-scale survey. Meanwhile, the good folks at SOCi have a gated report on The State of Google Reviews based on an analysis of nearly five million reviews. Also, the team over at Sterling Sky has been publishing a series of small and interesting studies on the impacts of review recency, number, text, and diversity on local pack rankings. If Google will just stop accidentally deleting local business reviews and let us get on with things, all of these reports will seriously power up your reputation strategy for 2023.

Guideline updates should always be noted

In the aforementioned Moz review survey, we learned that the next step 51% of consumers take after reading reviews will land them on local business websites. Given this, it’s quite relevant to local business owners and marketers that Google has replaced its historic Webmaster Guidelines with the overhauled, rebranded Google Search Essentials. This would be a good time to read through the refreshed guidelines to be sure your website is being understood by people and search engines alike.

And finally, Colan Nielsen took note of Google adding a stern warning against review gating back into their Prohibited and Restricted Content guidelines. In sum, don’t ever ask customers to specifically leave you a positive review, don’t use software that weeds out negative sentiment, and if you publish first-party reviews on your own website, don’t show only the good stuff. Be honest and authentic, and you should be fine.

And that puts a bow on local SEO 2022! I want to thank everyone who has read this new quarterly series this year and who has tweeted it, blogged about it, included it in your newsletters, and discussed it on your podcasts. Warmest gratitude, as well, to each of the local SEO community members giving time every month of every year to freely sharing your discoveries with all of us. I hope to continue this series in 2023 and to keep learning local with all of you!

Friday, 16 December 2022

How to Optimize Existing Content for Featured Snippets — Whiteboard Friday

In our last episode of 2022, Crystal discusses how featured snippets show up in several different parts of the SERP, giving you lots of good value for organic reach, and how to claim those opportunities for your existing content.

whiteboard outlining tips for winning featured snippets with existing content

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Hi, my name is Crystal Carter. Welcome to my Whiteboard Friday. I am the Head of SEO Communications at Wix, and today I'm going to talk to you about featured snippets. Specifically, I'm going to talk to you about how you can get featured snippets for existing content.

Now, before I get into that, I'd like to talk to you about why we should be thinking about featured snippets for existing content, and that's because featured snippets show up in lots of different parts of the SERP. So they give you lots of good value for organic reach.

If you're not sure what a featured snippet is, if you Google something like "what is a featured snippet," you actually get a featured snippet, which is a sort of extract of text from a website. And underneath it, it'll say that it's a featured snippet, and that's how you know it's a featured snippet.

Now, the content from a featured snippet can show in lots of different parts of search. It will show in the neat featured snippet itself, and it might include an image, a paragraph, or some other elements, which I'll go into a little bit later. But it also might show in a featured snippet dropdown. Featured snippets are constantly changing; Google is adding new features and trialing new features all the time. But one of the features we've seen, for instance if you look up "what is a featured snippet," is where it has the main paragraph, which says what a featured snippet is, and then it breaks it down further, into things like how to get a featured snippet and how to optimize for it. So it will break it down into different sections, and under each section there's more content, which, if you click on it, leads you to another featured snippet.

There's also the People Also Ask section of Google. So under People Also Ask, there might be more questions about a given topic, particularly if you're searching for a head term. Like, say, I don't know, shoes: there might be "What shoes should I buy?" or "Where can I get shoes?" or "Is a sandal a shoe?" That sort of thing might show up in People Also Ask. Those often contain featured snippets as well.

You might also get a featured snippet when you make a voice query. You might also get a featured snippet when you search on Google Lens.

You might also get the content from a featured snippet within the "From sources across the web" accordions, which show up for certain topic queries. So, for instance, if you were to query something like "Seven Wonders of the World," you might get something that says "From sources across the web" and has, say, the Pyramids of Giza. And then if you click on that, there'll be lots of little bits of content, and those will often be pulled from featured snippets.

You may also see a featured snippet under a knowledge panel dropdown. So, for instance, if you were to look up, again, something like the Pyramids of Giza, it might have a section about different parts of that particular query, and those will include different dropdowns, and under those dropdowns, you will often see featured snippet content. And, of course, the knowledge panel will very often pull content from featured snippets.

So, with all of that opportunity to gain organic reach and to gain organic visibility, it's really worth optimizing content for featured snippets.

Now, if you have content that you think has the potential to get a featured snippet, i.e., that it appeals to some good, juicy long tail keywords, and that it's of high quality, you may very well think, "Oh, this particular search result that this page is ranking for has a featured snippet. Maybe I could get it." If you're not sure where to start about how to get on that featured snippet, then here's a few tips that tend to work.

So this is a little bit of a decision tree, and I'm just going to talk you through a little bit of it. The first thing you should think about is your formatting, so headers. This isn't rocket science; this is classic SEO tooling. So your headers, your H1s, your H2s, your H3s: are they relevant? Are they present? Are they breaking down the content in lots of good ways? If they are not, then you should update that. Okay, if they are, if you do have headers, H1s, H2s, then make sure that they've got relevant terminology in them. We can think keywords, but you can also think more naturally, like natural language. Natural language processing is becoming a lot more intuitive within Google, and they're able to understand lots of nuances of language. So make sure that you're including something that's relevant, that continues to tell the story of whatever topic you're covering.

Another thing that you can think about, and this also helps you with People Also Ask, is whether or not you can phrase the headers as questions. So, for instance, if you were to think about the pyramids, you might say, "When is a good time to visit the pyramids?" That might be something you want to include. If you can write a header as a question, sometimes that can help with queries within voice search and also with People Also Ask.

Now, once you've done all of that, you want to go to the next thing. So if you go through headers and you say, "Yes, we've done all of this," then we look at facts and data sets. When I say facts and data sets, I don't mean massive, huge data sets and 17 spreadsheets with lots of tabs and VLOOKUPs and all that stuff. I mean lists. For instance, if you were to say, "Should I wear sandals or should I wear sneakers," that would be something you could compare: is the weather like this or like that, is this option like that one? That's a set of information you could put in a table, for instance. Or if you were to say, "What different types of shoes are there: sandals, heels, trainers, whichever," then that's also a type of data set. So if you have data, if you have a list within your content, then you might want to think about whether or not you could include more lists in your content.

If you are doing that, then you want to think about including HTML lists. So not just listed within a paragraph, but also listed out specifically within lists. These can be ordered lists or unordered lists. An unordered list within HTML has a tag of ul, and you would be able to mark that up as bullet points. If you're able to put in an ordered list, one, two, three, four, like top 10 tips for XYZ, then you would be able to put that in as an ordered list, and that's something you could put in your HTML. What happens with Google's featured snippets is that sometimes they pull out the list and put it into the featured snippet there. So if you have it in your content, then you're more likely to show for featured snippets.

Another thing you can think about is tables. I mentioned tables a moment ago. Google will sometimes show a table right in the SERP within a featured snippet. So if you have content that could be put into a table, it's worth thinking about that as well.

So when you're thinking about your data sets and when you're thinking about the data and the information that you have in your content, think about if there's a way that you can make it more snackable with ordered lists. Think about if there's a way that you can make it more snackable with tables. If you haven't done that yet, then you should update your content so that it includes that. If you have done that, then you can move on to the next step.
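If you want a quick way to check an existing page against the formatting points above, here's a minimal sketch using the third-party requests and beautifulsoup4 packages. The URL is just a placeholder, and the counts it prints are only a starting point for a manual review.

```python
# Quick content-format audit: headings, question-style headings, HTML lists, tables.
import requests
from bs4 import BeautifulSoup

def audit_page(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    headings = [h.get_text(strip=True) for h in soup.find_all(["h1", "h2", "h3"])]
    question_headings = [h for h in headings if h.endswith("?")]

    return {
        "headings": len(headings),
        "question_headings": len(question_headings),
        "ordered_lists": len(soup.find_all("ol")),
        "unordered_lists": len(soup.find_all("ul")),
        "tables": len(soup.find_all("table")),
    }

print(audit_page("https://example.com/badger-buying-guide"))
```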

So the next step is to think about relevant pictures. Does the content contain relevant pictures? This can be an illustration, or it can be photos of whatever it is you're talking about. Sometimes when I speak to people about this, they say, "Oh, well, we're in a vertical that doesn't have a lot of photos." That's fine. That's okay. You can make infographics as well. You can make diagrams as well. Those can help you with featured snippets. They can also help you with lots of other parts of the SERP. One of the features that shows up very regularly within featured snippets is an image carousel. Sometimes Google will mix and match the content in them. So, for a featured snippet, you might see a paragraph from one website and an image from another one. So even if you have the featured snippet, you might not have all of the elements of the featured snippet.

Now, if you do have an image in your content, then you're more likely to show for both the paragraph and the image in the featured snippet. So if you don't have images in your content, you should add them. If you're not able to take a photo, then you should create a diagram that is relevant to your content.

The other thing you want to think about is making sure that any images in your content include relevant attributes, like your alt text, your title attributes, your file names, and your file formats, so that they can be crawled and indexed correctly. You also want to think about entities: make sure any images on your website contain relevant entities, which can also help you with visual search. Shout-out to my last Whiteboard Friday on that.
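
As a rough illustration, here's what those attributes might look like on a single image; the file name, alt text, and dimensions below are hypothetical, but the idea is that a descriptive file name and alt text describe the entity in the picture, which helps the image be crawled, indexed, and understood.

```html
<!-- Hypothetical file name, alt text, and dimensions, purely for illustration -->
<img
  src="/images/great-pyramid-of-giza-at-sunrise.webp"
  alt="The Great Pyramid of Giza at sunrise, seen from the western plateau"
  title="Great Pyramid of Giza at sunrise"
  width="1200"
  height="800"
  loading="lazy">
```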

If you've done all of that, then you want to think about making sure that you have schema markup on your images, because that will also give you another element. Now, just a slight sidebar, schema markup is not a part of the criteria for featured snippets. You do not need to have schema markup on your website to get a featured snippet. However, anecdotally, a lot of websites which have schema markup tend to perform well for featured snippets. So what the schema does is help Google to understand your image better, which makes it more likely to show in image search, which makes it more likely to show in a featured snippet.
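
As noted above, schema markup isn't required for featured snippets, but if you do add it, one common approach is schema.org ImageObject markup in JSON-LD. This is only a minimal sketch; the URLs, names, and creator below are hypothetical examples.

```html
<!-- Minimal sketch of schema.org ImageObject markup; all URLs and text are hypothetical -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ImageObject",
  "contentUrl": "https://www.example.com/images/great-pyramid-of-giza-at-sunrise.webp",
  "name": "Great Pyramid of Giza at sunrise",
  "description": "The Great Pyramid of Giza photographed at sunrise from the western plateau",
  "creator": {
    "@type": "Person",
    "name": "Example Photographer"
  }
}
</script>
```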

So if you don't have these things in place, then you should add those to your content as well. If you do, then I would say it's worth keeping an eye on your featured snippets and on your content to see if there are any of these things that can be optimized as you go along.

Once you've done that, it's worth checking where your content is being distributed: whether you're showing for a featured snippet, whether you're showing for People Also Ask, and whether there are other parts of the SERP that you might be featured on, so that you can potentially build on that either with additional content or by enriching this content further.

Those are my recommendations for how to optimize existing content for featured snippets.

Video transcription by Speechpad.com

Wednesday, 14 December 2022

13 Age-based Local Business Review Preferences You Can Serve

Image credit: Mitchell Joyce

Today, we’ll be learning more about customer preferences by age group surrounding local business reviews, taking a deeper dive into some of the data from The Impact of Local Business Reviews on Consumer Behavior | SEO Industry Report. In our initial report, we covered the leading characteristics of customers as a whole, but here, we’ll surface some intriguing differences that appeared when we segmented survey responses by age.

I want to preface this by stating that age discrimination of every kind is unacceptable. I’m not a fan of the fight over crumbs that underlies divisive and disrespectful slogans involving “okays” and “boomers” or “millennials” and “avocado toast”. Particularly in the US, these types of groupings only serve to divide and dishonor friends, family, and neighbors. Instead, let’s look with respect at the preferences of local business customers when it comes to reading and writing reviews so that we can operate and market local brands to suit the needs and tastes of lots of people in our communities. Honoring everyone is the best basis for great customer service.

Similar review habits and preferences

Image credit: Steve Bailey

Breaking down the survey by age groups of 18-29, 30-60, and 61+, we saw more commonalities than differences in behaviors and preferences surrounding reviews. For example:

  • About ⅓ of all three groups say their commonest habit is to read reviews on a weekly basis

  • A little over ½ of all three groups say reviews are somewhat important in the process of deciding whether a business can be trusted

  • About ½ of all three groups visit the business website as their next step after reading enough positive reviews of a brand; about ⅓ of the youngest and eldest groups say their next step is to visit the business in person, with ¼ of the middle group doing the same.

  • Over ½ of all three groups will definitely seek out a business if its owner responses to reviews resolve stated problems, with the two older groups being slightly more willing to do so than the youngest group.

  • About ½ of all three groups require a minimum 4-star rating to consider doing business with a local brand, with the eldest group having slightly higher expectations than the two younger groups.

  • About ⅓ of all three groups say they will “sometimes” leave a review when asked to do so.

Different review habits and preferences by age group

Image credit: GT#4

For the purposes of this column, Group A is people aged 18-29, Group B is people aged 30-60, and Group C is people aged 61+.

1. Older Americans write fewer reviews

When asked how often they write reviews, about ¼ of Groups A and B say they only write reviews a few times a year. Most of them are more active review writers than this. However, 43% of Group C falls into the category of only writing reviews a few times a year. Brands may have to work harder to build up their online reputation if their model relies heavily on the patronage of older customers.

2. Older Americans are less tied to Google reviews

A little over 80% of both Groups A and B say they spend the majority of their time reading local business reviews on Google. Interestingly, that number drops to just 62% for Group C, with older Americans having more diverse reading habits that span platforms like the BBB, Yelp, Nextdoor, Facebook, and first-party reviews on local business websites. Local brands that rely on the patronage of older customers should be sure to manage reputation across a wide variety of platforms.

3. Younger Americans trust social media more as a source of local business reputation

When asked which sources, other than local business reviews, they rely on to understand local business reputation, a little over 60% of Groups A and B cite friends and family, while an even greater percentage of Group C (74%) turns to this resource. 61% of the youngest group relies on social media, a slightly smaller 57% of the middle group does so, but a significantly smaller 43% of the oldest group does. Meanwhile, an identical 43% of Groups A and B consult the business's own website as their next choice, but for Group C, 44% turn to the Better Business Bureau. Local brands should note here that younger Americans skew more towards social media information, while older Americans still place more trust in established platforms like the BBB.

4. Younger Americans prefer SMS-based review requests over print

About ½ of all three groups cite email as their #1 preference for receiving review requests, and in-person requests come second for everybody. However, whereas the third choice for Groups A and B is SMS/text-based review asks, Group C prefers to be asked for reviews via receipts, invoices, and other print materials. This is an important divide, and while I'll say that, in my own experience, some of my elders text me more than my nieces and nephews do, it's clear that local brands must diversify their review acquisition methodologies to meet the different expectations of both groups.

5. Younger Americans need extra guidance with the review writing process

Let’s have fun squashing some stereotypes here! It may be a meme to depict young folks as tech-savvy and older folks as behind-the-tech-times, but here’s a lived truth from my own life: my father knows way more about computers than I ever will, and my mother is a much better searcher than I am.

In this data set, we see that the top reason our youngest group doesn't leave more reviews is that they find the process too confusing and difficult. In other words, they likely require a little extra help and guidance in understanding how to conveniently and efficiently review your local business. Groups B and C already have the review-writing process well in hand, and say that their top blocker to writing more reviews is simply forgetting to do so when they have free time. For these groups, reminders rather than tutorials are likely to be most effective.

6. The youngest Americans are feeling the burden of bad products

66% of Group B and 76% of Group C say that the top cause of them writing negative reviews is experiencing rude or bad service at a local business. I find it telling and poignant that older Americans have the highest expectations of being treated well by neighborhood companies and are severely let down when owners and staff are unpleasant. Some of us are old enough to remember when nearly all shops were abundantly staffed with well-trained employees who were earning enough of a living wage to have inner funds of contentment and happiness - it’s a far cry from the understaffed warehouses and automated chat bots that too often pass for customer service these days.

However, the data point that interested me most in this set is that our youngest group cites bad products as the top cause of them leaving negative reviews. Your mother-in-law may have had the same washing machine for the last 20 years, but your niece has already had to replace hers twice in the five years since she moved into an apartment with her friends. According to Statista, the youngest people are also the poorest, and having to spend what little money they have on shoddy goods is a serious burden, especially when coupled with pandemic-driven supply chain breakages that have made most of us seek out products of indifferent quality because there is no other choice. Local brands should strongly consider overhauling supply chains wherever possible to find higher-quality local products to avoid negative reviews and safeguard reputation in the eyes of the rising generation of consumers.

7. Youngest and eldest Americans have more modest expectations of review response times

15% of Group B expects to receive an owner response to their review within 2 hours, compared to just 7% of Group A and only 1% of Group C. 23% of Group B expects to hear back within 24 hours, while this figure is at 19% for Group A and 18% for Group C. 33% of Group A expects a response within 24 hours, while 27% is the figure for both B and C. There's an opportunity here to surpass expectations for all three groups by responding as quickly as possible to reviews, which means paying attention to incoming review alerts and finding time to respond.

8. Older Americans are more forgiving when problems are resolved

67% of Group B and 61% of Group C will definitely update a negative review and low star rating if an owner's response resolves their complaints. This figure drops to just 50% for Group A. Perhaps the more lived experience we have, the more aware we become of how easily mistakes happen, and the more readily we recognize and reward efforts to make amends.

9. Younger Americans read a greater number of reviews before deciding a business is worth a try

41% of Group A read 10-20 reviews before determining a local business is worth trying, and a similar 37% of Group B does the same. But the dominant characteristic of Group C is that 41% of them read just 5-9 reviews before making up their minds. This is open to many interpretations. Perhaps the more experienced we are, the more quickly we can scan a scenario and make a judgment. Or, perhaps the younger we are, the more we count on the process of reading lots of reviews to help us gauge public opinion before making our own decision. In any case, local businesses must be sure that there is plenty of reading material in the form of reviews for both of the younger groups.

10. Eldest Americans place the most trust in the public and the least in brand messaging

A pronounced 74% of Group C says it places more trust in what customers say about a local business vs. what that business says about itself. For Group A, that figure drops to 61%, and Group B comes in at 69%. Doubtless, the longer we live, the more experience teaches us the difference between reality and advertising, and it's important to note that for more than 60% of all three groups, control of brand narrative is now firmly in customers' hands. This is the best of all arguments for why customer service is the core of the business model – it writes the brand story that the majority of the public believes most.

11. Low stars shed the most trust for eldest Americans

Well over half of Group C says that a low star rating compared to local competitors is the top source of lost trust when it comes to local business reviews. Groups A and B put the appearance of a business or its staff self-reviewing as their top cause of lost trust. This dynamic shows how trust can be lost at first glance for our eldest group, because stars are immediately visible on review profiles, highlighting how important it is for cumulative reviews to speak well of the business. Meanwhile, Groups A and B are more investigative, looking more deeply at reviewers' profiles for signs of suspicious activity. Brands must be sure to avoid all spammy practices that would rightly give these groups cause to doubt the authenticity of their reputation.

12. Youngest Americans are most put off by argumentative owner responses

When asked which factors of an owner response would make them avoid the business, the top element cited by Group A was the owner arguing with the customer. This highlights the need for deft, accountable responses, even when the business believes the customer is wrong. Meanwhile, about half of Group B cites failure of the owner response to fix a cited problem as the characteristic that would make them avoid a business, and nearly ¾ of Group C say the same. Clearly, the more life experience we have, the more we value brands that are great at solving problems that inevitably arise in the course of normal business operations.

13. Eldest Americans have the most motivation (and justification) for sharing their experience via reviews

They say that wisdom comes with age, and I see a confirmation of this in the data: 85% of Group C cite sharing their experience with others as their primary motivation for writing reviews. For Group B, that number is 72%, and for Group A it is 69%. This puts me in mind of how Civics was a required high school class in my parents' generation, but I seldom hear it spoken of by people of my age group, and I am not sure what part it plays in current school curricula. Ideas like valuing the sagacity of elders and freely sharing knowledge for community benefit are excellent standards we should not lose. Local brands are extremely fortunate in having volunteers, both young and old, who are continuously speaking about them in every neighborhood across the country.

In conclusion: be sure everybody is sitting at your table

Image credit: Shanghai 031

Some local offerings are geared towards specific age groups. For example, a senior community club has a particular audience, as does a pediatrician. If your customers and clients are entirely within a narrow age-range, pay particular attention to the review preference differences we saw in today’s column.

However, what will be more common is that a local business with a general audience will be looking at how to increase the engagement of further segments within their community which aren’t yet frequenting the brand. For example, a clothier might want both elder and younger shoppers to know their shop stocks a wide variety of garments for many ages and tastes. It’s in cases like these that knowledge of specific habits and preferences can get the brand closer to having meaningful interactions with a wider audience.

In the digital age, it turns out that your local business reputation is like a very large dining table, and by considering how each of your guests likes to be served, you’ll be sure there’s a seat for everybody. When it comes to age, diversity, equity, and inclusion make for better conversation and better community.

Eager for more insights? Read: The Impact of Local Business Reviews on Consumer Behavior

How to Use Chrome to View a Website as Googlebot

The author's views are entirely their own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.