[{"data":1,"prerenderedAt":-1},["ShallowReactive",2],{"blog-domain-authority-just-how-crucial-is-it":3,"latest-blogs-home":120},{"message":4,"data":5},"Blogs retrieved successfully",{"blog":6,"latest_blogs":47},{"id":7,"author_id":8,"title":9,"slug":10,"content":11,"short_summary":12,"featured_image":13,"status":14,"meta_title":9,"meta_description":15,"canonical_url":16,"keywords":16,"blog_type":17,"is_featured":18,"word_count":19,"published_at":20,"created_at":21,"updated_at":22,"deleted_at":16,"author":23,"categories":29},300,3,"Domain Authority: Just How Crucial Is It?","domain-authority-just-how-crucial-is-it","\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">Ask any SEO professional to name one feature that revolutionized search. Some might say AI, others say Google’s Panda and Penguin updates, and they aren’t wrong.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">But if you ask me, only one comes to mind: PageRank.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">Ahead of its time, this Google algorithm changed the way search engines rank content by changing what gets measured. Instead of counting keyword mentions, which are too easily manipulated, it assessed the number and quality of links. For a time, SEO strategy and tactics revolved around PageRank and its nuts and bolts.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">Then, in 2016, Google shut down the toolbar used for gauging a website’s PageRank score. This final nail in the coffin left the Internet scrambling to find another metric to replace it. 
Arguably, the most talked-about alternative is then-SEOmoz’s \u003Cem>domain authority\u003C\u002Fem>.\u003C\u002Fspan>\u003C\u002Fp>\u003Cfigure data-type=\"image\" data-align=\"center\" style=\"display: inline-block; max-width: 100%; margin-left: auto; margin-right: auto;\">\u003Cimg class=\"max-w-full h-auto rounded-lg\" src=\"https:\u002F\u002Fwebsite-cdn.nobsmarketplace.com\u002Fuploads\u002Fblog-images\u002Fpagerank-alternative-metrics-domain-authority-domain-rating-authority-score-trust-flow-chart-20260318081644-cclzw6KV.png\" data-align=\"center\" style=\"display: block; margin-left: auto; margin-right: auto;\">\u003Cfigcaption>Data source: Google Trends (as of writing)\u003C\u002Ffigcaption>\u003C\u002Ffigure>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">For all the talk about domain authority and how it fits into modern SEO, we seem to forget that metrics are arbitrary. Numbers alone don’t paint the full picture, yet they’re often treated as ultimatums: “if you can’t achieve these figures, your website’s not going to last.”\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">To that end, let’s talk about domain authority. Is it too important to ignore, or are we giving it too much credence? To start with, how is this metric calculated?\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 16pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">\u003Cstrong>Domain Authority Explained\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">The early 2000s saw the rise of SEO analytics tools, with Moz being founded as SEOmoz in 2004. 
Its proprietary metric, domain authority, was introduced along with its free browser toolbar that works more or less the same as the now-defunct PageRank toolbar.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">The specifics of calculating a website’s domain authority are, just like Google’s algorithm, kept under lock and key. However,\u003C\u002Fspan>\u003Ca target=\"_blank\" rel=\"noopener noreferrer nofollow\" class=\"text-primary-blue-600 hover:underline\" href=\"https:\u002F\u002Fmoz.com\u002Flearn\u002Fseo\u002Fdomain-authority\">\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\"> \u003C\u002Fspan>\u003Cspan style=\"font-size: 11pt; color: rgb(17, 85, 204); font-family: Arial, sans-serif;\">\u003Cu>Moz explains\u003C\u002Fu>\u003C\u002Fspan>\u003C\u002Fa>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\"> that it uses a machine learning algorithm to measure how likely a domain or website is to appear in search results. 
It uses a long list of factors, some of which include:\u003C\u002Fspan>\u003C\u002Fp>\u003Cul>\u003Cli>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">The number of root domains linking to the website\u003C\u002Fspan>\u003C\u002Fp>\u003C\u002Fli>\u003Cli>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">The overall quality of those linking root domains\u003C\u002Fspan>\u003C\u002Fp>\u003C\u002Fli>\u003Cli>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">Other signals based on rankings across search results\u003C\u002Fspan>\u003C\u002Fp>\u003C\u002Fli>\u003C\u002Ful>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">Google and Moz have made it clear that domain authority is NOT a ranking factor. Besides being proprietary to Moz, it crunches numbers differently from PageRank. Additionally, it measures a domain’s or website’s search performance, whereas PageRank measures that of individual pages.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">Domain authority (DA) scores range from 0 to 100 on what Moz describes as a logarithmic scale: the higher a website’s DA score climbs, the harder it becomes to raise it further. Newly published websites start with a DA score of 1 and can climb quickly with enough effort.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 16pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">\u003Cstrong>Misunderstandings Abound\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">Moz emphasizes that domain authority should be used for comparison rather than treated as an absolute figure. 
But given that we’re here talking about it, that clearly isn’t the case.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">First, it’s dangerous to assume that good SEO hinges on a single metric. Although that may have been the case with PageRank during its time, we’re long past that because of the shift to quality content. You’ve probably read that Google’s algorithm uses some 200 ranking factors, a claim Google itself has since debunked. So why assume that a single metric matters?\u003C\u002Fspan>\u003C\u002Fp>\u003Cfigure data-type=\"image\" data-align=\"center\" style=\"display: inline-block; max-width: 100%; margin-left: auto; margin-right: auto;\">\u003Cimg class=\"max-w-full h-auto rounded-lg\" src=\"https:\u002F\u002Fwebsite-cdn.nobsmarketplace.com\u002Fuploads\u002Fblog-images\u002Fseo-metrics-dashboard-performance-visibility-user-traffic-conversion-data-flowchart-20260318081742-W2hiXn4z.png\" data-align=\"center\" style=\"display: block; margin-left: auto; margin-right: auto;\">\u003C\u002Ffigure>\u003Cp style=\"text-align: center;\">\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">\u003Cem>Source:\u003C\u002Fem>\u003C\u002Fspan>\u003Ca target=\"_blank\" rel=\"noopener noreferrer nofollow\" class=\"text-primary-blue-600 hover:underline\" href=\"https:\u002F\u002Fsearchengineland.com\u002Fguide\u002Fhow-to-interpret-seo-data\">\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">\u003Cem> \u003C\u002Fem>\u003C\u002Fspan>\u003Cspan style=\"font-size: 11pt; color: rgb(17, 85, 204); font-family: Arial, sans-serif;\">\u003Cem>\u003Cu>Search Engine Land\u003C\u002Fu>\u003C\u002Fem>\u003C\u002Fspan>\u003C\u002Fa>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">Domain authority is nowhere on the above list of core SEO metrics cited by professionals. 
It can’t establish itself as absolute because its competitors have their own metrics, ranging from Ahrefs’ Domain Rating to Majestic’s Trust Flow.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">It’s highly unlikely that Google will adopt domain authority—or any other third-party metric for that matter—anytime soon. Doing so would set a dangerous precedent, not to mention give the public a peek into its confidential algorithm by incorporating something that’s public knowledge.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">But perhaps the most damning point is that the DA score matters little in the bigger SEO picture. Some websites with low DA scores reportedly rank higher than their counterparts with high DA scores. Their owners recognize that better opportunities lie elsewhere, such as building topical authority or optimizing the code.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 16pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">\u003Cstrong>Your DA Score Will Go Up, Anyway\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">If you’re still hung up on boosting your website’s domain authority, know that it’ll increase once you start doing SEO. As mentioned earlier, the number of backlinks and their quality are key factors in the calculation. 
It’ll go up without you even noticing.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">On that note, our good friend,\u003C\u002Fspan>\u003Ca target=\"_blank\" rel=\"noopener noreferrer nofollow\" class=\"text-primary-blue-600 hover:underline\" href=\"https:\u002F\u002Fnobsmarketplace.com\u002Flink-building\">\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\"> \u003C\u002Fspan>\u003Cspan style=\"font-size: 11pt; color: rgb(17, 85, 204); font-family: Arial, sans-serif;\">\u003Cu>ethical link building\u003C\u002Fu>\u003C\u002Fspan>\u003C\u002Fa>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">, becomes important. A DA score may be the first thing people look at when choosing a source of backlinks, but as we’ve established, Google doesn’t care much for it. As long as the source publishes relevant content and doesn’t engage in sketchy SEO, its DA score hardly matters.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">You also need to optimize your content to contribute to better rankings.\u003C\u002Fspan>\u003C\u002Fp>\u003Cul>\u003Cli>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">Create helpful content (e.g., original research, comprehensive guides)\u003C\u002Fspan>\u003C\u002Fp>\u003C\u002Fli>\u003Cli>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">Research keywords and form topic clusters to shape the ideal content\u003C\u002Fspan>\u003C\u002Fp>\u003C\u002Fli>\u003Cli>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">Develop an internal linking strategy to seamlessly bridge all your content\u003C\u002Fspan>\u003C\u002Fp>\u003C\u002Fli>\u003Cli>\u003Cp>\u003Cspan style=\"font-size: 11pt; 
color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">Scour the website for optimization opportunities, such as broken links\u003C\u002Fspan>\u003C\u002Fp>\u003C\u002Fli>\u003Cli>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">Build partnerships within your niche or industry to get enough exposure\u003C\u002Fspan>\u003C\u002Fp>\u003C\u002Fli>\u003C\u002Ful>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">As a side note, know that there are no exact standards for DA scores. You may hear people claim that an average or good DA score ranges from 50 to 60, but there’s no reliable way to confirm this. What counts as “good” varies by niche or industry.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 16pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">\u003Cstrong>Useful But Not Crucial\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 11pt; color: rgb(0, 0, 0); font-family: Arial, sans-serif;\">This isn’t to say that domain authority is useless. Websites need an alternative benchmark of SEO effectiveness now that PageRank scores are no longer public. But in the end, it’s just that—a benchmark, not the centerpiece of any SEO strategy.\u003C\u002Fspan>\u003C\u002Fp>","Domain authority gets talked about a lot as a key metric in SEO, with some claiming that SEO is impossible without it. Here's why that's a dangerous line of thinking, and how it should be properly used.","https:\u002F\u002Fwebsite-cdn.nobsmarketplace.com\u002Fuploads\u002Ffeatured-images\u002Fseo-scrabble-tiles-wooden-background-search-engine-optimization-20260318081310-0wPOJ3JL.jpg","published","Domain authority has been the most talked-about metric since the curtains closed on PageRank. 
But just how crucial is it to today’s SEO?",null,"blog",true,890,"2026-03-18T08:11:16.000000Z","2026-03-18T08:39:19.000000Z","2026-03-19T10:57:17.000000Z",{"id":8,"name":24,"email":25,"about":26,"avatar":27,"created_at":28,"updated_at":28,"deleted_at":16},"Jonas Trinidad","jonas@nobsmarketplace.com","","https:\u002F\u002Fwebsite-cdn.nobsmarketplace.com\u002Fblog-authors\u002F2023\u002F05\u002Fjonas-trinidad.jpg","2025-10-26T11:10:22.000000Z",[30,36,42],{"id":31,"name":32,"slug":33,"created_at":34,"updated_at":34,"deleted_at":16,"pivot":35},13,"SEO Marketing","seo-marketing","2025-10-26T11:10:27.000000Z",{"blog_id":7,"category_id":31},{"id":37,"name":38,"slug":39,"created_at":40,"updated_at":40,"deleted_at":16,"pivot":41},16,"Educative Content","educative-content","2026-02-10T11:18:29.000000Z",{"blog_id":7,"category_id":37},{"id":43,"name":44,"slug":45,"created_at":28,"updated_at":28,"deleted_at":16,"pivot":46},4,"Content Marketing","content-marketing",{"blog_id":7,"category_id":43},[48,66,81,99],{"id":49,"author_id":50,"title":51,"slug":52,"content":53,"short_summary":54,"featured_image":55,"status":14,"meta_title":51,"meta_description":54,"canonical_url":16,"keywords":16,"blog_type":17,"is_featured":56,"word_count":57,"published_at":58,"created_at":59,"updated_at":59,"deleted_at":16,"author":60,"categories":65},319,9,"How to Read Google Search Console Metrics in 2026","how-to-read-search-console-metrics-2026","\u003Ch1>How to Read Google Search Console Metrics in 2026 (And Why Your Impression Data Has Been Wrong for 11 Months)\u003C\u002Fh1>\u003Cp>\u003Cspan>Google Search Console is the closest thing site owners have to a direct line into how Google sees their site. It shows which queries bring up pages in search results, how often those pages appear, how often users click through, and where pages rank. 
For anyone working in SEO, content strategy, or site management, the Performance report is the first place to look when something changes in organic traffic.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>But the data in Search Console requires more interpretation than most people give it. The metrics look straightforward on the surface, and that simplicity creates a false sense of precision. Understanding what each metric actually measures, how Google defines it, and where the data can mislead is the difference between making informed decisions and chasing ghosts.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>That distinction became especially relevant on April 3, 2026, when Google disclosed that a logging error had been inflating impression counts in Search Console since May 13, 2025. Nearly eleven months of overstated impressions, affecting every Performance report during that period. More on that at the end of this post, but the disclosure underscores why understanding what these metrics actually represent is worth the time.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan>Impressions: What Counts as “Seen”\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan>An impression in Search Console is recorded when a URL from the site appears in a search result that a user could have seen. The key word is “could have.” The user doesn’t need to scroll down to the result. They don’t need to notice it. If Google’s systems determine the result was present on the page the user loaded, it counts as an impression.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>For standard web results, that means a page ranking on position 1 of the first results page generates an impression for every search that loads those results. A page ranking on position 8 also generates an impression for that same search, even if the user never scrolled past position 3. 
A page ranking on position 11 (the second page of results) only generates an impression if the user actually clicks through to page two.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>For features like AI Overviews, featured snippets, image packs, and knowledge panels, the impression counting works slightly differently depending on whether the content requires user interaction (like expanding a section) to become visible. Google’s documentation covers these edge cases, but the general principle is that an impression means “the result was available to be seen,” not “the user saw it.”\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>As of June 2025, AI Mode data is merged into Search Console performance totals under the “Web” search type. There’s currently no native way to separate AI Mode impressions from standard organic impressions, which adds another layer of ambiguity to the impression number.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>Impressions are useful for tracking visibility trends over time, but they should never be treated as a measure of actual human attention. A page can accumulate thousands of impressions without a single person consciously noticing it in the results.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan>Clicks: The Most Reliable Metric in the Report\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan>A click in Search Console is recorded when a user selects a result that takes them to a page outside of Google Search. Clicks on ads, clicks that keep the user within Google’s interface (like expanding a “People Also Ask” section), and clicks on AI Overview citations that don’t leave Google are generally not counted as clicks in the Performance report.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>Clicks are the most trustworthy metric in Search Console because they represent a definitive user action. A user saw the result, decided it was worth visiting, and clicked through to the site. 
There’s no ambiguity about whether the interaction happened. The Google disclosure about the impression bug explicitly confirmed that clicks were not affected by the logging error, which reinforces clicks as the metric to anchor analysis around when other data is uncertain.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>One thing to watch: Search Console click data and Google Analytics session data won’t match exactly. Search Console counts clicks on the Google side, while GA4 counts sessions on the site side. Redirects, slow-loading pages, users who click but close the tab before the page loads, and tracking script issues all create gaps between the two numbers. A consistent gap is normal. A sudden widening of the gap is worth investigating.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan>Click-Through Rate: A Calculated Metric, Not a Measured One\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan>CTR in Search Console is calculated by dividing clicks by impressions. Because it’s derived from those two inputs, any issue with either metric directly affects CTR. If impressions are inflated (as they were from May 2025 through early 2026), CTR appears artificially low. If impressions are underreported, CTR appears artificially high.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>CTR is useful for comparing the relative performance of different pages or queries within the same time period, where the measurement conditions are consistent. It’s less reliable for comparing CTR across time periods that span data anomalies, algorithm updates, or changes in SERP layout (like the introduction of new features that change how much of the results page is occupied by non-organic elements).\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>The SERP landscape in 2026 includes AI Overviews, AI Mode, featured snippets, knowledge panels, People Also Ask, local packs, shopping results, video carousels, and image packs. 
A query where the organic result sits below an AI Overview and a People Also Ask section will naturally have a lower CTR than the same ranking position on a cleaner SERP, even if nothing about the page or its ranking has changed. CTR trends should always be read alongside an understanding of what the SERP actually looks like for the queries in question.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan>Average Position: A Blended Number\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan>Average position in Search Console represents the average ranking of a page across all the queries it appeared for during the selected time period. A page that ranked position 2 for a high-volume query and position 45 for twenty low-volume queries could show an average position of 40+, which would make it look like a poorly performing page even though it ranks well for the query that actually drives traffic.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>Average position is most useful when filtered to a specific query or a narrow set of related queries, where the blending effect is minimized. As a site-level or page-level aggregate across all queries, it’s often misleading. A page’s average position can improve (the number goes down) while traffic decreases, or worsen (the number goes up) while traffic increases, depending on which queries are entering or leaving the data set.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan>How to Use the Metrics Together\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan>The metrics in Search Console work best as a system rather than individually. Each one answers a different question.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>Impressions answer: how visible is the site for a given set of queries? Clicks answer: how many users actually visited the site from search? CTR answers: of the people who could have seen the result, what percentage clicked? 
Average position answers: where did the page typically rank for the queries it appeared in?\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>When impressions increase but clicks stay flat, the site is appearing for more queries (or the same queries more often) but those new appearances aren’t generating traffic. Possible explanations include ranking for queries where the result sits below the fold, appearing in positions where AI Overviews or other features absorb the click, or ranking for queries with informational intent where users get their answer from the SERP without clicking.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>When clicks increase but impressions stay flat, the site is getting a higher CTR for its existing visibility, which usually means improved rankings for high-intent queries, better meta descriptions or titles earning more clicks at the same position, or queries shifting to SERPs where the organic result is more prominent.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>When both impressions and clicks drop, either the site has lost rankings, queries the site ranked for have declined in volume, or a SERP layout change has pushed organic results further down. 
Search Console alone can’t distinguish between these causes, so pairing the data with rank tracking tools and SERP analysis fills the gap.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan>Why Clicks, Not Impressions, Should Lead Reporting\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan>For \u003C\u002Fspan>\u003Ca target=\"_blank\" rel=\"noopener noreferrer nofollow\" class=\"text-primary-blue-600 hover:underline\" href=\"https:\u002F\u002Fnobsmarketplace.com\u002Flink-building\">\u003Cspan>link building\u003C\u002Fspan>\u003C\u002Fa>\u003Cspan> and \u003C\u002Fspan>\u003Ca target=\"_blank\" rel=\"noopener noreferrer nofollow\" class=\"text-primary-blue-600 hover:underline\" href=\"https:\u002F\u002Fnobsmarketplace.com\u002Fdigital-pr\">\u003Cspan>digital PR\u003C\u002Fspan>\u003C\u002Fa>\u003Cspan> campaigns where the goal is driving organic traffic growth, clicks are the metric that connects directly to business outcomes. Impressions indicate visibility potential. Clicks indicate actual visits. CTR indicates conversion efficiency from impression to visit. Average position indicates competitive standing.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>The temptation to lead with impressions is understandable because the numbers are larger and growth looks more dramatic on a chart. But as the eleven-month impression bug demonstrates, impression data carries more measurement risk than click data. Building reporting and strategy around clicks as the primary metric, with impressions and CTR as supporting context, produces more reliable analysis.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan>The Impression Bug: What Happened and What to Do About It\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan>On April 3, 2026, Google updated its Data Anomalies in Search Console page with the following disclosure: “A logging error is preventing Search Console from accurately reporting impressions from May 13, 2025 onward. 
This issue will be resolved over the next few weeks; as a result, you may notice a decrease in impressions in the Search Console Performance report. Clicks and other metrics were not affected by the error, and this issue affected data logging only.”\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>A Google spokesperson confirmed to Search Engine Land: “We identified a reporting error in Search Console that temporarily led to an over-reporting of impressions from May 13, 2025 onward. Bug fixes are being implemented to ensure accurate reporting.”\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>Eleven months of inflated impression data across every Search Console property. Any analysis, report, or strategic decision made during that period using impression counts or CTR as inputs was working with inaccurate data. CTR calculations during the affected period would have appeared lower than reality because the denominator (impressions) was artificially large.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>The impression bug also coincided with Google merging AI Mode data into Search Console totals in June 2025, which means impression trend lines from mid-2025 through early 2026 contain at least two significant data discontinuities. Year-over-year comparisons spanning this period are unreliable for impressions and CTR.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan>What to do now: annotate dashboards and reports to mark May 13, 2025 as a data discontinuity point. Use clicks as the primary metric for evaluating performance during the affected period, since Google confirmed clicks were not impacted. Export historical data before the fix fully propagates to preserve a record of both the pre-correction and post-correction numbers. Once the correction is complete, re-baseline impression data and recalculate CTR using the corrected figures. And resist the urge to interpret the forthcoming impression drop as a performance decline. 
The impressions are going down because the data is becoming more accurate, not because anything changed about the site’s actual visibility.\u003C\u002Fspan>\u003C\u002Fp>","How to Read Google Search Console Metrics in 2026 (And Why Your Impression Data Has Been Wrong for 11 Months)","https:\u002F\u002Fwebsite-cdn.nobsmarketplace.com\u002Fuploads\u002Ffeatured-images\u002Freading-google-metrics-for-2026-20260407095555-iwmKoJuk.png",false,1723,"2026-04-07T09:54:29.000000Z","2026-04-07T09:56:02.000000Z",{"id":50,"name":61,"email":62,"about":16,"avatar":63,"created_at":64,"updated_at":16,"deleted_at":16},"Rasit Cakir","rasit@nobsmarketplace.com","https:\u002F\u002Fwebsite-cdn.nobsmarketplace.com\u002Frasit.webp","2026-01-26T11:10:22.000000Z",[],{"id":67,"author_id":50,"title":68,"slug":69,"content":70,"short_summary":71,"featured_image":72,"status":14,"meta_title":73,"meta_description":74,"canonical_url":16,"keywords":16,"blog_type":17,"is_featured":56,"word_count":75,"published_at":76,"created_at":77,"updated_at":78,"deleted_at":16,"author":79,"categories":80},318,"XML Sitemaps in 2026","xml-sitemaps-in-2026","\u003Ch1>\u003Cspan style=\"font-size: 16pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>XML Sitemaps in 2026\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh1>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">A site owner on Reddit’s r\u002FSEO recently asked whether splitting a sitemap.xml into separate files would hurt SEO performance. The site was ranking in the top 3 for most target searches, and the concern was that restructuring the sitemap could disrupt that. 
Google’s John Mueller jumped in with a response that laid out several reasons why multiple sitemaps are useful, including a few that most guides don’t cover.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Mueller’s reasons for splitting sitemaps:\u003C\u002Fspan>\u003C\u002Fp>\u003Cul>\u003Cli>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Tracking different kinds of URLs in groups (“product detail page sitemap” vs “product category sitemap,” which you can then monitor with Search Console’s page indexing report)\u003C\u002Fspan>\u003C\u002Fp>\u003C\u002Fli>\u003Cli>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Splitting by content freshness, so search engines theoretically don’t need to check the “old” sitemap as often\u003C\u002Fspan>\u003C\u002Fp>\u003C\u002Fli>\u003Cli>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Proactively splitting before hitting the 50,000 URL limit\u003C\u002Fspan>\u003C\u002Fp>\u003C\u002Fli>\u003Cli>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Managing hreflang sitemaps, which can take up significant space\u003C\u002Fspan>\u003C\u002Fp>\u003C\u002Fli>\u003Cli>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">And, as he put it, “my computer did it, I don’t know why”\u003C\u002Fspan>\u003C\u002Fp>\u003C\u002Fli>\u003C\u002Ful>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>What an XML Sitemap Actually Does\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">An XML sitemap is a file that lists the URLs on a site that should be discoverable by search engines. It serves as a direct communication channel between a website and search engine crawlers, pointing them to pages that should be crawled and indexed.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Search engines can discover pages through internal links, external backlinks, and crawling, so a sitemap isn’t strictly required for every site. 
But for sites with deep page structures, pages with few internal links pointing to them, new sites with limited external backlinks, sites that publish content frequently, or JavaScript-heavy sites where content might not be immediately discoverable through standard crawling, a sitemap removes ambiguity about which pages exist and which ones are important enough to index.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Google’s documentation specifies two hard limits for a single sitemap file: 50,000 URLs maximum and 50MB uncompressed file size maximum. If either limit is exceeded, the sitemap needs to be split into multiple files. A sitemap index file acts as a master list that points to all the individual sitemap files, and that index file is what gets submitted to Search Console.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Google ignores the priority and changefreq tags in sitemaps. The loc tag (the URL) and the lastmod tag (last modification date) are the only fields Google actually uses. The lastmod date needs to be accurate and verifiable, meaning it should reflect when the page content actually changed, not an arbitrary refresh date. 
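\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">As a sketch, a minimal entry using only the two fields Google reads looks like this (the URL and date are placeholders):\u003C\u002Fspan>\u003C\u002Fp>\u003Cpre>\u003Ccode>&lt;?xml version=\"1.0\" encoding=\"UTF-8\"?&gt;
&lt;urlset xmlns=\"http:\u002F\u002Fwww.sitemaps.org\u002Fschemas\u002Fsitemap\u002F0.9\"&gt;
  &lt;url&gt;
    &lt;loc&gt;https:\u002F\u002Fwww.example.com\u002Fproducts\u002Fwidget&lt;\u002Floc&gt;
    &lt;lastmod&gt;2026-03-27&lt;\u002Flastmod&gt;
  &lt;\u002Furl&gt;
&lt;\u002Furlset&gt;\u003C\u002Fcode>\u003C\u002Fpre>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">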
Google has been clear that faking lastmod dates can backfire by causing the system to distrust those signals for the entire site.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>Why Multiple Sitemaps Are a Strategic Choice\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Mueller’s Reddit response outlines reasons that go beyond the 50,000 URL limit, and several of them are worth expanding on because they represent practical benefits most sites don’t take advantage of.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>Tracking different content types separately.\u003C\u002Fstrong> Search Console’s page indexing report shows data per sitemap. If all URLs are in a single file, the indexing report gives one aggregated view. If product pages, category pages, blog posts, and support articles each have their own sitemap, Search Console shows indexing status for each group independently. Spotting problems becomes significantly easier. 
If 200 product pages suddenly drop out of the index, that shows up immediately in the product sitemap’s report rather than being buried in a combined report where 200 out of 10,000 URLs changing status might not be noticed.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>Splitting by freshness.\u003C\u002Fstrong> Mueller mentioned this with a caveat: “theoretically a search engine might not need to check the ‘old’ sitemap as often; I don’t know if this actually happens tho.” The idea is that separating evergreen content from frequently updated content lets crawlers focus their attention on the sitemap that changes often, rather than rechecking thousands of URLs that haven’t changed. Whether Google actually adjusts crawl frequency based on sitemap-level freshness signals is unconfirmed, but the logic is sound from a crawl efficiency perspective.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>Proactive splitting before hitting limits.\u003C\u002Fstrong> Mueller’s point here is practical: if a site is growing and will eventually cross 50,000 URLs, setting up the split structure now avoids having to urgently reconfigure everything later. Building the infrastructure for multiple sitemaps when a site has 20,000 URLs means the transition to 60,000 is seamless rather than an emergency.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>Hreflang management.\u003C\u002Fstrong> For multilingual sites, hreflang annotations can be managed in the HTML of each page or in the sitemap. For sites with many language\u002Fregion variants, the sitemap approach is often more manageable and less error-prone than maintaining hreflang tags across thousands of page templates. 
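\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">In the sitemap approach, each URL entry carries an xhtml:link annotation for every language alternate, along the lines of the sketch below (placeholder URLs; the enclosing urlset must also declare the xhtml namespace):\u003C\u002Fspan>\u003C\u002Fp>\u003Cpre>\u003Ccode>&lt;url&gt;
  &lt;loc&gt;https:\u002F\u002Fwww.example.com\u002Fen\u002Fpage&lt;\u002Floc&gt;
  &lt;xhtml:link rel=\"alternate\" hreflang=\"en\" href=\"https:\u002F\u002Fwww.example.com\u002Fen\u002Fpage\"\u002F&gt;
  &lt;xhtml:link rel=\"alternate\" hreflang=\"de\" href=\"https:\u002F\u002Fwww.example.com\u002Fde\u002Fpage\"\u002F&gt;
  &lt;xhtml:link rel=\"alternate\" hreflang=\"fr\" href=\"https:\u002F\u002Fwww.example.com\u002Ffr\u002Fpage\"\u002F&gt;
&lt;\u002Furl&gt;\u003C\u002Fcode>\u003C\u002Fpre>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">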
But hreflang annotations can make sitemap files grow quickly since each URL needs to reference every alternate language version. Separate sitemaps for hreflang help keep file sizes under the limits.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>What Should and Shouldn’t Be in a Sitemap\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The sitemap should include every page that should be indexed. That means canonical URLs for key pages like service pages, product pages, blog posts, landing pages, and any other content that serves search intent. The URLs listed should be the canonical versions, not duplicates, parameterized variations, or alternate formats.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Pages with noindex tags should not appear in the sitemap. A sitemap tells search engines “please index these pages,” while noindex says the opposite. Including both on the same URL sends conflicting signals. Similarly, pages blocked by robots.txt shouldn’t be in the sitemap, and URLs that redirect or return error codes should be cleaned out regularly.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">For sites running \u003C\u002Fspan>\u003Ca target=\"_blank\" rel=\"noopener noreferrer nofollow\" class=\"text-primary-blue-600 hover:underline\" href=\"https:\u002F\u002Fnobsmarketplace.com\u002Flink-building\">\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">link building\u003C\u002Fspan>\u003C\u002Fa>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\"> campaigns, the sitemap serves as a quality control layer. 
Every page that receives backlinks should be in the sitemap with an accurate lastmod date, a clean canonical URL, and no conflicting signals. If a page earning links through \u003C\u002Fspan>\u003Ca target=\"_blank\" rel=\"noopener noreferrer nofollow\" class=\"text-primary-blue-600 hover:underline\" href=\"https:\u002F\u002Fnobsmarketplace.com\u002Fguest-posting\">\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">guest posting\u003C\u002Fspan>\u003C\u002Fa>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\"> placements or \u003C\u002Fspan>\u003Ca target=\"_blank\" rel=\"noopener noreferrer nofollow\" class=\"text-primary-blue-600 hover:underline\" href=\"https:\u002F\u002Fnobsmarketplace.com\u002Fdigital-pr\">\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">digital PR\u003C\u002Fspan>\u003C\u002Fa>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\"> coverage returns a redirect, a noindex, or doesn’t appear in the sitemap at all, the link equity flowing to that page may not translate into the indexing and ranking benefits intended. Verifying that linked-to pages are properly represented in the sitemap is a basic but often overlooked step.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>How to Structure Multiple Sitemaps\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The sitemap index file is the organizing layer. It lists all individual sitemap files and is the single file submitted to Search Console. 
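\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Assuming per-content-type sitemaps (the file names here are illustrative), a sitemap index follows the same sitemaps.org format:\u003C\u002Fspan>\u003C\u002Fp>\u003Cpre>\u003Ccode>&lt;?xml version=\"1.0\" encoding=\"UTF-8\"?&gt;
&lt;sitemapindex xmlns=\"http:\u002F\u002Fwww.sitemaps.org\u002Fschemas\u002Fsitemap\u002F0.9\"&gt;
  &lt;sitemap&gt;
    &lt;loc&gt;https:\u002F\u002Fwww.example.com\u002Fsitemap-products.xml&lt;\u002Floc&gt;
    &lt;lastmod&gt;2026-03-27&lt;\u002Flastmod&gt;
  &lt;\u002Fsitemap&gt;
  &lt;sitemap&gt;
    &lt;loc&gt;https:\u002F\u002Fwww.example.com\u002Fsitemap-blog.xml&lt;\u002Floc&gt;
  &lt;\u002Fsitemap&gt;
&lt;\u002Fsitemapindex&gt;\u003C\u002Fcode>\u003C\u002Fpre>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">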
The structure looks like a hierarchy: one index file pointing to multiple sitemap files, each containing a subset of URLs.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Common approaches to splitting include organizing by content type (products, categories, blog posts, pages), by site section (matching the URL structure), by language or region (for multilingual sites using hreflang), or by update frequency (frequently changing content in one sitemap, stable content in another).\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The sitemap index file itself has the same 50,000 URL limit, meaning it can reference up to 50,000 individual sitemap files. For the vast majority of sites, that ceiling is effectively unlimited. The referenced sitemaps must be hosted on the same site and in the same directory or lower in the site hierarchy as the index file, unless cross-site submission is configured.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">For most CMS platforms, sitemap generation is handled automatically. WordPress plugins like Yoast SEO split sitemaps by content type by default. Other platforms may generate a single sitemap that needs to be manually split as the site grows. 
Custom-built sites can use server-side scripts or cron jobs to generate and update sitemaps on a schedule, which is the approach the original Reddit poster was describing.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>Sitemap Maintenance\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">A sitemap that isn’t maintained creates more problems than no sitemap at all. Stale sitemaps with broken URLs, removed pages, or inaccurate lastmod dates waste crawl budget and send misleading signals about the site’s structure.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The core maintenance tasks are straightforward: remove URLs that return 404 or redirect, update lastmod dates only when content actually changes, add new pages as they’re published, remove pages that are set to noindex, and verify that every listed URL resolves to a 200 status code with the correct canonical tag.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Google Search Console’s sitemap report and page indexing report are the primary monitoring tools. They show how many URLs were submitted, how many are indexed, and where errors are occurring. 
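\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The verification pass above can be scripted; the sketch below is illustrative only (function names and sample URLs are assumptions), with crawl results hard-coded in place of real HTTP checks:\u003C\u002Fspan>\u003C\u002Fp>

```python
# Illustrative sitemap audit sketch (not a standard tool):
# parse a urlset, then flag entries that a crawl found to be non-200.
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap(xml_text):
    """Return (loc, lastmod) pairs; lastmod is None when the tag is absent."""
    root = ET.fromstring(xml_text)
    return [(u.findtext(NS + "loc").strip(), u.findtext(NS + "lastmod"))
            for u in root.findall(NS + "url")]

def flag_for_removal(entries, status_by_url):
    """URLs that don't resolve to a 200 (redirects, 404s) should be cleaned out."""
    return [loc for loc, _ in entries if status_by_url.get(loc) != 200]

SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/a</loc><lastmod>2026-01-05</lastmod></url>
  <url><loc>https://www.example.com/b</loc></url>
</urlset>"""

entries = parse_sitemap(SITEMAP)
# Status codes would come from an HTTP crawl; hard-coded here for illustration.
stale = flag_for_removal(entries, {"https://www.example.com/a": 200,
                                   "https://www.example.com/b": 301})
# stale now lists the /b URL, which redirects and should be dropped.
```

\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">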
Checking these reports regularly, especially after site changes, content migrations, or URL structure updates, catches problems before they affect visibility.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>The Bottom Line on Splitting\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Mueller’s response on Reddit confirms what experienced technical SEOs have known but rarely see documented from Google’s side: splitting sitemaps is a management and monitoring strategy, not just a response to hitting size limits. The strategic benefits of tracking different content types independently in Search Console, separating evergreen from frequently updated content, planning for growth, and managing hreflang complexity all make multiple sitemaps a better default than a single monolithic file for any site with meaningful scale or growth ambitions.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Splitting a sitemap won’t hurt SEO. Google processes sitemap index files and individual sitemaps the same way regardless of how many files are involved. The URLs are what matter, not how they’re organized across files. 
The organization serves the site owner’s ability to monitor and maintain the sitemap, not the search engine’s ability to read it.\u003C\u002Fspan>\u003C\u002Fp>","XML Sitemaps in 2026: When and Why to Split Them, and What John Mueller Says About Multiple Sitemaps","https:\u002F\u002Fwebsite-cdn.nobsmarketplace.com\u002Fuploads\u002Ffeatured-images\u002Fxml-sitemaps-in-2026-infographic-20260406075705-IUhkrigN.png","XML Sitemaps in 2026 : Practical Information","XML Sitemaps in 2026: When and Why to Split Them, and What Mueller Says About Multiple Sitemaps",1445,"2026-04-06T07:45:12.000000Z","2026-04-06T07:56:14.000000Z","2026-04-06T07:57:11.000000Z",{"id":50,"name":61,"email":62,"about":16,"avatar":63,"created_at":64,"updated_at":16,"deleted_at":16},[],{"id":82,"author_id":50,"title":83,"slug":84,"content":85,"short_summary":86,"featured_image":87,"status":14,"meta_title":88,"meta_description":86,"canonical_url":16,"keywords":16,"blog_type":17,"is_featured":56,"word_count":89,"published_at":90,"created_at":91,"updated_at":92,"deleted_at":16,"author":93,"categories":94},317,"Five Years of Google Core Updates and What Mueller Revealed About How They Roll Out","five-years-of-google-core-updates-and-what-mueller-revealed-about-how-they-roll-out","\u003Ch1>\u003Cspan style=\"font-size: 16pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>Five Years of Google Core Updates and What Mueller Just Revealed About How They Actually Roll Out\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh1>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">Google’s John Mueller responded to a question on Bluesky on March 31, 2026, about whether core updates roll out in stages or follow a fixed sequence. 
The answer he gave is one of the clearest explanations of how core updates actually work that Google has shared publicly, and it reframes how the SEO industry should think about the volatility waves that show up during every rollout.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">Mueller’s response: “We generally don’t announce ‘stages’ of core updates. Since these are significant, broad changes to our search algorithms and systems, sometimes they have to work step-by-step, rather than all at one time. (It’s also why they can take a while to be fully live.)”\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">He followed up with a second post that went further: “I guess in short there’s not a single ‘core update machine’ that’s clicked on (every update has the same flow), but rather we make the changes based on what the teams have been working on, and those systems &amp; components can change from time to time.”\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">Two things stand out from that exchange. First, core updates are not a single switch being flipped. They’re a collection of changes across multiple systems, deployed incrementally. Second, the composition of a core update varies from one release to the next. The systems and components involved aren’t fixed. Different teams contribute different changes depending on what they’ve been working on, which means no two core updates are structurally identical even if they carry the same “core update” label.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">That context makes the last five years of core update history easier to read. 
The variations in rollout duration, in which industries get hit, and in how recovery behaves across different updates all make more sense when the update itself is understood as a variable collection of component changes rather than a single algorithmic adjustment.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>The Core Update Timeline: 2021 to 2026\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">Google has released 17 core updates since June 2021. The pace has been fairly consistent at three to four per year, though the character of these updates has shifted significantly over time.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">\u003Cstrong>2021\u003C\u002Fstrong> saw three core updates: June (June 2 to June 12), July (July 1 to July 12), and November (November 17 to November 30). The June and July updates were unusual because Google explicitly announced them as two parts of a broader change, with the July update completing work that began in June. Rollouts were relatively short, ranging from 10 to 14 days. The updates focused primarily on content relevance and quality signals as Google continued refining the systems that had been in development since the 2019 BERT update.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">\u003Cstrong>2022\u003C\u002Fstrong> brought two core updates: May (May 25 to June 9) and September (September 12 to September 26). But 2022’s bigger story was the launch of the Helpful Content Update in August, which introduced a site-wide signal designed to penalize content created primarily for search engine rankings rather than human readers. 
The Helpful Content system operated as a separate signal from the core algorithm, applying a domain-level penalty that could drag down rankings for an entire site if a significant portion of its content was deemed unhelpful. A second Helpful Content Update followed in December 2022.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">\u003Cstrong>2023\u003C\u002Fstrong> was Google’s busiest year for core updates, with four: March (March 15 to March 28), August (August 22 to September 7), October (October 5 to October 19), and November (November 2 to November 28), the last of which ran nearly four weeks. The September 2023 Helpful Content Update hit particularly hard and became one of the most discussed updates in recent SEO history. Many sites that lost visibility in September 2023 spent the next year waiting for recovery that, in some cases, never came. Affiliate sites, product review sites, and content-heavy publishers were the most affected categories.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">\u003Cstrong>2024\u003C\u002Fstrong> started with the March 2024 core update, which turned out to be the most significant algorithmic change in years. It ran for 45 days, the longest core update rollout on record, and it did two things that changed the landscape permanently. First, Google absorbed the Helpful Content system into the core algorithm. The separate signal that had been running since August 2022 was folded into how Google evaluates every query, which meant helpfulness assessment shifted from a standalone system to a component of core ranking. Second, Google moved from site-level helpfulness evaluation to page-level evaluation, using a combination of signals rather than a single sitewide penalty score. Google’s stated goal was to reduce low-quality, unoriginal content in search results by 40%. 
Three new spam policies launched simultaneously: expired domain abuse, scaled content abuse, and site reputation abuse. The August update (August 15 to September 3) and two more core updates in November (November 11 to December 5) and December (December 12 to December 18) followed. The December update was notable for its speed, completing in just six days.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">\u003Cstrong>2025\u003C\u002Fstrong> brought three core updates and marked the year where AI-related content quality became a central focus. The March update (March 13 to March 27) put continued pressure on AI-generated content that lacked original analysis or first-hand experience. The June update (June 30 to July 17) appeared to weigh off-page factors more heavily, with link quality, brand authority, and topical relevance of referring domains playing a larger role. More than 50% of sites that had been affected by the September 2023 Helpful Content Update saw improvements during the June 2025 update, suggesting that the page-level evaluation introduced in March 2024 was finally catching up to sites that had fixed their content quality issues. 
The December update (December 11 to December 29) expanded E-E-A-T requirements beyond traditional YMYL categories into virtually all competitive queries.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">\u003Cstrong>2026\u003C\u002Fstrong> has already seen the February Discover core update (the first core update specific to the Discover feed rather than general search), the March 2026 spam update, and the March 2026 core update, which began rolling out on March 27.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>What Changed Between 2021 and 2026\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">Looking at five years of updates in sequence, a few shifts stand out.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">The Helpful Content integration was the single biggest structural change. What started as a separate system in August 2022 became part of core ranking in March 2024, and the shift from site-level to page-level evaluation changed how recovery works. Before March 2024, a site penalized by the Helpful Content signal needed to improve its overall content quality across the domain. After March 2024, individual pages are evaluated independently, which means a site can have some pages performing well while others are still suppressed.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">The pace of updates has stayed consistent, but the composition has become more complex. Mueller’s Bluesky explanation confirms what practitioners have observed: each core update touches different systems depending on what Google’s teams have been working on. 
The March 2024 update took 45 days because it combined multiple major changes (Helpful Content integration, page-level evaluation, new spam policies). The December 2024 update completed in six days, likely because it involved fewer component changes. Rollout duration is a rough proxy for how many systems are being updated simultaneously.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">Recovery patterns have become more gradual. In the earlier core updates (2021-2022), ranking changes tended to settle within the rollout period. In recent updates, particularly after the March 2024 changes, recovery from previous updates has appeared in later core updates rather than within the same cycle. Google has stated that some changes may take months to be reassessed, and some effects require waiting for the next update cycle.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">Off-page signals have gained weight over time. The June 2025 core update was notable for how heavily link quality, brand authority, and referring domain relevance appeared to influence outcomes. 
Combined with the growing role of backlink profiles in AI citation data (SE Ranking found that sites with 32,000+ referring domains are 3.5x more likely to be cited by ChatGPT), the signal is consistent: \u003C\u002Fspan>\u003Ca target=\"_blank\" rel=\"noopener noreferrer nofollow\" class=\"text-primary-blue-600 hover:underline\" href=\"https:\u002F\u002Fnobsmarketplace.com\u002Flink-building\">\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">link building\u003C\u002Fspan>\u003C\u002Fa>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\"> from relevant, authoritative sources feeds both traditional search rankings and the newer AI visibility layer.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>Mueller’s Explanation and What It Means for Reading Core Updates\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">Mueller’s description of core updates as collections of team-driven component changes rather than a single algorithmic switch explains several patterns that have puzzled practitioners.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">It explains why volatility during a rollout comes in waves rather than all at once. If different components go live at different points during the rollout window, rankings can shift, settle, shift again, and settle again as each component takes effect. 
The waves of volatility that practitioners observe during the typical 2-3 week rollout period likely correspond to different system components going live at different times.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">It explains why different industries get affected at different points during the same update. If one component targets content quality signals and another targets link evaluation, the industry impacts won’t be simultaneous. Sites heavily dependent on link signals might see movement early while content-driven shifts appear later, or vice versa, depending on when each component deploys.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">It explains why recovery sometimes happens mid-rollout. If a site was suppressed by a component that goes live early in the rollout, and a later component reevaluates the same signals more favorably, the site could see partial recovery before the update officially completes.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">And it explains why Google says to wait until the rollout is fully complete before drawing conclusions. If the update is a sequence of component deployments rather than a single change, any assessment made mid-rollout is based on an incomplete picture. 
The final state after all components have deployed may look different from the state at any individual point during the rollout.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>What to Do During and After a Core Update\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">The guidance hasn’t changed much over five years, which is itself informative: the principles Google rewards have been consistent even as the systems evaluating them have evolved.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">Focus on content quality, depth, originality, and demonstrated expertise. The Helpful Content integration into core ranking made these signals central to how Google evaluates every page. Content created primarily to rank rather than to genuinely help users has been consistently penalized across every major update since 2022.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">Build authority through editorial relationships and earned coverage. 
The increasing weight of off-page signals in recent core updates, combined with the growing importance of third-party presence for AI visibility, makes \u003C\u002Fspan>\u003Ca target=\"_blank\" rel=\"noopener noreferrer nofollow\" class=\"text-primary-blue-600 hover:underline\" href=\"https:\u002F\u002Fnobsmarketplace.com\u002Fdigital-pr\">\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">digital PR\u003C\u002Fspan>\u003C\u002Fa>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\"> and \u003C\u002Fspan>\u003Ca target=\"_blank\" rel=\"noopener noreferrer nofollow\" class=\"text-primary-blue-600 hover:underline\" href=\"https:\u002F\u002Fnobsmarketplace.com\u002Fguest-posting\">\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">guest posting\u003C\u002Fspan>\u003C\u002Fa>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\"> on relevant industry publications a dual-purpose investment that serves both traditional rankings and AI citation systems.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">Don’t make reactive changes during a rollout. Mueller’s explanation confirms that the ranking state mid-rollout is incomplete. Wait for the rollout to finish, then evaluate whether changes are needed based on the settled state rather than the intermediate fluctuations.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">Track performance across update cycles, not just within them. Recovery from one core update may appear in a subsequent core update months later. 
A site that lost visibility in March may not see recovery until June or later, and that timeline is normal rather than a sign of permanent penalty.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Cambria, serif;\">And monitor what Google communicates about each update. Google doesn’t always disclose what a core update targets, but when they do provide guidance (as they did extensively with the March 2024 update), that guidance tends to remain relevant across subsequent updates in the same evolutionary direction.\u003C\u002Fspan>\u003C\u002Fp>","Documenting the last 5 years Google Core Updates and showing lessons to take","https:\u002F\u002Fwebsite-cdn.nobsmarketplace.com\u002Fuploads\u002Ffeatured-images\u002Fgoogle-core-update-timeline-and-tools-20260403072610-YxbHZxYr.png","Five Years of Google Core Updates and What Mueller Revealed",1828,"2026-04-03T07:11:57.000000Z","2026-04-03T07:22:11.000000Z","2026-04-03T07:27:28.000000Z",{"id":50,"name":61,"email":62,"about":16,"avatar":63,"created_at":64,"updated_at":16,"deleted_at":16},[95],{"id":8,"name":96,"slug":97,"created_at":28,"updated_at":28,"deleted_at":16,"pivot":98},"SEO","seo",{"blog_id":82,"category_id":8},{"id":100,"author_id":50,"title":101,"slug":102,"content":103,"short_summary":104,"featured_image":105,"status":14,"meta_title":106,"meta_description":107,"canonical_url":16,"keywords":16,"blog_type":17,"is_featured":18,"word_count":108,"published_at":109,"created_at":110,"updated_at":111,"deleted_at":16,"author":112,"categories":113},316,"AI Visibility in 2026: What Actually Gets Brands Cited by LLMs","ai-visibility-2026-what-gets-brands-cited","\u003Ch1>\u003Cspan style=\"font-size: 16pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>AI Visibility in 2026: What Actually Gets Brands Cited by LLMs\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh1>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: 
Calibri, sans-serif;\">A year ago, AI visibility was a concept most marketers were still treating as theoretical. By March 2026, it’s become measurable, trackable, and consequential enough that brands are losing and gaining market share based on whether AI systems recommend them.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The change happened faster than most of the industry expected. SE Ranking’s data shows AI platforms now account for 0.24% of global internet traffic, up 1.6x from 2025. This percentage still sounds small until you consider what those visits represent: high-intent users getting direct recommendations from AI systems that have already decided which brands to surface. There’s no scrolling through results. No clicking across tabs to compare. The AI picks, and the user follows.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The question every brand operating in search should be asking is straightforward: what determines whether an AI system cites you or your competitor?\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>The Sources AI Systems Actually Pull From\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Peec AI published an analysis of 30 million sources across five major AI platforms (ChatGPT, Google AI Mode, Gemini, Perplexity, and AI Overviews) to identify which domains get cited most frequently. 
The top 10 most-cited domains across all platforms: Reddit, YouTube, LinkedIn, Wikipedia, Forbes, Facebook, Yelp, Amazon, TechRadar, and Healthline.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The list is revealing for what it says about how AI systems evaluate trustworthiness. The top sources aren’t all traditional media outlets or high-authority publications. They’re a mix of user-generated discussion platforms (Reddit), video content (YouTube), professional networks (LinkedIn), reference sites (Wikipedia), editorial publications (Forbes, TechRadar, Healthline), and commercial platforms with review ecosystems (Amazon, Yelp).\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">What connects them is that each platform provides a type of information that AI models find useful for building confident answers: real user experiences on Reddit, visual demonstrations on YouTube, professional credibility signals on LinkedIn, factual grounding on Wikipedia, editorial validation from established publications, and crowd-sourced ratings on review platforms.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The more interesting finding from the Peec AI analysis is how the source preferences diverge across platforms. Reddit and YouTube appear across all five AI systems, which explains their top-line dominance. But beyond those two, each platform has its own preferences. ChatGPT leans toward Wikipedia and editorial sources like Forbes and TechRadar. Google’s AI Mode and AI Overviews favor social content and local review platforms like Facebook and Yelp. 
Perplexity emphasizes Reddit, LinkedIn, and B2B review platforms like G2.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The divergence matters because it means AI visibility isn’t one thing. A brand that’s well-cited in ChatGPT might be invisible in Google’s AI Mode, and vice versa. The Writesonic study covered in a previous NO-BS blog post showed only 7% citation overlap between ChatGPT’s default and premium models. The Peec AI data suggests the divergence extends across platforms, not just within them.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>What’s Changed About How AI Visibility Works\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">A year ago, the working assumption in SEO was that traditional ranking signals would translate fairly directly into AI citations. If a page ranked well on Google, AI systems would probably cite it too. That assumption has turned out to be partially true and partially misleading.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Ahrefs’ data from late 2025 showed that AI Overviews have the strongest correlation with traditional search rankings among all AI platforms. For Google’s own AI features, the connection between organic ranking and AI citation is real. But for ChatGPT, Perplexity, and other non-Google AI systems, the relationship is weaker. The Writesonic study found that 75% of domains cited by ChatGPT’s premium model don’t appear on Google or Bing at all. 
ChatGPT identifies brands from training data and queries their sites directly rather than pulling from search rankings.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">SE Ranking’s research from November 2025 added another dimension: domain authority and third-party presence matter significantly. Sites with over 32,000 referring domains are 3.5x more likely to be cited by ChatGPT than those with fewer than 200. Domains with active profiles on review platforms like Trustpilot, G2, Capterra, and Yelp have 3x higher chances of being cited. And domains with millions of brand mentions on Reddit and Quora have roughly 4x higher citation rates than those with minimal activity on those platforms.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The pattern is consistent: AI systems don’t just look at the brand’s own website. They look at how the brand shows up across the web. The ecosystem of third-party mentions, reviews, discussions, and editorial coverage surrounding a brand is as important as the brand’s own content.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>The Two-Layer Visibility Problem\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The landscape in 2026 gets interesting when you look at where brands are actually investing. AI visibility operates on two layers simultaneously, and most brands are only working on one of them.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The first layer is on-site: the brand’s own website content. 
Content structure, clarity, depth, freshness, and technical accessibility all influence whether an AI system can retrieve and use the content effectively. Research from Growth Memo found that 44.2% of all LLM citations come from the first 30% of a page’s text, which means content structure and front-loading key information directly affect citation probability. AirOps found that ChatGPT only cites 15% of the pages it retrieves, meaning 85% of content that gets pulled into the model’s processing never makes it into the final answer.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The second layer is off-site: how the brand appears across the third-party sources that AI systems trust. Reddit threads, YouTube videos, LinkedIn posts, Wikipedia references, review platform profiles, editorial coverage in industry publications, and brand mentions in forums and communities. This is the layer the Peec AI data highlights. The most-cited domains in AI search are overwhelmingly third-party platforms, not brand-owned websites. The brands getting cited are the ones showing up consistently across these external sources.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Most brands have invested heavily in the first layer (their own content) while underinvesting in the second layer (their presence across the third-party sources AI actually prefers). 
The Peec AI data suggests that rebalancing that investment is one of the highest-leverage moves available.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>Where Link Building and Digital PR Fit\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The connection between traditional \u003C\u002Fspan>\u003Ca target=\"_blank\" rel=\"noopener noreferrer nofollow\" class=\"text-primary-blue-600 hover:underline\" href=\"https:\u002F\u002Fnobsmarketplace.com\u002Flink-building\">\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">link building\u003C\u002Fspan>\u003C\u002Fa>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\"> and AI visibility is becoming clearer with every new study. The SE Ranking finding that sites with 32,000+ referring domains are 3.5x more likely to be cited by ChatGPT points directly at backlink profiles as an AI visibility signal. The Peec AI data showing editorial publications like Forbes and TechRadar among the top cited domains reinforces the value of \u003C\u002Fspan>\u003Ca target=\"_blank\" rel=\"noopener noreferrer nofollow\" class=\"text-primary-blue-600 hover:underline\" href=\"https:\u002F\u002Fnobsmarketplace.com\u002Fdigital-pr\">\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">digital PR\u003C\u002Fspan>\u003C\u002Fa>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\"> placements on authoritative sites.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">But the data also suggests that the type of link building matters more than it used to. 
Getting a backlink on a high-DA site that AI systems don’t cite doesn’t help AI visibility. Getting a mention or placement on a site that AI systems actively pull from (Reddit, LinkedIn, YouTube, G2, industry-specific publications) does. \u003C\u002Fspan>\u003Ca target=\"_blank\" rel=\"noopener noreferrer nofollow\" class=\"text-primary-blue-600 hover:underline\" href=\"https:\u002F\u002Fnobsmarketplace.com\u002Fguest-posting\">\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Guest posting\u003C\u002Fspan>\u003C\u002Fa>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\"> on the publications that show up in AI citation data, earning editorial coverage through digital PR campaigns that land on sites AI trusts, and building a presence on the review platforms and discussion forums that LLMs retrieve from are all direct inputs to AI citation probability.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The Stacker research from December 2025 quantified part of this: distributing content to a wide range of publications can increase AI citations by up to 325% compared to publishing only on your own site. Earned media coverage across trusted third-party sources feeds the second visibility layer that most brands are underinvesting in.\u003C\u002Fspan>\u003C\u002Fp>\u003Ch2>\u003Cspan style=\"font-size: 14pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">\u003Cstrong>The Multi-Platform Reality\u003C\u002Fstrong>\u003C\u002Fspan>\u003C\u002Fh2>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The SE Ranking AI traffic data from a recent NO-BS blog post showed Gemini’s referral traffic growing at 47% per month while ChatGPT’s declined at 8% per month. The Peec AI data shows different platforms citing different sources. 
The Writesonic study showed different models within the same platform citing almost entirely different sources.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The combined picture in March 2026 is that AI visibility has become a multi-platform, multi-model challenge where no single optimization approach covers everything. The brands building AI visibility that holds up across platforms are the ones investing in the foundation that all AI systems draw from: authoritative backlinks, consistent brand presence across trusted third-party platforms, strong review profiles, and content that’s structured for AI retrieval.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">The platform-specific rankings will keep shifting. What gets a brand cited in Perplexity today may differ from what gets it cited in Gemini next quarter. The authority foundation underneath is what stays constant.\u003C\u002Fspan>\u003C\u002Fp>\u003Cp>\u003Cspan style=\"font-size: 12pt; color: rgb(0, 0, 0); font-family: Calibri, sans-serif;\">Peec AI study: analysis of 30 million sources across ChatGPT, Google AI Mode, Gemini, Perplexity, and AI Overviews, March 2026.\u003C\u002Fspan>\u003C\u002Fp>","How do LLM tools cite brands? The answer is a bit complex, but digital PR and high authority seem to lead the way","https:\u002F\u002Fwebsite-cdn.nobsmarketplace.com\u002Fuploads\u002Ffeatured-images\u002Fimage-apr-2-2026-09-48-17-am-20260402074850-MmACyW63.png","AI Visibility in 2026: How Brands Get Cited by LLM Tools","Are you curious about how to get cited by AI tools in 2026? The answer is in this data-based blog post. 
Take a look!",1345,"2026-04-02T07:37:11.000000Z","2026-04-02T07:51:23.000000Z","2026-04-03T07:27:39.000000Z",{"id":50,"name":61,"email":62,"about":16,"avatar":63,"created_at":64,"updated_at":16,"deleted_at":16},[114],{"id":115,"name":116,"slug":117,"created_at":118,"updated_at":118,"deleted_at":16,"pivot":119},23,"AI","ai","2026-03-10T11:18:29.000000Z",{"blog_id":100,"category_id":115},[121,126,154],{"id":100,"author_id":50,"title":101,"slug":102,"featured_image":105,"published_at":109,"short_summary":104,"word_count":108,"author":122,"categories":123},{"id":50,"name":61,"avatar":63,"email":62},[124],{"id":115,"name":116,"pivot":125},{"blog_id":100,"category_id":115},{"id":127,"author_id":50,"title":128,"slug":129,"featured_image":130,"published_at":131,"short_summary":132,"word_count":133,"author":134,"categories":135},314,"The “Global Spanish” Problem in AI Search: Why LLMs Can’t Tell Spanish-Speaking Markets Apart","global-spanish-problem-ai-search-visibility","https:\u002F\u002Fwebsite-cdn.nobsmarketplace.com\u002Fuploads\u002Ffeatured-images\u002Fjorono-international-2681322-1280-20260401111159-DmjGdnk0.jpg","2026-04-01T19:30:00.000000Z","AI search seems to be struggling to understand queries made in Spanish, mainly because the language isn't spoken only in Spain. Instead, it tends to return information from sources in other Spanish-speaking countries, including the United States, where Spanish is spoken by over a tenth of the population. 
What's a Spanish-speaking website to do amid all this?",1495,{"id":50,"name":61,"avatar":63,"email":62},[136,140,144,146,148,152],{"id":137,"name":138,"pivot":139},1,"Blogs",{"blog_id":127,"category_id":137},{"id":141,"name":142,"pivot":143},2,"Digital Marketing",{"blog_id":127,"category_id":141},{"id":8,"name":96,"pivot":145},{"blog_id":127,"category_id":8},{"id":43,"name":44,"pivot":147},{"blog_id":127,"category_id":43},{"id":149,"name":150,"pivot":151},15,"Industry News",{"blog_id":127,"category_id":149},{"id":115,"name":116,"pivot":153},{"blog_id":127,"category_id":115},{"id":155,"author_id":50,"title":156,"slug":157,"featured_image":158,"published_at":159,"short_summary":160,"word_count":161,"author":162,"categories":163},311,"Websites Are Getting 2x More AI Traffic from Gemini Than Five Months Ago. ChatGPT Is Declining.","gemini-vs-chatgpt-ai-traffic-trends","https:\u002F\u002Fwebsite-cdn.nobsmarketplace.com\u002Fuploads\u002Ffeatured-images\u002Fgradientarc-ai-generated-8942974-1280-20260331031020-rnI2qMeR.jpg","2026-03-31T11:18:00.000000Z","Despite still dominating the market, ChatGPT is under threat from Gemini after a recent study found that Gemini drives twice as much traffic to websites. And to think that Google's prized model finished the previous year with a relatively weak turnout.",1123,{"id":50,"name":61,"avatar":63,"email":62},[164,166,168,170,172],{"id":137,"name":138,"pivot":165},{"blog_id":155,"category_id":137},{"id":8,"name":96,"pivot":167},{"blog_id":155,"category_id":8},{"id":43,"name":44,"pivot":169},{"blog_id":155,"category_id":43},{"id":149,"name":150,"pivot":171},{"blog_id":155,"category_id":149},{"id":115,"name":116,"pivot":173},{"blog_id":155,"category_id":115}]