How to Read Google Search Console Metrics in 2026 (And Why Your Impression Data Has Been Wrong for 11 Months)
Google Search Console is the closest thing site owners have to a direct line into how Google sees their site. It shows which queries bring up pages in search results, how often those pages appear, how often users click through, and where pages rank. For anyone working in SEO, content strategy, or site management, the Performance report is the first place to look when something changes in organic traffic.
But the data in Search Console requires more interpretation than most people give it. The metrics look straightforward on the surface, and that simplicity creates a false sense of precision. Understanding what each metric actually measures, how Google defines it, and where the data can mislead is the difference between making informed decisions and chasing ghosts.
That distinction became especially relevant on April 3, 2026, when Google disclosed that a logging error had been inflating impression counts in Search Console since May 13, 2025. Nearly eleven months of overstated impressions, affecting every Performance report during that period. More on that at the end of this post, but the disclosure underscores why understanding what these metrics actually represent is worth the time.
Impressions: What Counts as “Seen”
An impression in Search Console is recorded when a URL from the site appears in a search result that a user could have seen. The key word is “could have.” The user doesn’t need to scroll down to the result. They don’t need to notice it. If Google’s systems determine the result was present on the page the user loaded, it counts as an impression.
For standard web results, that means a page ranking at position 1 of the first results page generates an impression for every search that loads those results. A page ranking at position 8 also generates an impression for that same search, even if the user never scrolled past position 3. A page ranking at position 11 (the second page of results) only generates an impression if the user actually clicks through to page two.
For features like AI Overviews, featured snippets, image packs, and knowledge panels, the impression counting works slightly differently depending on whether the content requires user interaction (like expanding a section) to become visible. Google’s documentation covers these edge cases, but the general principle is that an impression means “the result was available to be seen,” not “the user saw it.”
As of June 2025, AI Mode data is merged into Search Console performance totals under the “Web” search type. There’s currently no native way to separate AI Mode impressions from standard organic impressions, which adds another layer of ambiguity to the impression number.
Impressions are useful for tracking visibility trends over time, but they should never be treated as a measure of actual human attention. A page can accumulate thousands of impressions without a single person consciously noticing it in the results.
Clicks: The Most Reliable Metric in the Report
A click in Search Console is recorded when a user selects a result that takes them to a page outside of Google Search. Clicks on ads, clicks that keep the user within Google’s interface (like expanding a “People Also Ask” section), and clicks on AI Overview citations that don’t leave Google are generally not counted as clicks in the Performance report.
Clicks are the most trustworthy metric in Search Console because they represent a definitive user action. A user saw the result, decided it was worth visiting, and clicked through to the site. There’s no ambiguity about whether the interaction happened. The Google disclosure about the impression bug explicitly confirmed that clicks were not affected by the logging error, which reinforces clicks as the metric to anchor analysis around when other data is uncertain.
One thing to watch: Search Console click data and Google Analytics session data won’t match exactly. Search Console counts clicks on the Google side, while GA4 counts sessions on the site side. Redirects, slow-loading pages, users who click but close the tab before the page loads, and tracking script issues all create gaps between the two numbers. A consistent gap is normal. A sudden widening of the gap is worth investigating.
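One way to operationalize that last point is to track the gap between the two numbers over time and flag days where it suddenly widens. The sketch below uses invented daily figures and an arbitrary "double the recent baseline" threshold; real monitoring would pull from the Search Console and GA4 APIs and use a longer baseline window.

```python
# Hypothetical example: flag a sudden widening of the gap between
# Search Console clicks and GA4 organic sessions.
# All daily figures below are illustrative, not real exports.

def gap_ratio(clicks: int, sessions: int) -> float:
    """Fraction of Search Console clicks not matched by a GA4 session."""
    if clicks == 0:
        return 0.0
    return (clicks - sessions) / clicks

# (date, GSC clicks, GA4 organic sessions)
daily = [
    ("2026-03-01", 1200, 1080),
    ("2026-03-02", 1150, 1040),
    ("2026-03-03", 1230, 1110),
    ("2026-03-04", 1190, 830),   # gap suddenly widens
]

# Baseline gap from the first three days (a consistent gap is normal).
baseline = sum(gap_ratio(c, s) for _, c, s in daily[:3]) / 3

for date, clicks, sessions in daily:
    ratio = gap_ratio(clicks, sessions)
    # Flag days where the gap is more than double the recent baseline.
    if ratio > 2 * baseline:
        print(f"{date}: gap {ratio:.0%} vs baseline {baseline:.0%} -- investigate")
```

A steady 5–15% gap would never trigger this check; a redirect loop or broken tracking script deployed on a single day would.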
Click-Through Rate: A Calculated Metric, Not a Measured One
CTR in Search Console is calculated by dividing clicks by impressions. Because it’s derived from those two inputs, any issue with either metric directly affects CTR. If impressions are inflated (as they were from May 2025 through early 2026), CTR appears artificially low. If impressions are underreported, CTR appears artificially high.
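The distortion is easy to see with numbers. The figures below are invented, and the 30% inflation factor is a placeholder (Google has not published the actual magnitude of the over-count):

```python
# Illustrative numbers only: how inflated impressions distort CTR.
# The inflation factor is a hypothetical placeholder, not a figure
# Google has disclosed.
clicks = 500
true_impressions = 10_000
inflation_factor = 1.3

reported_impressions = int(true_impressions * inflation_factor)

true_ctr = clicks / true_impressions            # 5.00%
reported_ctr = clicks / reported_impressions    # ~3.85%

print(f"True CTR:     {true_ctr:.2%}")
print(f"Reported CTR: {reported_ctr:.2%}")
```

Clicks never changed, yet the reported CTR drops by more than a percentage point purely because the denominator grew.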
CTR is useful for comparing the relative performance of different pages or queries within the same time period, where the measurement conditions are consistent. It’s less reliable for comparing CTR across time periods that span data anomalies, algorithm updates, or changes in SERP layout (like the introduction of new features that change how much of the results page is occupied by non-organic elements).
The SERP landscape in 2026 includes AI Overviews, AI Mode, featured snippets, knowledge panels, People Also Ask, local packs, shopping results, video carousels, and image packs. A query where the organic result sits below an AI Overview and a People Also Ask section will naturally have a lower CTR than the same ranking position on a cleaner SERP, even if nothing about the page or its ranking has changed. CTR trends should always be read alongside an understanding of what the SERP actually looks like for the queries in question.
Average Position: A Blended Number
Average position in Search Console represents the average ranking of a page across all the queries it appeared for during the selected time period. A page that ranked position 2 for a high-volume query and position 45 for twenty low-volume queries could show an average position of 40+, which would make it look like a poorly performing page even though it ranks well for the query that actually drives traffic.
Average position is most useful when filtered to a specific query or a narrow set of related queries, where the blending effect is minimized. As a site-level or page-level aggregate across all queries, it’s often misleading. A page’s average position can improve (the number goes down) while traffic decreases, or worsen (the number goes up) while traffic increases, depending on which queries are entering or leaving the data set.
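The blending effect from the example above can be reproduced directly. This sketch uses a simple per-query average, which mirrors the article's framing; Search Console's own calculation weights by appearances, so the real number would differ, but the distortion is the same in kind:

```python
# Made-up data reproducing the blending effect: one query ranking #2,
# twenty low-volume queries ranking #45.
positions = [2] + [45] * 20

average_position = sum(positions) / len(positions)
print(f"Average position: {average_position:.1f}")
```

The result lands around 43, even though the one query that drives traffic ranks at position 2. Filtering to that single query would show the true picture.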
How to Use the Metrics Together
The metrics in Search Console work best as a system rather than individually. Each one answers a different question.
Impressions answer: how visible is the site for a given set of queries? Clicks answer: how many users actually visited the site from search? CTR answers: of the people who could have seen the result, what percentage clicked? Average position answers: where did the page typically rank for the queries it appeared in?
When impressions increase but clicks stay flat, the site is appearing for more queries (or the same queries more often) but those new appearances aren’t generating traffic. Possible explanations include ranking for queries where the result sits below the fold, appearing in positions where AI Overviews or other features absorb the click, or ranking for queries with informational intent where users get their answer from the SERP without clicking.
When clicks increase but impressions stay flat, the site is getting a higher CTR for its existing visibility, which usually means improved rankings for high-intent queries, better meta descriptions or titles earning more clicks at the same position, or queries shifting to SERPs where the organic result is more prominent.
When both impressions and clicks drop, either the site has lost rankings, queries the site ranked for have declined in volume, or a SERP layout change has pushed organic results further down. Search Console alone can’t distinguish between these causes, so pairing the data with rank tracking tools and SERP analysis fills the gap.
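The three patterns above can be sketched as a rough triage function. The 5% threshold and the messages are arbitrary placeholders; real analysis would use longer comparison windows and per-query breakdowns rather than two site-level deltas:

```python
# A rough triage sketch of the impression/click patterns described above.
# Thresholds and labels are arbitrary placeholders for illustration.

def triage(impr_change: float, click_change: float, threshold: float = 0.05) -> str:
    """Map period-over-period % changes to the diagnostic patterns."""
    impr_up = impr_change > threshold
    impr_flat = abs(impr_change) <= threshold
    clicks_up = click_change > threshold
    clicks_flat = abs(click_change) <= threshold

    if impr_up and clicks_flat:
        return "Visibility up, traffic flat: check positions, fold placement, SERP features"
    if impr_flat and clicks_up:
        return "CTR improved: check rankings, titles/descriptions, SERP layout"
    if impr_change < -threshold and click_change < -threshold:
        return "Both down: pair GSC data with rank tracking and SERP analysis"
    return "No clear pattern: segment by query and page before concluding anything"

print(triage(0.30, 0.01))    # impressions up, clicks flat
print(triage(0.00, 0.20))    # clicks up, impressions flat
print(triage(-0.15, -0.18))  # both down
```

The point is not the code itself but the discipline it encodes: no single metric movement has one explanation, and each pattern sends the investigation somewhere different.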
Why Clicks, Not Impressions, Should Lead Reporting
For link building and digital PR campaigns where the goal is driving organic traffic growth, clicks are the metric that connects directly to business outcomes. Impressions indicate visibility potential. Clicks indicate actual visits. CTR indicates conversion efficiency from impression to visit. Average position indicates competitive standing.
The temptation to lead with impressions is understandable because the numbers are larger and growth looks more dramatic on a chart. But as the eleven-month impression bug demonstrates, impression data carries more measurement risk than click data. Building reporting and strategy around clicks as the primary metric, with impressions and CTR as supporting context, produces more reliable analysis.
The Impression Bug: What Happened and What to Do About It
On April 3, 2026, Google updated its Data Anomalies in Search Console page with the following disclosure: “A logging error is preventing Search Console from accurately reporting impressions from May 13, 2025 onward. This issue will be resolved over the next few weeks; as a result, you may notice a decrease in impressions in the Search Console Performance report. Clicks and other metrics were not affected by the error, and this issue affected data logging only.”
A Google spokesperson confirmed to Search Engine Land: “We identified a reporting error in Search Console that temporarily led to an over-reporting of impressions from May 13, 2025 onward. Bug fixes are being implemented to ensure accurate reporting.”
Eleven months of inflated impression data across every Search Console property. Any analysis, report, or strategic decision made during that period using impression counts or CTR as inputs was working with inaccurate data. CTR calculations during the affected period would have appeared lower than reality because the denominator (impressions) was artificially large.
The impression bug also coincided with Google merging AI Mode data into Search Console totals in June 2025, which means impression trend lines from mid-2025 through early 2026 contain at least two significant data discontinuities. Year-over-year comparisons spanning this period are unreliable for impressions and CTR.
What to do now: annotate dashboards and reports to mark May 13, 2025 as a data discontinuity point. Use clicks as the primary metric for evaluating performance during the affected period, since Google confirmed clicks were not impacted. Export historical data before the fix fully propagates to preserve a record of both the pre-correction and post-correction numbers. Once the correction is complete, re-baseline impression data and recalculate CTR using the corrected figures. And resist the urge to interpret the forthcoming impression drop as a performance decline. The impressions are going down because the data is becoming more accurate, not because anything changed about the site’s actual visibility.
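A minimal sketch of the annotation step, using invented monthly figures: flag every row that falls inside the affected window so that anyone reading the export knows which impression and CTR numbers to distrust. The window end here is the disclosure date, which is an assumption; Google said the fix would roll out "over the next few weeks," so the true boundary is fuzzy.

```python
from datetime import date

# Hypothetical monthly export: (month, impressions, clicks).
# All figures are invented for illustration.
rows = [
    (date(2025, 4, 1), 90_000, 2_700),
    (date(2025, 6, 1), 140_000, 2_750),   # impressions inflated by the bug
    (date(2026, 5, 1), 95_000, 2_800),    # post-correction
]

BUG_START = date(2025, 5, 13)  # start date from Google's disclosure
BUG_END = date(2026, 4, 3)     # disclosure date; the actual fix rolled out later

for month, impressions, clicks in rows:
    affected = BUG_START <= month <= BUG_END
    ctr = clicks / impressions
    note = "  <-- affected window: treat impressions/CTR as unreliable" if affected else ""
    print(f"{month}: {impressions:>7} impressions, CTR {ctr:.2%}{note}")
```

Note that clicks stay comparable across all three rows; only the impression-derived figures need the caveat.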
