EMA: IT and Data Management Research, Industry Analysis and Consulting

Cybersecurity & Marketing in the Wake of Google's September Data Disruption

Written by Chris Steffen | Oct 8, 2025 4:43:06 PM

The world of digital marketing thrives on data, and for years, Google Search Console (GSC) and third-party SEO tools have been our compass. But as an industry analyst, I can tell you that the past few weeks have felt like driving a car without windows! What initially appeared to be inexplicable fluctuations in our SEO dashboards has coalesced into a clear, albeit unsettling, picture: Google quietly pulled the plug on a foundational data-gathering mechanism, unleashing a wave of disruption that impacts everyone from enterprise marketers to cybersecurity businesses.

The Quiet Parameter Change, the Loud Aftermath

Around 10–12 September 2025, Google disabled the &num=100 URL parameter. For the uninitiated, this seemingly minor technical tweak had allowed users and, crucially, automated SEO tools to view up to 100 search results on a single page. Suddenly, Google limited all standard searches to a mere 10 results per page, rendering the parameter obsolete. The impact was immediate and widespread.
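To make the mechanics concrete, here is a minimal sketch of what changed. The URLs are illustrative of the documented query parameters only; the query string and helper function are my own construction, not any tool's actual code:

```python
from urllib.parse import urlencode

def serp_url(query, num=None, start=0):
    """Build an illustrative Google results URL.

    `num` was the (now-disabled) results-per-page parameter;
    `start` is the pagination offset (0, 10, 20, ...).
    """
    params = {"q": query}
    if num is not None:
        params["num"] = num  # formerly honored up to 100 results per page
    if start:
        params["start"] = start
    return "https://www.google.com/search?" + urlencode(params)

# Before ~12 September 2025: one request could cover the top 100.
old_style = serp_url("zero trust architecture", num=100)

# After: each request returns at most 10 results, so covering the same
# depth means paginating with `start` across ten separate pages.
new_style = [serp_url("zero trust architecture", start=s) for s in range(0, 100, 10)]
```

The single `num=100` request and the ten paginated requests retrieve the same nominal depth of results, which is exactly why the change multiplied data-collection effort for rank trackers.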

This wasn't just a minor inconvenience; it essentially broke the methodology of countless third-party rank tracking platforms overnight. Tools like AccuRanker, Semrush, Ahrefs, and Moz, which relied on this parameter for efficient data collection, found their systems generating incomplete or wildly inaccurate reports. Where they once made a single query for 100 results, they now need ten paginated queries, and roughly ten times the cost, to gather the same data. While enterprise-grade platforms like Semrush, with their robust infrastructure, have largely stabilized for top 10–20 rankings, smaller tools have been forced to adapt by limiting deep-ranking visibility or passing increased operational costs on to their users.
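The cost multiplier falls out of simple arithmetic. A hedged sketch, with illustrative volumes rather than any vendor's actual numbers:

```python
import math

def requests_needed(tracking_depth, results_per_page):
    """SERP requests required to see rankings down to `tracking_depth`."""
    return math.ceil(tracking_depth / results_per_page)

# One keyword, tracked to position 100:
before = requests_needed(100, 100)  # &num=100 era: 1 request
after = requests_needed(100, 10)    # 10-result pages: 10 requests

# The multiplier dominates at fleet scale. Suppose a tracker checks
# one million keywords daily (an assumed figure for illustration):
daily_before = 1_000_000 * before   # 1,000,000 requests/day
daily_after = 1_000_000 * after     # 10,000,000 requests/day
```

A tenfold jump in request volume translates directly into infrastructure and proxy costs, which is why smaller vendors cut tracking depth while larger ones absorbed or passed on the expense.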

The Search Console Mirage: Bot Impressions Vanish

Perhaps the most jarring symptom of this change appeared in Google Search Console. Starting around 12–13 September, site owners globally observed a sharp decline in impressions on desktop search, coupled with a perplexing increase in average position. If you only looked at your GSC, you might have thought your site’s organic performance had either tanked or miraculously soared overnight.

However, the consensus among analysts is clear: the sudden drop in impressions was largely a reporting cleanup, not an actual loss of human visibility. The theory is compelling: those high-volume, automated queries from SEO tools using &num=100 were likely registering as "impressions" in GSC, creating an inflated sense of visibility, particularly for pages ranking beyond the first 10. Once those bot-generated impressions disappeared, our GSC data reflected a truer, albeit smaller, picture of actual human engagement. The corresponding "spike" in average position is simply a mathematical consequence, as the average is now calculated from a dataset heavily skewed toward higher-ranked, real-user-driven impressions.
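The "mathematical consequence" is easy to see with a toy dataset. The numbers below are invented for illustration; the point is only how an impression-weighted average behaves when deep, bot-driven rows vanish:

```python
# Toy GSC-style rows for one query: (average position, impressions).
# Assume the deep rows were generated by &num=100 tool queries.
human_rows = [(3, 500), (8, 300)]      # real users rarely scroll past page 1
bot_rows = [(45, 2000), (78, 1500)]    # tool scrapes registered deep pages

def avg_position(rows):
    """Impression-weighted average position, as GSC computes it."""
    total_impressions = sum(imps for _, imps in rows)
    return sum(pos * imps for pos, imps in rows) / total_impressions

with_bots = avg_position(human_rows + bot_rows)  # ~49: dragged deep by bots
humans_only = avg_position(human_rows)           # ~4.9: page-1 reality
```

In this toy example, total impressions collapse from 4,300 to 800 while average position "improves" from roughly 49 to roughly 4.9: exactly the paired drop-and-spike pattern site owners saw in GSC.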

This means much of the historical impression data we've used to draw conclusions about user behavior, especially concerning the impact of Google's AI Overviews, may have been "polluted" by bot activity. Our benchmarks for success and our understanding of shifts in organic traffic may require significant recalibration.

The Intersecting Chaos: Google Ads and AI Overviews

Adding to this complex scenario, a separate, short-lived Google Ads bug on 17 September temporarily flooded SERPs with an overwhelming number of sponsored links, pushing organic results almost entirely off-screen. While quickly resolved and unrelated to the &num=100 parameter, it served as a stark reminder of the inherent volatility of the search ecosystem.

This volatility is further amplified by Google’s mid-2024 introduction of AI Overviews. These AI-generated summaries often provide direct answers within the SERP, reducing the need for users to click through to source websites. If our historical impression data was already inflated by bots, how accurately have we been measuring the true impact of AI Overviews on human click-through rates? The answer is: less accurately than we thought.

Ramifications for Businesses, Especially Cybersecurity

For general businesses, the immediate directive is "don't panic." The fundamentals of SEO haven't changed: quality content and user experience remain paramount. However, reporting structures and KPIs must adapt. Marketers need to cross-reference GSC data with independent analytics (like Google Analytics) to confirm actual clicks and conversions. Communicate these industry-wide shifts to stakeholders to prevent misinterpretation of reports.
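One way to operationalize that cross-check is to join daily GSC clicks against organic sessions from your independent analytics and flag days where the two sources diverge. A minimal sketch; the dates, counts, and the 25% tolerance are illustrative assumptions, not a standard:

```python
# Hypothetical daily exports from the two sources.
gsc_clicks = {"2025-09-10": 120, "2025-09-13": 115, "2025-09-14": 118}
analytics_sessions = {"2025-09-10": 130, "2025-09-13": 60, "2025-09-14": 125}

def divergent_days(clicks, sessions, tolerance=0.25):
    """Return days where the two sources disagree by more than `tolerance`."""
    flagged = []
    for day in sorted(clicks.keys() & sessions.keys()):
        c, s = clicks[day], sessions[day]
        if abs(c - s) / max(c, s) > tolerance:
            flagged.append(day)
    return flagged

problem_days = divergent_days(gsc_clicks, analytics_sessions)
```

Days where clicks and sessions roughly agree can be reported with confidence; days that diverge sharply warrant investigation before any conclusion about organic performance reaches stakeholders.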

For cybersecurity businesses, these shifts carry unique weight:

  • Increased Competition for Top Spots: Cybersecurity is a highly competitive space. With organic visibility now heavily concentrated in the top 10 results, and with AI Overviews potentially answering questions directly, gaining and maintaining those coveted top positions becomes even more critical. This could necessitate increased investment in content excellence and technical SEO to outrank competitors and ensure discoverability.
  • Rethinking Content Strategy: If AI Overviews are directly answering user questions, cybersecurity firms need to rethink their content strategy. Instead of just aiming for answers, content might need to provide deeper analysis, unique insights, or tools that compel a click-through even after an AI summary.
  • Data Reliability for Security Research: Many cybersecurity companies conduct their own research, often relying on aggregated search data for market analysis or threat intelligence. If the foundation of this data (impression counts, ranking patterns) is subject to sudden, opaque changes and historical bot-pollution, it could impact the accuracy of their own strategic insights and go-to-market decisions.
  • Vendor Risk Awareness: This event is a stark reminder of vendor dependency risk. Cybersecurity firms, like all businesses, rely heavily on third-party tools and platforms. A core functionality change by a dominant platform like Google underscores the need for vendor diversification, robust data validation processes, and contingency planning.

Moving Forward: Adaptability and Transparency

Google's unannounced changes underscore a critical reality: we operate in an ecosystem heavily influenced by a single, powerful entity that can alter foundational rules at will. The absence of an official explanation for the &num=100 parameter removal only adds to the industry's frustration.

As an industry, our immediate task is to adapt. We must establish new baselines, refine our reporting, and continue to focus on creating genuinely valuable content. For businesses, especially those in cybersecurity, this means a renewed focus on resilient digital strategies that don't solely rely on one data source or one platform's unchanging functionality. The "Great Google Data Disruption of 2025" is a powerful lesson in adaptability, forcing us all to scrutinize our data more critically and build strategies that are robust enough to withstand the next curveball.