
Google eliminates 100 results per page: What is the impact on SEO and users?
By: Skynet-Editor-2
Google removed the &num=100 parameter

In early September 2025, Google changed how it handles the &num=100 URL parameter in search queries. Previously, appending this parameter to a Google search URL returned up to 100 results on a single page; that capability has now been largely disabled or made inconsistent.
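
For illustration only, here is a minimal sketch of what the change means in request terms. It relies on the well-known q, num, and start Google search URL parameters and simply builds URLs; it makes no HTTP calls, since automated scraping of Google results may violate Google's terms of service:

```python
from urllib.parse import urlencode

BASE = "https://www.google.com/search"

def deep_serp_url(query: str) -> str:
    """URL that used to return up to 100 results in a single page."""
    return f"{BASE}?{urlencode({'q': query, 'num': 100})}"

def paginated_serp_urls(query: str, depth: int = 100, per_page: int = 10) -> list[str]:
    """URLs a tool now has to walk page by page to reach the same depth."""
    return [
        f"{BASE}?{urlencode({'q': query, 'start': offset})}"
        for offset in range(0, depth, per_page)
    ]

if __name__ == "__main__":
    print(deep_serp_url("site migration checklist"))        # 1 request (no longer honored)
    urls = paginated_serp_urls("site migration checklist")  # pagination instead
    print(len(urls), "requests now needed for the same depth")
```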

In tandem, various SEO tools, rank-tracking services, and third-party platforms that relied on fetching 100 results in one request are being significantly impacted.

Google implemented this change for several reasons, such as:

  • Preventing scraping and automated data collection

    The ability to fetch 100 results in one go made it easy for tools to collect large volumes of search result data quickly. Removing or disabling that parameter complicates automation and increases the cost (in requests, bandwidth, etc.).

  • Protecting infrastructure and managing load

    Fewer large-batch requests (or fewer results per request) help Google manage server load and bandwidth, and possibly counter abuse of its search results by scrapers. Although this is speculative, many in the SEO community see it as part of Google's tightening of access controls.

  • Shifting towards official APIs/paid access

    By making scraping less efficient, Google may be nudging more users and organizations toward official tools/APIs that are subject to quotas, usage terms, and possibly costs (a rough sketch of one such API follows this list).

  • User experience/consistency

    There is some sense that having fewer results per page helps maintain a cleaner, more predictable user experience. Loading many results can degrade performance or complicate how results are rendered for different devices.
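
As a concrete, hedged illustration of the official-API route mentioned above, the sketch below uses Google's Custom Search JSON API (Programmable Search Engine). Note its constraints: at most 10 results per call, results come from a configured search engine rather than the standard Google SERP, and daily quotas plus per-query pricing apply. The environment variable names are placeholders:

```python
import os
import requests  # pip install requests

# Custom Search JSON API endpoint; the API key and engine id (cx) are created
# in Google Cloud / Programmable Search Engine and stored here as env vars.
ENDPOINT = "https://www.googleapis.com/customsearch/v1"
API_KEY = os.environ["GOOGLE_API_KEY"]   # placeholder name, adjust to your setup
ENGINE_ID = os.environ["GOOGLE_CSE_ID"]  # placeholder name, adjust to your setup

def fetch_results(query: str, pages: int = 3) -> list[dict]:
    """Fetch up to pages * 10 results via the official API (max 10 per call)."""
    items = []
    for page in range(pages):
        resp = requests.get(ENDPOINT, params={
            "key": API_KEY,
            "cx": ENGINE_ID,
            "q": query,
            "num": 10,                # API maximum per request
            "start": page * 10 + 1,   # 1-based offset of the first result
        }, timeout=10)
        resp.raise_for_status()
        items.extend(resp.json().get("items", []))
    return items
```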

Effects of the removal of this parameter

Who is affected, and how:

  • SEO & rank-tracking tools: Big hit. Tools that used to retrieve 100 results in one request now need roughly ten paginated requests (about 10 results each) to reach the same depth. That means higher costs (especially for tools doing this at scale), more processing, and more bandwidth (see the request-count sketch after this list).

  • Webmasters/Site owners: Indirect effects, such as changes in reported metrics (e.g., in Google Search Console) and possible misalignment between what SEO tools report and what Google reports. Metrics like impressions or “average position” may look distorted, and tracking beyond page one or two becomes harder if tools lose access.

  • Content strategists & SEO professionals: Emphasis returns more strongly to securing top positions (top 10 or first page), because deeper results are harder to monitor. Verifying competitor positions deeper in the SERPs also requires more effort.

  • User search behavior: Less obviously impacted at first glance. Power users who liked loading more results per page may notice shorter result sets or more clicking through pages, and possibly a small change in how quickly results render. For most users, there is probably no major day-to-day change.

  • SEO tool providers: Costs may go up. Semrush, Keyword Insights, and others have already reported needing workarounds, adjusting their architecture, and possibly increasing their resource usage.
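
To put the “big hit” in numbers, here is a rough back-of-envelope sketch. The portfolio figures (keyword count, tracking frequency) are hypothetical assumptions, not vendor data; the point is the roughly tenfold multiplier:

```python
# Back-of-envelope request math for a rank tracker; all portfolio numbers
# below are hypothetical assumptions, not vendor figures.
KEYWORDS = 50_000          # keywords tracked
DEPTH = 100                # SERP depth monitored (top 100)
PER_PAGE_NOW = 10          # results per request after the change
CHECKS_PER_MONTH = 30      # daily tracking

requests_before = KEYWORDS * CHECKS_PER_MONTH * 1                       # one &num=100 call per check
requests_after = KEYWORDS * CHECKS_PER_MONTH * (DEPTH // PER_PAGE_NOW)  # ten paginated calls per check

print(f"before: {requests_before:,} requests/month")   # 1,500,000
print(f"after:  {requests_after:,} requests/month")    # 15,000,000
print(f"multiplier: {requests_after / requests_before:.0f}x")
```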

Consequences of recent changes in Google search results!

Here are some of the more nuanced or longer-term consequences:

  • Metrics distortion

    Search Console data and third-party SEO tool data may temporarily become “noisy” or misleading. For example, a site might see desktop impressions drop (because results beyond pages 1 or 2 are no longer being fetched by tools and counted), while its reported average position appears to improve, even if nothing changed on the site, simply because the low-ranking impressions that used to drag the average down are gone (a worked example follows this list).

  • Increased cost of deep SERP analysis

    If organizations want to monitor results well beyond the top 10 (say, the top 50), that now takes more requests or API calls, more processing, and more bandwidth, raising both financial and computational costs.

  • Focus on optimization for top rankings

    With deeper results less visible or more expensive to track, more attention will likely go to page-one placements such as featured snippets and other rich results. Some teams may deprioritize long-tail SERP tracking (below the top 20) because the return on investment is lower given the new costs.

  • Tool provider innovation/pricing pressure

    Some tools may introduce new pricing models, limit “deep SERP” features, or provide summary data rather than full data dumps. Users may see fewer “free” features or stricter limits for large requests.

  • Potential crawl/indexing side-effects

    Although the main change concerns how search results are fetched, displayed, and used by tools, the broader trend (removing URL parameter tools, simplifying breadcrumbs and URL displays, and changing how some structured data is handled) points to Google increasingly automating how it handles URL parameters and other metadata. Webmasters therefore need to be more careful with canonicalization, avoiding duplicate content, keeping a clean site structure, etc.
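
The following toy calculation shows why the metrics distortion described above happens; the impression counts are hypothetical and serve only to illustrate the arithmetic:

```python
# Hypothetical impression data for one query: (position, impressions) pairs.
# The deep rows (positions 45 and 87) stand in for impressions that were
# largely generated by tools requesting 100 results per page.
rows = [(3, 400), (8, 250), (45, 600), (87, 500)]

def totals(rows):
    impressions = sum(n for _, n in rows)
    avg_position = sum(pos * n for pos, n in rows) / impressions
    return impressions, avg_position

before = totals(rows)                            # deep impressions still reported
after = totals([r for r in rows if r[0] <= 20])  # deep impressions no longer reported

print("before: {0:,} impressions, avg position {1:.1f}".format(*before))  # 1,750 / 42.1
print("after:  {0:,} impressions, avg position {1:.1f}".format(*after))   # 650 / 4.9
```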

SEO best practices to accommodate new changes – Must know!

Given these changes, some strategies may need to be adjusted. Here are recommended practices going forward:

  • Prioritize page-one rankings

    Since tracking deeper positions becomes more expensive and less reliable, put more effort into content, on-page SEO, link building, etc., that moves important pages into the top 10.

  • Optimize for user experience and relevance

    It becomes even more important to satisfy user intent, provide strong metadata (title, description, snippet), and deliver good mobile performance, because “visible” SERP real estate becomes more competitive.

  • Check canonical tags and URL structure

    Ensure that pages accessed through parameterized URLs are canonicalized to the clean version, prevent duplicate content, and make sure the site does not generate unnecessary parameter-based URL variants (a small normalization sketch follows this list).

  • Monitor metrics closely

    Watch for changes in Search Console metrics such as impressions, click-through rates, and positions. Note the date the change rolled out so data anomalies can be interpreted properly, and use comparisons over time.

  • Use official APIs where possible

    If the work requires fetching SERP data at depth, using Google-approved APIs (if available) is often more reliable, even if more constrained. It may also offer better support or stability. Tools may evolve to partner directly with Google or use compliant methods.

  • Adjust tool subscriptions/budgets

    Be ready for SEO tools to possibly increase costs or change their service tiers. Evaluate whether deep SERP tracking is delivering sufficient value to justify its cost.

  • Content strategy focus

    Since deeper SERP real estate is harder to monitor, content that targets queries likely to rank on page one, and that can earn featured snippets, rich results, and similar placements, may yield better ROI.
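
As a small illustration of the canonicalization point above, the sketch below normalizes parameterized URLs to a clean canonical form and emits the corresponding link tag. The parameter whitelist and the example URL are hypothetical assumptions; a real site should derive the list from the parameters that genuinely change page content:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical whitelist: query parameters that actually change page content.
# Everything else (utm_* tags, session ids, view-style params) is dropped.
KEEP_PARAMS = {"page", "category"}

def canonical_url(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in KEEP_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

def canonical_link_tag(url: str) -> str:
    return f'<link rel="canonical" href="{canonical_url(url)}">'

print(canonical_link_tag(
    "https://www.example.com/shoes?category=running&utm_source=news&sessionid=abc"
))
# <link rel="canonical" href="https://www.example.com/shoes?category=running">
```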

Possible downsides and challenges

  • Transparency loss

    Less visibility into deeper SERP behavior might make it harder to spot emerging competition, new keyword opportunities, or content gaps that live beyond page two.

  • Barrier to entry increases

    Smaller SEO tools, individuals, or organizations with fewer resources may struggle more with cost and complexity; this could consolidate an advantage toward larger firms.

  • Discontinuity in historical data

    Comparisons over time (pre- vs. post-change) may be confusing or misleading if organizations are not careful. Metrics might show drops or shifts that are purely artifacts of the change, not actual performance shifts.

  • Risk of overlooking non-top 10 opportunities

    Some queries that rank beyond page one still generate useful traffic; if fewer people monitor them, those opportunities may be ignored.

Wrapping up

The removal of the &num=100 search results parameter is more than a technical tweak. It represents a shift in how Google allows access to its search data: deeper SERP analysis becomes more costly, tools become less efficient, metrics become more volatile, and greater emphasis lands on top rankings and genuinely valuable content.

For SEO professionals and website owners, the key will be adaptability: understanding that some “luxuries” of visibility are going away, doubling down on the basics (quality, relevance, canonicalization), and rethinking how much value deep SERP scraping delivers compared with what is visible near the top.

At Skynet Technologies, we keep track of every Google update that influences SEO and user experience. The removal of the 100 results per page feature is just one example of how search strategies must adapt quickly. If you’re looking to refine your SEO approach, strengthen visibility, and stay ahead of these shifts, our expert SEO services can help. Connect with us at [email protected] to build a search strategy that thrives in Google’s evolving landscape.