Search engineers and SEO analysts at Online dot Marketing have observed a recent change by Google: the partial disabling of the &num=100 parameter. This shift has consequences for rank‑tracking tools, Google Search Console impressions, and average position metrics in SEO reporting.
What is the &num=100 Parameter?
- The &num=100 parameter in Google’s SERP URL allowed users and tools to display 100 results per page.
- Previously, appending &num=100 forced Google to return 100 organic results in a single page request. This was helpful for crawling, scraping, or gathering many results quickly.
- The parameter was widely used by rank‑tracking tools and SEO analytics platforms to retrieve 100 results in a single HTTP request instead of making ten separate requests for 10 results each.
What Exactly Changed – &num=100 SERP Change
- Around September 10, 2025, reports emerged that using &num=100 no longer consistently returns 100 results per page. In many cases, only the first two pages (i.e. about 20 results) are available, or the parameter fails entirely.
- Tests showed that forcing 100 results sometimes works intermittently; in many other cases, the parameter is simply ignored. Google appears to have disabled, or to be testing the removal of, the &num=100 SERP parameter.
- It is unclear whether this is permanent or part of an experiment; so far, Google has not issued public confirmation.
Why Google Might Disable 100 Search Results Per Page
- To reduce malicious or excessive scraping. Serving 100 results on one page lets bots gather data more efficiently, and those bot loads can inflate impressions in certain analytics tools.
- To align with real user behavior: most users don’t navigate far into large result sets. Serving smaller pages may improve loading times and reduce server load.
- To produce more precise data in Search Console: removing inflated impressions (especially from bots) should yield more accurate metrics. The &num=100 SERP change appears to align with the observed desktop impressions drop and the rise in average position metrics.
Rank‑Tracking Tools Impact
- Many rank‑tracking tools relied on &num=100 to fetch large SERP data sets with fewer requests. With the parameter disabled or unreliable, tools may need to split a single query into multiple requests, which increases cost, latency, and complexity.
- Some platforms are reporting missing rankings, errors, or incomplete SERP snapshots; screenshots and daily sensors have shown gaps.
- Keyword Insights, for example, said: “Google has killed the n=100 SERP parameter … Instead of 1 request for 100 SERP results, it now takes 10 requests (10× the cost). This impacts Keyword Insights’ rankings module.”
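The “10× the cost” claim follows directly from a linear pricing model. The figures below (keyword count, check frequency, per-request price) are entirely hypothetical, but they show how the request multiplier flows straight into monthly spend:

```python
def monthly_request_cost(keywords: int, checks_per_day: int,
                         requests_per_check: int, cost_per_request: float) -> float:
    """Rough monthly SERP-API spend under a simple linear pricing model
    (hypothetical numbers, for illustration only; 30-day month assumed)."""
    return keywords * checks_per_day * 30 * requests_per_check * cost_per_request

# One &num=100 request per daily check vs. ten paginated requests:
before = monthly_request_cost(1000, 1, 1, 0.002)
after = monthly_request_cost(1000, 1, 10, 0.002)
```

Under this model the only variable that changed is `requests_per_check`, so spend scales by exactly the same factor as the request count.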
Observed Effects: Google Search Console Impressions Decline & Average Position Increase SEO
- Since roughly September 10, many SEO teams have reported a Google Search Console impressions decline for desktop traffic. Accounts that previously recorded high impression counts under the &num=100 parameter saw sharp drops.
- Alongside the falling impressions, an average position increase has been observed: average position metrics rose noticeably after the drop. This happens because when impressions fall (especially inflated ones), fewer low‑ranking impressions are counted, so the average position “improves” (i.e. numerically becomes a lower number).
- Some of what had been called “The Great Decoupling” (impressions rising without matching clicks) may have been driven in part by inflated impressions via the &num=100 parameter. With that parameter disabled, some of that decoupling may now be explained.
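The mechanics of the “improving” average position can be shown with a small impression-weighted average, which is how Search Console aggregates position. All figures below are hypothetical:

```python
def average_position(impressions_by_position: dict[int, int]) -> float:
    """Impression-weighted average position (simplified: one property,
    one query set), mirroring how Search Console aggregates position."""
    total = sum(impressions_by_position.values())
    weighted = sum(pos * count for pos, count in impressions_by_position.items())
    return weighted / total

# Hypothetical: 900 genuine impressions at position 3, plus 600 bot-driven
# impressions at position 80 that only existed because &num=100 rendered
# deep results in a single page load.
with_bots = average_position({3: 900, 80: 600})   # 33.8
without_bots = average_position({3: 900})         # 3.0
```

Nothing actually ranked better; removing the deep bot impressions alone moves the reported average from 33.8 to 3.0.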
Technical Details & Evidence
| Topic | Description |
|---|---|
| Timing | Change noticed around September 10, 2025. |
| Scope | Affects desktop SERPs most heavily, plus any tool that retrieves many results per page. |
| Behaviour | The &num=100 parameter is sometimes ignored; only two pages (≈ 20 results) are served; tools are splitting queries into smaller requests. |
| Theory | Bot‑driven loads of 100‑result pages may have inflated impressions and skewed average position metrics. |
Impact on SEO Strategy at Online dot Marketing
- Monitoring the Google Search Console impressions decline is essential. Compare week‑over‑week data after September 10 to establish a new baseline.
- When assessing keyword ranks, account for the rank‑tracking tools impact: missing or inconsistent data may result from the SERP parameter change rather than from true rank shifts.
- Metrics like average position may appear to improve (i.e. show a lower number) without any actual ranking gains, purely because fewer impressions at lower positions are being recorded. Interpret this with caution.
- Reporting: dashboards and stakeholder reports should note whether data was influenced by the &num=100 parameter change. Fluctuations may reflect these structural modifications rather than content changes or SEO work.
What Teams Should Do: Adaptation & Best Practices
- Verify data sources: Audit your rank‑tracking tools and confirm whether they still use &num=100 or have switched to paginated or alternative SERP fetching.
- Set new benchmarks: After September 10, measure your metrics against new baselines. Because impressions have declined, average position metrics may shift simply due to methodology.
- Investigate impression vs click trends: If impressions drop but clicks stay similar, the lost impressions were likely inflated via &num=100. If both drop, deeper technical or ranking issues may exist.
- Communicate with tool vendors: Check support updates from your vendors. Many are already adapting to the rank‑tracking tools impact caused by the &num=100 SERP change, and some may change how their systems count or fetch results.
- Use alternate data-gathering methods: Rely on API‑based data, ethical SERP scraping, or other validated sources if large-page retrieval becomes inefficient. Accept that cost and latency may increase.
- Report transparently: In internal and external reports, note that Google has changed its behavior around serving 100 search results per page, and that pre‑ and post‑change metrics may not be directly comparable.
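The impression-versus-click diagnostic above can be sketched as a simple heuristic. The 15% threshold and the category labels are hypothetical; a real analysis should use Search Console exports over comparable date windows:

```python
def classify_shift(impr_before: int, impr_after: int,
                   clicks_before: int, clicks_after: int,
                   threshold: float = 0.15) -> str:
    """Crude heuristic (illustrative only): a sharp impressions drop with
    steady clicks suggests the lost impressions were inflated (e.g. bot
    loads via &num=100); a drop in both warrants deeper investigation."""
    impr_drop = (impr_before - impr_after) / impr_before
    click_drop = (clicks_before - clicks_after) / clicks_before
    if impr_drop > threshold and click_drop <= threshold:
        return "likely inflated impressions"
    if impr_drop > threshold and click_drop > threshold:
        return "possible technical or ranking issue"
    return "no significant shift"
```

For example, 100,000 → 60,000 impressions with clicks holding near 2,000 classifies as inflated impressions, while the same impressions drop paired with clicks halving flags a deeper issue.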
Potential Further Consequences
- Lower server load and bandwidth usage for Google? Possibly.
- SEO tools may raise prices, since moving from a single request that returned 100 results to multiple requests for smaller batches costs more.
- Developers who rely on scraping large SERPs for content discovery may find this a more restrictive barrier.
Summary & Key Takeaways
- The &num=100 parameter has been partially disabled or deprecated; Google is enforcing, or experimenting with, changes that limit the ability to fetch 100 results per page, or at least make it unreliable.
- The change produces a rank‑tracking tools impact, a Google Search Console impressions decline, and an average position increase in SEO metrics, even without any actual ranking improvement.
- For workflows that relied on fetching 100 search results per page, the parameter is no longer a reliable lever; many tools must adapt.
- Strategy must shift: set new benchmarks, audit data, and adjust reporting.
What does the &num=100 parameter do in Google Search?
The &num=100 parameter instructs Google to display 100 search results on one page instead of the default 10.
Why did Google disable 100 search results per page?
Google likely reduced support to limit scraping, lower server load, and prevent inflated impression counts from bots.
How does the &num=100 SERP change affect rank-tracking tools?
Rank-tracking tools must now make 10 requests for 100 results, increasing costs, complexity, and potential gaps in data.
Why are impressions dropping in Google Search Console reports?
Impressions decline is tied to the &num=100 change. Fewer inflated impressions from lower-ranked results are now being recorded.
Why did average position metrics suddenly improve?
Average position looks better because fewer low-ranking impressions are being counted, not necessarily due to improved rankings.