SEO Is Not That Hard

Google Search Console data looking odd? Here's why...

Edd Dawson Season 1 Episode 316


A major shift has occurred in how Google Search Console reports your website's performance data, and it all stems from a quiet technical change few noticed. Google has removed the "num=100" parameter from search results—a seemingly minor adjustment with far-reaching consequences for SEO professionals and website owners alike.

This parameter allowed SEO tools to view 100 search results on a single page, gathering vast amounts of ranking data efficiently. Its removal has thrown rank tracking companies into disarray, forcing them to make ten times more requests to collect the same data, dramatically increasing their operational costs and processing times. For these specialized tools, the choice now becomes whether to scale up infrastructure, reduce the scope of tracking, or pass costs on to clients.

The most noticeable impact for website owners appears in Google Search Console reports. If you've logged in recently and noticed your impressions dropping while your average position improved, this parameter change explains why. Those deep-page impressions (positions 50-100) were largely generated by bots using the num=100 parameter, not real users. Without these bot impressions, your data now more accurately reflects actual human behavior. Sites with fewer page-one rankings are seeing the most dramatic changes, with impression drops of up to 70-80% in some cases, while sites with strong first-page presence experience less disruption.

This cleaner data ultimately pushes us to focus on what truly matters—getting high rankings in positions where users actually engage. While your actual traffic and real rankings remain unchanged, the visibility into lower-ranking terms that might represent opportunities has diminished. Understanding this shift helps interpret your performance metrics correctly and make better-informed SEO decisions going forward. Want to discover how to identify the most valuable keyword opportunities despite these changes? Try our specialized tools designed to cut through the noise and focus on what drives real traffic.

SEO Is Not That Hard is hosted by Edd Dawson and brought to you by KeywordsPeopleUse.com

Help feed the algorithm and leave a review at ratethispodcast.com/seo

You can get your free copy of my 101 Quick SEO Tips at: https://seotips.edddawson.com/101-quick-seo-tips

To get a personal no-obligation demo of how KeywordsPeopleUse could help you boost your SEO, and a 7 day FREE trial of our Standard Plan, book a demo with me now

See Edd's personal site at edddawson.com

Ask me a question and get on the show: click here to record a question

Find Edd on Linkedin, Bluesky & Twitter

Find KeywordsPeopleUse on Twitter @kwds_ppl_use

"Werq" Kevin MacLeod (incompetech.com)
Licensed under Creative Commons: By Attribution 4.0 License
http://creativecommons.org/licenses/by/4.0/

Speaker 1:

Hello and welcome to SEO Is Not That Hard. I'm your host, Edd Dawson, founder of the SEO intelligence platform KeywordsPeopleUse.com, where we help you discover the questions people ask online and learn how to optimise your content for traffic and authority. I've been in SEO and online marketing for over 20 years and I'm here to share the wealth of knowledge, hints and tips I've amassed over that time. Hello and welcome back to SEO Is Not That Hard, it's me here, Edd Dawson, as usual, and today I'm going to be talking about a change in Google that seems small but has actually had some quite big repercussions, so it's important to cover it. That change is that Google have removed the num equals parameter from the search results. Now, this might sound like a load of technical jargon, but it's actually quite simple, so let me explain it. What was the num parameter? You'll mostly see people talk about it as num equals 100.

Speaker 1:

For many years, when you performed a search on Google, the default was that you'd see 10 results per page, and if you wanted to see more, you could simply click through to the next page, and the next one after that. But there was a lesser-known trick: a parameter you could add to the end of the Google search URL, the num parameter, which you could set to any number up to 100. If you added num equals 100 to the end of the URL, Google would show you up to the first 100 organic results on a single page. Now, why would anyone want to do this? For the average Google user, it's not something they'd ever do, or even know to do; most people don't go past the first page, let alone scroll through 100 results. But for SEO professionals, researchers and anyone running automated tools to track keyword rankings, this was a really valuable little hack. It meant you could get a huge amount of data with a single request, which is efficient and fast, and so incredibly useful if you're trying to gather that kind of data. Imagine you were a librarian trying to catalogue the books in a library. If you could pull out a trolley with 100 books on it and work through them in one go, that would be much more efficient than fetching 10 books at a time and going back again and again until you'd got through 100. You get the same amount of work done with a tenth of the trips.
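To make that concrete, here's a minimal sketch of what the old trick looked like in practice. The query is just an example of my own; Google no longer honours the parameter, so this URL now returns only the usual first page of results.

```python
# Building the old num=100 search URL (illustrative only; the
# parameter is now ignored by Google, so this returns ~10 results).
from urllib.parse import urlencode

query = "seo is not that hard"        # example query (my own choice)
params = {"q": query, "num": 100}     # the now-retired parameter
url = f"https://www.google.com/search?{urlencode(params)}"
print(url)
# https://www.google.com/search?q=seo+is+not+that+hard&num=100
```

One request like this used to return up to 100 organic results; fetching the same depth now takes ten paginated requests.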

Speaker 1:

Now, around mid-September, probably somewhere around the 9th, 10th or 11th, it's hard to pin down exactly because I think it happened over a period of days, probably rolled out across different data centers, Google just quietly pulled the plug on the parameter. They didn't tell anyone; they just removed it and it stopped working. If you put num equals 100 on a search URL now, you still just get the first 10 results. The first impact was noticed by the companies that do rank tracking, and by their clients. These rank tracking systems track thousands, even millions, of keywords, and their data collection processes started failing, because they'd built their entire infrastructure around grabbing 100 results in a single request. Now, and it's still up in the air how these firms are going to cope with this, they can only get 10 results per request. They can obviously request the second page, the third page, the fourth page and so on to get the same amount of data, but it's going to take 10 times the requests, and that matters more than it might sound.

Speaker 1:

Making a web request might sound like a small thing to do, but scraping Google is actually quite difficult. It's something we do at Keywords People Use: we scrape People Also Ask, we scrape Autocomplete, we scrape a few different things, including the first 10 results when we do keyword clustering, and it's more complex than you'd think to do at scale.

Speaker 1:

If you were just doing the amount of browsing an average person does, accessing Google a few times a day through a normal browser and making a few requests, that's simple. But if you start trying to scrape any volume of keywords, and it doesn't take a huge volume before Google starts trying to block you, then you have to use residential proxies, CAPTCHA-solving software and other specialist services, and those charge by the request and by the bandwidth you use. So if all of a sudden you've got to make 10 times the number of requests, you're going to increase your costs by 10 times, and the operational time too. Every request takes time for a piece of software to make, so if you now have to make 10 times as many, you're going to have to scale up your infrastructure to match, and there's a cost behind that as well.
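To put rough numbers on it, here's a back-of-the-envelope sketch. Every figure in it is an assumption I've picked for illustration, not real pricing from any provider.

```python
# Back-of-the-envelope cost of losing num=100 (all figures assumed).
keywords = 1_000_000          # keywords tracked across all clients
depth = 100                   # ranking depth tracked per keyword
results_per_request = 10      # what a Google results page returns now
cost_per_request = 0.002      # $ per proxied request (hypothetical)

requests_before = keywords * 1                              # one num=100 call each
requests_after = keywords * (depth // results_per_request)  # ten paginated calls

print(f"Requests: {requests_before:,} -> {requests_after:,}")
print(f"Cost: ${requests_before * cost_per_request:,.0f} -> "
      f"${requests_after * cost_per_request:,.0f}")
# Requests: 1,000,000 -> 10,000,000
# Cost: $2,000 -> $20,000
```

Whatever the real per-request price, the multiplier is the point: the same data now costs ten times as much to collect, and takes ten times as long.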

Speaker 1:

So it's actually a really significant problem for the companies that were previously getting 100 results at a time. Fortunately it doesn't affect us here at Keywords People Use at this point, because we don't make those kinds of requests, but I can see how it's going to be a big issue for lots of these companies, and they've now got a choice to make. They either ramp up their infrastructure to cope with making 10 times the number of requests, or they downgrade their service and maybe only scrape pages one and two, just the first 20 results, or they go back to their clients and say, we can still do this for you, but it's going to cost you a lot more money. So there's a lot of flux in that industry right now, and it's going to be interesting to see what people actually end up doing and where they see the value in the data.

Speaker 1:

Now, most people aren't involved in rank tracking and don't have rank tracking software; they get this kind of data from Google Search Console. And this is where it gets quite interesting, because this change has had an effect on the number and type of results we're seeing in Google Search Console. So let's look at that. I'm sure many of you are familiar with the Search Console reports, specifically the performance report, and if you go and look at it, you may see that some of the numbers look a little different now than they did a week or two ago. What you'll probably have noticed before is that there was a huge number of impressions for keywords where your site was ranking on page 5, 8 or even 10, and you'd scratch your head and think, who's actually scrolling down that far?

Speaker 1:

Here's the crucial part. A significant portion of those deep-page impressions were very likely coming from bots and SEO tools using that num equals 100 parameter. Every time an SEO tool checked a keyword ranking for a client, it scanned those 100 results and registered an impression for every single result on the page, even if a human never laid eyes on it. Now that num equals 100 is gone, those automated deep-page scans have drastically reduced overnight, and that has had an effect on the data you'll see in Google Search Console. There are two major changes. First, there's a noticeable drop in overall impressions. This makes perfect sense, because all those bot-driven impressions for positions 11 to 100 are largely gone, so your Google Search Console data is now much more reflective of actual human behaviour. If people don't scroll to page five, then your site on page five isn't getting impressions from real users.

Speaker 1:

The new impression count is arguably a more accurate representation of your visibility to an actual human audience. Secondly, you'll see a sudden improvement in your average ranking position. This might seem counterintuitive at first: how can your average position improve if your impressions are down? It's because the average position was being skewed by all those very low-ranking, bot-generated impressions. When you remove the 50th, 60th, 70th position impressions from the calculation, the average of your remaining impressions, which are mostly from the first couple of pages, naturally shifts upwards.
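A quick worked example shows the mechanics. The impression counts below are hypothetical, but they mirror the pattern described: Search Console averages position across every impression, so a mass of deep bot impressions drags the average way down.

```python
# How removing bot-driven deep-page impressions shifts the average
# position reported in Search Console (all numbers hypothetical).
buckets = [
    # (position, impressions)
    (3, 200),      # page-one impressions, mostly real users
    (15, 300),     # page-two impressions
    (65, 4_500),   # deep-page impressions, mostly num=100 bots
]

def avg_position(rows):
    total = sum(imps for _, imps in rows)
    return sum(pos * imps for pos, imps in rows) / total

print(f"With bot impressions:    {avg_position(buckets):.1f}")      # ~59.5
print(f"Without bot impressions: {avg_position(buckets[:2]):.1f}")  # ~10.2
```

Nothing about the site's real rankings changed between those two numbers; only the mix of impressions being averaged did.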

Speaker 1:

So if you've logged into Google Search Console recently and thought, that's odd, my impressions are down but my average position looks better, this is very likely the reason. It isn't that your site suddenly shot up the rankings overnight, or that Google is all of a sudden sending you less traffic. It's that the data you're seeing is cleaner, more focused and more closely aligned with what actual human users are experiencing. And this is a big deal, because it means we have to re-evaluate how we interpret Search Console data. Those impressions on page 5 were always a bit of a vanity metric, something that looked good on a spreadsheet but rarely translated into actual clicks for businesses. So this data is now pushing us to focus even more intensely on what really matters, and that's getting high rankings in positions where users will actually engage.

Speaker 1:

So first of all, if you log in and see this big change, at least now you can understand what's happened: your visibility hasn't drastically reduced overnight. It's not a Google update in the sense of a dramatic change to where your actual rankings are or which keywords you're being surfaced for. There's just a bit less visibility into those lower regions, because the bots are no longer going down to that level. This may change again in the future: if the rank tracking companies do decide to make 10 times the requests, those impressions might start to come back. But for now, you're seeing what's a truer picture. And how big is this change?

Speaker 1:

I've seen it vary across the wide variety of sites I've got access to in Search Console, and I'm finding that lower-traffic sites are seeing a bigger difference in impressions. I've got one here that I'm looking at now. On Sunday September the 7th it was getting about 9,000 impressions, a click-through rate of 0.3% and an average position of 61.9. A week later, on Sunday September the 14th, its clicks were almost exactly the same, but its impressions had dropped from 9,000 to 1,971, so it had lost getting on for 80% of them, while its click-through rate jumped from 0.3% to 1.7%. That's a massive increase in click-through rate, but remember, no extra actual clicks; we're just seeing a more truly representative click-through rate for this site. And its average position had jumped to 18 from 61.9. So you can see there's a big swing there, and it's much more noticeable on a site like this.
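If you want to sanity-check that arithmetic, here's a tiny sketch using approximations of those figures. The click count is an assumption, since only the CTR and impressions were quoted, which is why the "after" CTR lands slightly below the 1.7% reported.

```python
# Same clicks, far fewer (bot-driven) impressions => higher CTR.
# Figures approximate the episode's example; clicks are assumed.
clicks = 30                    # roughly constant week over week
impressions_before = 9_000     # Sunday September the 7th
impressions_after = 1_971      # Sunday September the 14th

ctr_before = clicks / impressions_before * 100
ctr_after = clicks / impressions_after * 100

print(f"CTR before: {ctr_before:.1f}%")  # ~0.3%
print(f"CTR after:  {ctr_after:.1f}%")   # ~1.5%
```

The click-through rate looks several times better, yet not a single extra visitor arrived; the denominator just stopped counting bots.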

Speaker 1:

Now let's look at another one, a much more highly trafficked site with a lot more page-one results, over the same dates. On Sunday September the 7th they were getting 107,000 impressions, a click-through rate of 4.1% and an average position of 22. A week later, on the 14th, they were getting 79,000 impressions. So their impressions had dropped by about 30,000, from roughly 110,000 to 80,000; relatively that's not such a big drop, but it's still significant. They hadn't lost as many because they've obviously got a lot of higher-ranking results on pages one and two. Their click-through rate jumped from 4.1% to 5.4%, and their average position improved from 22 to 8.1. You'll see here that the more popular the site, and the more highly placed rankings it had to start with, the less dramatic the drop. But if you've got a site that's just up and coming in the rankings, with a lot of lower rankings and not so many on pages one and two, you'll see a much bigger difference, up to an 80% drop in impressions, whereas this site is seeing about a 30% one. But it is noticeable.

Speaker 1:

If you look through your Search Console you will notice this, so it's worth going and having a look to see where you are. Now, where it is going to cause problems is if, as we do, you look back through your query data for queries you're ranking for but not yet ranking highly for. You're going to find fewer opportunities in this data. The best opportunities, the ones where you're ranking lower down on pages one and two, still seem to show up, so you'll still find the quick, easy wins. But some of the trickier ones further down, where knowing where you rank matters, are going to take longer to work out, because that data is no longer naturally surfaced for you in Google Search Console without you having to do anything.
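If you're mining your own Search Console data for those remaining opportunities, a simple filter over an export is all it takes. Here's a sketch against the standard "Queries" CSV export from the performance report; the filename and thresholds are assumptions to adapt to your own data.

```python
# Find "striking distance" queries in a Search Console CSV export:
# terms on pages two to three with some demand behind them. Deeper
# terms are unlikely to appear at all now the bot impressions are gone.
import pandas as pd

df = pd.read_csv("Queries.csv")  # exported from the GSC performance report

opportunities = df[
    (df["Position"] >= 11)
    & (df["Position"] <= 30)
    & (df["Impressions"] >= 10)   # arbitrary demand threshold
].sort_values("Impressions", ascending=False)

print(opportunities[["Top queries", "Position", "Impressions"]].head(20))
```

Anything that would have shown up at position 50 or below is the data you've now largely lost, which is where dedicated rank tracking comes back in, as I'll get to next.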

Speaker 1:

So, conversely, it might mean you have to get involved in rank tracking: generate big keyword lists of things you think you might be ranking for, or want to rank for, and run them through a rank tracker that will track down to those lower page levels to get the data back. But that's completely up to you, and it's going to be interesting to see how the rank tracking companies respond, because they may decide to bite the bullet, take the hit, do the work and carry on going down to 100 results, or maybe down to 50 or so. If they do, we might see this bot-driven impression data start to creep back up. Now, you might say, I'm not missing this impression data, it doesn't help me. But it is interesting that the bots were effectively feeding data into your Search Console: every time one of them searched a keyword you were ranking for, it created an impression and put that query into your Search Console data, so you could get at it. So there is a bit of a loss here, even for those of us who weren't doing rank tracking.

Speaker 1:

But the key thing to remember is that, overall, this isn't going to affect how much actual traffic you get to your site; that should remain stable. So it's nothing to panic about when you see those impressions drop. This is just separating the noise from the actual data, so don't worry about it, but do be aware of what caused it; it's important to know. So that's it from me. If anyone's got any comments on this, any thoughts, then do get in touch; all the details are in the show notes. And until next time, keep optimising, stay curious, and remember: SEO is not that hard when you understand the basics. Thanks for listening, it means a lot to me.

Speaker 1:

This is where I get to remind you where you can connect with me and my SEO tools and services. You can find all the links I mention here in the show notes. Just remember, in all these places where I use my name, Edd is spelled with two Ds. You can find me on LinkedIn and Bluesky; just search for Edd Dawson on both.

Speaker 1:

You can record a voice question to get answered on the podcast; the link is in the show notes. You can try our SEO intelligence platform, Keywords People Use, at KeywordsPeopleUse.com, where we can help you discover the questions and keywords people are asking online, group those questions and keywords into related topics so you know what content you need to build topical authority, and, finally, connect your Google Search Console account for your sites so we can crawl and understand your actual content, find what keywords you rank for, and then help you optimise and continually refine your content with targeted, personalised advice to keep your traffic growing. If you're interested in learning more about me personally, or you're looking for dedicated consulting advice, then visit edddawson.com. Bye for now, and see you in the next episode of SEO Is Not That Hard.
