This is how easily months and months of hard SEO graft can be taken away by one mistake:

Major drop in keyword rankings – Semrush
When reviewing rankings for one of my clients recently, I noticed that they had started to tank over the weekend. F@*k.
Digging a little deeper showed that it was some of our highest priority, highest search volume keywords that were dropping, most of which were keywords the home page ranked for. Double F@*k.
But the home page didn’t just stop ranking for the high priority keywords, it stopped ranking for any keywords at all because it had been removed from Google’s index!
What to check when your web page drops out of Google’s index
I started doing some digging.
Step 1: View source and check meta robots. I manually viewed the source code of these pages and couldn’t find a noindex tag. To do this yourself, right-click on the web page you want to check and click ‘View Source’:

When viewing the source code, hit CTRL + F and search for ‘Robots’. If the meta robots tag says ‘noindex’, you’ve found your issue and that needs to be changed ASAP:

If that doesn’t solve it and your meta robots is set to ‘index’, then move on to step 2.
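If you’d rather script this check than eyeball the source by hand, here’s a minimal Python sketch of the same raw-HTML check. It’s only a sketch: it assumes the requests library is installed, and the URL is a placeholder.

```python
# Minimal sketch: fetch the raw HTML (before any JavaScript runs) and
# look for a meta robots noindex. Assumes `requests` is installed;
# the URL is a placeholder.
import re

import requests

def check_meta_robots(url: str) -> None:
    html = requests.get(url, timeout=10).text

    # Find any <meta name="robots" ...> tags in the raw source
    tags = re.findall(r'<meta[^>]*name=["\']robots["\'][^>]*>', html, re.IGNORECASE)

    if not tags:
        print("No meta robots tag found (pages are indexable by default).")
    for tag in tags:
        if "noindex" in tag.lower():
            print(f"Found a noindex directive: {tag}")
        else:
            print(f"Meta robots tag looks fine: {tag}")

check_meta_robots("https://www.example.com/")
```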
Step 2: Crawl the site with Screaming Frog. Here you can check the ‘indexability’ and ‘indexability status’ columns, which should tell you why a page is non-indexable. This might say there is a ‘noindex’ directive, or that the page is canonicalised elsewhere.
In my example, I crawled the site and the results came back that it was indexable, so I was a bit stumped.
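Screaming Frog does this in its GUI, but a couple of the same signals are easy to check in a quick script too: a ‘noindex’ can arrive via the X-Robots-Tag HTTP header rather than a meta tag, and a canonical pointing at a different URL is what shows up as ‘canonicalised’. A rough sketch, with the same assumptions as above (requests installed, placeholder URL, and a naive regex where a real crawler would parse the DOM properly):

```python
# Rough sketch of two of the signals a crawler checks. Assumes `requests`
# is installed; the URL is a placeholder.
import re

import requests

url = "https://www.example.com/"
resp = requests.get(url, timeout=10)

# 1. noindex can also be sent as an HTTP response header
x_robots = resp.headers.get("X-Robots-Tag", "")
if "noindex" in x_robots.lower():
    print(f"X-Robots-Tag header blocks indexing: {x_robots}")

# 2. a canonical pointing at a different URL means the page is
#    canonicalised elsewhere (naive regex: assumes rel comes before href)
match = re.search(
    r'<link[^>]*rel=["\']canonical["\'][^>]*href=["\']([^"\']+)',
    resp.text,
    re.IGNORECASE,
)
if match and match.group(1).rstrip("/") != url.rstrip("/"):
    print(f"Page canonicalises to: {match.group(1)}")
```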
Step 3: Inspect the URL in Search Console. At the top of the page in Search Console, you can paste in a URL and hit enter. This will give you details about that page’s indexability status according to Google. Test the live URL to be sure, and then hit ‘Request indexing’.
In my example, I inspected a few pages in Search Console and then tried to Request Indexing but I could not complete the request because Search Console was reporting a ‘noindex’ tag on my client’s pages.
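Worth knowing: the URL Inspection tool is also exposed programmatically via the Search Console URL Inspection API, which is handy if you want to batch-check pages. A hedged sketch, assuming google-api-python-client and google-auth are installed, a service account JSON with access to the property, and placeholder URLs; note that, as far as I know, the API returns Google’s indexed view rather than a live test, so the live check stays in the GUI:

```python
# Sketch of the same check via Google's URL Inspection API. Assumes
# google-api-python-client and google-auth are installed and the service
# account has access to the Search Console property. File name, property
# and URL are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

result = service.urlInspection().index().inspect(body={
    "siteUrl": "https://www.example.com/",
    "inspectionUrl": "https://www.example.com/",
}).execute()

status = result["inspectionResult"]["indexStatusResult"]
# indexingState comes back as e.g. BLOCKED_BY_META_TAG when Google
# sees a noindex on the page
print(status.get("coverageState"), status.get("indexingState"))
```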
So, what the hell was going on?
Step 4: Crawl the site using Screaming Frog with JavaScript rendering enabled. Sometimes, JavaScript is a b*tch and can cause hidden problems, so set your Screaming Frog crawler up to render JavaScript so that it sees the final version of the site that Google and users will see.
Go to Configuration > Crawl Configuration > Rendering > JavaScript and then crawl your site.

In my example, I recrawled the site with JavaScript rendering enabled and, bingo, a noindex tag was picked up in the JavaScript: the ‘indexability status’ column now showed the noindex directive, which it hadn’t shown when I crawled without JavaScript rendering enabled.
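If you don’t have Screaming Frog to hand, you can reproduce the rendered-DOM check with a headless browser. A minimal sketch using Playwright (assuming `pip install playwright` plus `playwright install chromium`; the URL is a placeholder):

```python
# Minimal sketch of the rendered-DOM check using Playwright. Assumes
# `playwright` is installed and Chromium has been downloaded via
# `playwright install chromium`. The URL is a placeholder.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto("https://www.example.com/", wait_until="networkidle")

    # Read meta robots out of the *rendered* DOM, i.e. after any
    # JavaScript has had the chance to inject or rewrite the tag
    contents = page.eval_on_selector_all(
        'meta[name="robots" i]',
        "els => els.map(el => el.content)",
    )
    browser.close()

print(contents)  # e.g. ['noindex, nofollow'] if a script injected the tag
```

If ‘noindex’ shows up here but not in the raw source from step 1, you’ve got the same JavaScript-injected problem I had.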
I took a look in the CMS and found a new piece of JavaScript code that had been added without my knowledge. I tested removing it and the site was indexable again. Phew.
So, now the site is fully indexable again, but we have to wait and see how long it takes for Google to reindex it and whether or not we regain our former ranking positions.
Majorly frustrating for me, but at least it gave me my LinkedIn content for the day 😝
So, if you ever find yourself in a similar position, make sure to test your website’s indexability status with JavaScript rendering enabled, because that might just be the issue!
But most importantly, make sure your SEO Consultant is aware of any changes being made to the site, as prevention is always better than cure 🙂