Few things can tank your blog traffic faster than a broken sitemap file or a mistake in your robots.txt file that prevents search engine crawl bots from properly crawling your site's pages and posts. I recently had this problem with my WordPress blog and only found out about it when I used an SEO position checker to see where a blog post ranked for some keywords, only to discover that the post wasn't indexed by Google at all. It didn't even show up on the Google search results page.
I found out after the fact that a minor change to my YOAST SEO Plugin had caused thousands of URIs to be yanked out of my XML sitemap, which in turn caused thousands of URLs to be de-indexed from Google.
I have over 18,000 indexed URIs on my site, which has been in operation for over 11 years, often with multiple posts per day. Getting it all indexed carries quite a lot of overhead, and SEO-sculpting individual posts consumes a lot of time, since each one involves extensive SEO competitor research before publication. So when I do all of that work and then find out that my post isn't indexed at all, that is a problem, and a very big one. If your page isn't indexed, or is marked noindex, then no matter how much SEO you do for a post, it isn't even being considered by the Google search engine for listing on the search engine results page.
So what happened?
In the YOAST SEO Plugin you have the ability to decide whether to include not only posts and pages, but also categories and tags, in your sitemap. This can greatly inflate the number of URIs listed in your sitemap file, but it also increases how many indexed pages you have listed on the Google search results page. In some cases, when using an SEO position checker, I found that a tag page could rank higher for the keywords I was targeting with a post, and pull in more traffic for those keywords, than the post itself. I can't fully explain this anomaly, but I found out the hard way how much it mattered when I accidentally caused a noindex on over 1,200 tag pages and saw my site traffic tank over the next few days.
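For reference, when a taxonomy like tags is switched off for search results, the noindex typically shows up as a robots meta tag in the tag page's HTML head. Viewing the page source is a quick way to spot it. The snippet below is example markup, not copied from my site:

```html
<!-- An accidental noindex on a tag page can look like this in the page source -->
<meta name="robots" content="noindex, follow" />
```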
Even after I fixed the issue, and Google said it would take 30 days to re-index everything on its own, nothing has yet fully re-indexed. I have had to manually submit many of my posts for re-indexing, and this is done on a post-by-post basis. Webmaster Tools is an invaluable resource for finding issues with indexing and posts: using the URL Inspection tool, you can find out whether there are indexing errors with your site and pages, and even request indexing of a page, which usually takes about 1-2 days to be picked up and indexed again.
A noindex accidentally triggered, whether by a manual change to robots.txt or by a setting in the YOAST SEO plugin, can be a pain to undo, and you could see unexpected losses in traffic if you aren't really careful with your decisions about what to index and what not to index in your sitemaps. Just remember to regenerate your sitemaps and immediately resubmit them to Google Webmaster Tools for re-indexing, and be prepared to manually request indexing for each post if you don't want to be patient and wait until all the organic crawling and indexing finishes.
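After regenerating a sitemap, it's worth confirming that the posts you care about are actually listed in it again before you resubmit. A minimal sketch in Python of that check (the sitemap content and URLs here are hypothetical; in practice you would fetch your real sitemap file, for example with urllib, instead of using an inline string):

```python
# Sketch: parse a sitemap XML document and check which URLs it lists,
# so you can confirm a post made it back in after regenerating.
import xml.etree.ElementTree as ET

# Sitemap <url>/<loc> elements live in this XML namespace.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Hypothetical sitemap content; replace with your real fetched sitemap.
sample_sitemap = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://www.example.com/my-post/</loc></url>
  <url><loc>https://www.example.com/tag/seo/</loc></url>
</urlset>"""

def sitemap_urls(xml_text):
    """Return the set of <loc> URLs listed in a sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")}

urls = sitemap_urls(sample_sitemap)
print("https://www.example.com/my-post/" in urls)   # listed in the sitemap
print("https://www.example.com/old-post/" in urls)  # missing from the sitemap
```

If a post you expect is missing from the set, fix the sitemap settings first; resubmitting a sitemap that still omits the URL won't get it re-indexed.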
To find out whether a post is indexed, you can always do a manual search on Google by typing “site:www.(yoursitename).com [/uri-to-check]” and see if the exact page or post comes back in the search results. If it doesn't, it is most likely not indexed, and you should log into Webmaster Tools and do a URL inspection to confirm: look for errors and noindex flags, and if the page doesn't come up at all, it is missing from your sitemap altogether.
Webmaster Tools is invaluable not only for helping you get your sitemap listed and indexed; its keyword analysis is also ideal for assisting with your SEO competitor research and deciding which short- or long-tail keywords you want to target and rank for. Just remember to tread carefully when making changes to your sitemap and robots.txt, and be quick to check for any errors so you can correct them.