As a digital marketing consultant, I'm well versed in SEO. In fact, SEO is part of the digital marketing services I offer clients.
Every SEO consultant knows how important it is to have a search-engine-friendly site. It helps clients' businesses rank higher, which in turn exposes them to more potential customers. To achieve this, SEOs have a variety of tools at their disposal, including XML sitemaps.
Judging by the number of misconceptions out there, people are surprisingly confused about XML sitemaps. Few take the time to learn how to use this tool properly, even though it's one of the most powerful SEO tools around. As a result, we get XML sitemaps stuffed with all kinds of URLs, some of which shouldn't be there at all. These poorly designed sitemaps are then submitted to Google, further compounding the problem and resulting in poorly ranked sites.
To help fix this, I decided to clear up a few things about XML sitemaps:
They don’t help to get your pages indexed.
A lot of people think that XML sitemaps play a huge role in getting their pages indexed. The truth is that Google indexes pages for two reasons: one, it found and crawled them; and two, after crawling them, it deemed the content good enough to index. That's it. A sitemap simply tells Google which pages you think are important; it doesn't force anything into the index.
Consistency is crucial.
One of the mistakes I've seen my clients make is being inconsistent with the message they send Google about certain pages. They use meta robots and robots.txt incorrectly, thereby sending Google mixed signals. For instance, one client had included some pages in the XML sitemap only to block them in robots.txt, ensuring they couldn't even be crawled, let alone indexed. Another decided to include certain pages in the XML sitemap only to set meta robots to "noindex,follow", which likewise kept those pages out of the index.
For best results, be consistent. If you've blocked certain pages in robots.txt or with meta robots, they should be nowhere in the XML sitemap.
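If you want to sanity-check this on your own site, here's a rough Python sketch, assuming a hypothetical site at example.com with its sitemap at /sitemap.xml, that flags URLs listed in the sitemap but blocked by robots.txt:

```python
# Minimal consistency check: flag sitemap URLs that robots.txt blocks.
# The example.com domain and /sitemap.xml path are assumptions; swap in your own.
import urllib.request
import urllib.robotparser
import xml.etree.ElementTree as ET

SITE = "https://example.com"  # hypothetical domain

# Load and parse robots.txt
rp = urllib.robotparser.RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()

# Load the XML sitemap and pull out every <loc> URL
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
with urllib.request.urlopen(f"{SITE}/sitemap.xml") as resp:
    tree = ET.parse(resp)
urls = [loc.text for loc in tree.findall(".//sm:loc", ns)]

# Report any URL that is in the sitemap but blocked for Googlebot
for url in urls:
    if not rp.can_fetch("Googlebot", url):
        print(f"Mixed message: {url} is in the sitemap but blocked by robots.txt")
```

The same idea extends to meta robots: fetch each sitemap URL and warn if its page carries a "noindex" tag.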
Keep an eye on overall site quality.
Additionally, if you have a lot of utility pages (cart, login and account pages, for example) mixed in with the rest of your content, it's a good idea to use either meta robots or robots.txt to keep them out of the index. This ensures that Google only sees and indexes your relevant, high-quality landing pages and doesn't bother with the utility pages.
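To illustrate (the /cart/ and /login paths below are hypothetical examples, not rules you should copy as-is), either approach looks something like this:

```
# robots.txt -- keep crawlers out of utility sections entirely
User-agent: *
Disallow: /cart/
Disallow: /login

# Or, in the <head> of an individual utility page, allow crawling
# but keep the page out of the index:
#   <meta name="robots" content="noindex,follow">
```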
Clean things up.
Just because some pages aren't in the XML sitemap doesn't mean that Google will ignore them. To catch strays, run a site: search (for example, site:yourdomain.com) to see exactly what Google is indexing, then de-index the pages you don't want showing up in search results.
It helps to use dynamic XML sitemaps for big sites.
If you've got a big site, say over 100,000 pages, it's impractical to manually keep all of those pages in sync between meta robots, robots.txt and your XML sitemaps. The solution is to set up rule-based logic that determines whether a page gets included in the XML sitemap or not, and then let the sitemap generate itself dynamically from those rules.
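As a simplified sketch of what that rule logic might look like in Python (the Page fields and the 300-word threshold are illustrative assumptions, not a standard), you keep the inclusion decision in one place and generate the sitemap from it:

```python
# Rule-driven sitemap generation: include_in_sitemap() decides inclusion
# in one place, so meta robots and the sitemap can't drift apart.
from dataclasses import dataclass
from xml.etree.ElementTree import Element, SubElement, tostring

@dataclass
class Page:
    url: str
    is_utility: bool   # cart, login, account pages, etc.
    is_noindex: bool   # pages already tagged "noindex" in meta robots
    word_count: int    # crude proxy for thin content

def include_in_sitemap(page: Page) -> bool:
    """Only indexable, non-utility, reasonably substantial pages get in."""
    return not page.is_utility and not page.is_noindex and page.word_count >= 300

def build_sitemap(pages: list[Page]) -> bytes:
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in pages:
        if include_in_sitemap(page):
            url_el = SubElement(urlset, "url")
            SubElement(url_el, "loc").text = page.url
    return tostring(urlset, encoding="utf-8", xml_declaration=True)

# Example: only the landing page makes it into the generated sitemap
pages = [
    Page("https://example.com/services/seo", is_utility=False, is_noindex=False, word_count=1200),
    Page("https://example.com/cart", is_utility=True, is_noindex=True, word_count=40),
]
print(build_sitemap(pages).decode("utf-8"))
```

Run something like this on a schedule (or on every publish) and the sitemap regenerates itself as pages are added, updated or retired.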
If you need help with setting up your XML sitemap or with digital marketing, get in touch with me today.