Gain SEO Control Back with 7 Sitemap Tools to Use Before a Magazine Website Redesign

How to get out from under Google’s thumb and diagnose and track indexing problems with these seven sitemap tools for magazine websites

Do you ever feel like you’re in a controlling relationship with Google? They sort of micromanage all of your publishing activities without actually telling you what they want, am I right? If Google were your boyfriend, your friends would all tell you to kick him to the curb.

But publishers can’t cut the apron strings with Google; we’re in a polyamorous relationship with them and the rest of the internet while they plot behind the Oz curtain about our futures.

Fortunately, there are ways to control how Google reads your website with sitemap tools. While we assume most of our publishing friends are creating kosher content and abiding by the rules, if Google has considered you a black sheep in the past, you can reinvent yourself by cleaning up your sitemap. If you’ve been good, you can always improve how well your website is indexed, and often the updates are pretty easy!

If you’re considering a magazine website redesign, the first thing you’ll want to do is see where your current website is deficient. From there, you’ll want to fix those deficiencies and re-deliver Google an updated, cleaner sitemap. You have a lot of content to index, and it deserves to be indexed! Below is how you can do all of the above and more via some free and paid sitemap tools.


Do you have a sitemap?

If you’re already sitemap savvy, skip on to the next step.

If you’re not sure, the precursor to fixing your sitemap (the main guide Google uses to index and analyze your website) is to make sure you have one. Use SEOSiteCheckup’s tool to analyze your website and find out.
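
If you’ve never looked inside one, a sitemap is just an XML file listing the URLs you want indexed, following the sitemaps.org protocol. A minimal example (the example.com URLs are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/article-one/</loc>
        <lastmod>2017-10-01</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/article-two/</loc>
        <lastmod>2017-09-15</lastmod>
      </url>
    </urlset>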

If you do have a sitemap, instead of assuming it’s been submitted to Google, I’ll just put it out there that your sitemap should have, at some point, been submitted to the major search engines. Yoast provides instructions for submitting your sitemap to each of them.
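
One engine-agnostic way to make your sitemap discoverable, beyond each engine’s console, is to reference it from your robots.txt file; the Sitemap directive is part of the sitemaps.org protocol, and the major engines all honor it. For example (the URL is a placeholder):

    User-agent: *
    Allow: /

    Sitemap: https://www.example.com/sitemap_index.xml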

If you don’t have one, consider Yoast, a tool that most of our clients use. It’s a WordPress plugin that automatically creates your sitemap and updates it regularly as you update and publish new content. It does not, however, automatically resubmit your sitemap, because that only needs to occur on rare occasions.

The main reason you’d need to resubmit is if you’ve found and fixed a major error across a large batch of articles, or, for example, re-categorized hundreds or thousands of them. Any time you make major site structure changes, including URL syntax changes, it’s wise to resubmit.
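
Resubmitting doesn’t have to be a manual chore, either. Google currently documents a simple “ping” endpoint that takes your sitemap URL as a query parameter; here’s a minimal sketch in Python, assuming the requests library (the sitemap URL is a placeholder):

    import urllib.parse

    import requests

    # Placeholder: swap in your own sitemap URL.
    SITEMAP_URL = "https://www.example.com/sitemap_index.xml"

    # Google's ping endpoint queues the sitemap for re-crawling.
    ping_url = ("https://www.google.com/ping?sitemap="
                + urllib.parse.quote(SITEMAP_URL, safe=""))
    response = requests.get(ping_url, timeout=10)
    print(response.status_code)  # 200 means the ping was received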

While Yoast is helpful and convenient because it’s built into your website, there’s a lot more to controlling how Google sees your website through sitemaps. So it’s helpful for publishers to take an interest in third-party sitemap tools, so that you can not only build and submit your sitemap, but also optimize it. With this power, you can diagnose and fix the issues Google is holding over your head.

Step 1: Establish your sitemap baseline

The first step of the process is to claim your site in Google Search Console, which requires dropping a piece of verification code on your website. This is easier if you already have Google Tag Manager set up.
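
For reference, one common verification method is a single meta tag that Search Console issues you, dropped into your site’s <head> (the token below is a placeholder):

    <head>
      <!-- Token issued by Google Search Console; placeholder value. -->
      <meta name="google-site-verification" content="YOUR-VERIFICATION-TOKEN" />
    </head>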

Google Search Console can tell you how many pages you’ve submitted through your sitemap, compared to how many of those pages have been indexed by Google. There will always be a discrepancy between the numbers because of pages behind paywalls, and because Google simply never indexes 100% of a website. However, it becomes a great diagnostic tool if you have 10,000 pages and only 5% of those pages are being shown as indexed. That’s where you may decide to dig deeper to fix the deficiency. From Google Search Console, you can learn the following about your sitemap (a programmatic way to pull the same numbers is sketched after this list):

  • How many pages are submitted through your sitemap, and how many are indexed.
  • How many images are submitted through the sitemap, and how many are indexed (you can increase this number by adding metadata to your images).
  • Your index status over time, so you can watch for dramatic spikes and drops. Normally, it should stay about even, rising slowly over time.
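
If you’d rather pull those submitted-versus-indexed numbers programmatically, the Search Console (Webmasters v3) API exposes them per sitemap. Here’s a minimal sketch in Python, assuming the google-api-python-client library and a service-account key that has been granted access to your verified property (the key file name and URLs are placeholders):

    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
    creds = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES)  # placeholder key file

    service = build("webmasters", "v3", credentials=creds)
    sitemap = service.sitemaps().get(
        siteUrl="https://www.example.com/",
        feedpath="https://www.example.com/sitemap.xml").execute()

    # Each "contents" entry reports submitted vs. indexed counts by type
    # (web pages, images, etc.), the same numbers the dashboard shows.
    for entry in sitemap.get("contents", []):
        print(entry["type"],
              "submitted:", entry["submitted"],
              "indexed:", entry.get("indexed", "n/a"))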

Google Search Console has a lot of great SEO uses, but this is how it specifically helps with sitemap diagnosis. From here, you’ll want to diagnose and fix the issues if you see indexing on the decline, which can be done through several other third-party sitemap tools, coming up next. Even if it’s not on the decline, it’s worth auditing your sitemap if you’ve never done so before, to make sure you couldn’t be indexed much more thoroughly.

Step 2: Obtain your crawl data and diagnose issues

There are three sitemap tools we have reviewed personally and can recommend.

These sitemap tools offer similar data, but processed and delivered differently; it’s all about preference. The sitemap tools below can help you identify duplicate content, find broken links, analyze metadata, audit your redirects, and generate much more tactical data that will improve your sitemap in the eyes of Google.
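
For a sense of what these crawlers do under the hood, here’s a toy broken-link check for a single page, sketched in Python with the requests and BeautifulSoup libraries; the commercial tools do this across your entire site, at scale and with far more polish (the start URL is a placeholder):

    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup

    START_URL = "https://www.example.com/"  # placeholder: one of your own pages

    html = requests.get(START_URL, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Collect every link on the page and flag any that error out.
    for a in soup.find_all("a", href=True):
        url = urljoin(START_URL, a["href"])
        if not url.startswith("http"):
            continue  # skip mailto:, tel:, javascript: links
        try:
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            print(status or "unreachable", url)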

Screaming Frog SEO Spider is the least expensive option of the three at about $200 per year (£149.00). It lets you run crawls on your website whenever you want, then gives you feedback on the crawl to diagnose issues that might be affecting indexing. Screaming Frog runs on your computer, so it’s not as fast as cloud-based programs, and larger publishing sites with 100,000+ URLs could benefit from a cloud-based tool.

DeepCrawl is cloud-based and starts at $72 per month for 100,000 URLs, but can handle up to 4 million. We’ve found its data easier to understand because it offers visual reports with a comprehensive dashboard. Also, you can schedule regular crawls, and it won’t run down your computer because it’s all done from the cloud.

Sitebulb, which is currently free because it’s in public open beta for at least the next few weeks, is a competitor to Screaming Frog in that it’s downloadable software rather than cloud-based. However, it’s something of a hybrid and can handle a higher volume of URLs, up to 500,000 pages with ease.

DeepCrawl and Sitebulb, per our review, have more user-friendly reporting on the fixable items on your website. Screaming Frog’s reports are just as comprehensive, but they lack the visual dashboard that can be handy, especially if you’re presenting this information to a team.

Step 3: Analyze your crawl data

Finally, after you have used one of the sitemap tools above to produce crawl data and errors, you can use a third type of tool that will give you an in-depth analysis of each URL on your website. This will be incredibly insightful, but it requires that you complete steps 1 and 2 above first.

URLProfiler.com offers a free trial, or costs about $25 per month for up to 1,000,000 URLs per import. This tool uses your sitemap and crawl exports to analyze every URL on your website. It will pull social share data and link metrics. It will do page speed checks and mobile-friendly tag checks. For each URL on your site, it will count how many external links point to it and report that number back to you. That’s valuable, because if a page is linked heavily, it’s also likely ranking high. But more importantly, it’s useful if you’ve ever participated in nefarious link building with a black-hat SEO company in the past and need to clean up those links.

Additionally, the tool will analyze the content at each URL, telling you how many words are on the page and running readability and sentiment analysis for each page. It’s also a great tool for competitor intelligence, because you can analyze any website and get a full inventory showing what their most valuable articles are. With the use of proxies, you can also check the index status of pages, and the tool will pull in Google Analytics data for each URL.
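
To make those content checks concrete, here’s a rough sketch in Python that computes a word count and a Flesch reading-ease score for a single page, using the third-party textstat library; this isn’t how URL Profiler works internally, just the same idea in miniature (the URL is a placeholder):

    import requests
    import textstat
    from bs4 import BeautifulSoup

    URL = "https://www.example.com/some-article/"  # placeholder

    html = requests.get(URL, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Strip scripts and styles so only the visible copy is measured.
    for tag in soup(["script", "style"]):
        tag.decompose()
    text = " ".join(soup.get_text().split())

    print("word count:", len(text.split()))
    print("Flesch reading ease:", textstat.flesch_reading_ease(text))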

Ultimately, if you want to know why a page is or isn’t performing, you can check basic items like page length, metadata, and readability with a speed you didn’t have before. You can use this data to improve the pages and get more of them indexed from your sitemap.

One note on timing: Google Search Console updates about once per week, so after you’ve used these sitemap tools to diagnose and correct issues, give it seven days to show the difference.

Are you about to start a redesign for your magazine or newsletter website and looking for some guidance? If your publishing business is on the verge of a major transition and you’re looking for leadership on that journey, please schedule a time to speak with us. We have considerable experience with restructuring websites and publishing teams to accommodate digital transformations, and would like to help you with this significant change.

By Don Nicholas

Founder & Executive Publisher

Don Nicholas serves as Executive Publisher for Food Gardening Network and GreenPrints. He is responsible for all creative, technical, and financial aspects of these multiplatform brands. As senior member of the editorial team, he provides structural guidance, sets standards, and coordinates activities with the technology and business teams. Don is an active gardener whose favorite crops include tomatoes, basil, blueberries, and corn. He and his wife Gail live and work in southern Massachusetts surrounded by forests, family farms, cranberry bogs, and nearby beaches. Don is also the Founder of Mequoda Systems, LLC, which operates and supports numerous online communities including I Like Crochet, I Like Knitting, and We Like Sewing.
