In the world of SEO, when Google announces a major update, your SEO plan will either sink or swim.
As with its recent updates to AdWords and Google My Business, Google is in the process of redesigning its data collection platform: Google Search Console (GSC). Throughout 2018, Google is sending Search Console reports directly to webmasters to alert them to website crawl errors that may be hurting their performance in Google Search results.
In the search marketing world, history proves that when Google invests heavy resources into innovating one of their products, they’re aiming to make search marketing more efficient, more measurable and, ultimately, more valuable.
For marketers, updates to Google Search Console mean there is a new “standard” websites must meet for SEO success. Based on the beta currently available to all webmasters, there are new tools in the SEO roadmap to lift your website’s rankings over the competition. But you need to know the “rules of the road” to get the best results without wasting resources.
First Things First: What is Google Search Console?
Google Search Console, formerly called Google Webmaster Tools, is a free tool offered by Google that helps you monitor and maintain your site’s presence in search results. Use GSC to:
- Submit new content for crawling, and remove content you don’t want shown in search results
- Maintain your site with minimal disruption to search performance
- Monitor and disavow low-quality backlinks that may be negatively impacting search rankings
GSC also provides deeper data on your site’s organic search presence than Google Analytics displays. If you’re familiar with Google Analytics, you may have noticed that when you try to investigate top search queries, the data shows as (not provided) or (not set). These are placeholder labels that Analytics uses when it hasn’t received any information for the selected dimension.
Why does Google Search Console provide more organic search data than Google Analytics?
Search Console collects data across all Google searches, while Google Analytics measures traffic exclusively to your website. Google Analytics is not set up to report on users who never visited your website (missed opportunities), while GSC reports impressions: how many times your website appeared in search results, even when it wasn’t clicked.
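To make that concrete, a few lines of Python can turn a Search Performance export into a “missed opportunities” view. The query strings and numbers below are hypothetical, invented purely for illustration:

```python
# Hypothetical rows from a GSC Search Performance export:
# (query, impressions, clicks)
rows = [
    ("search console guide", 1200, 96),
    ("gsc index coverage", 800, 12),
    ("fix crawl errors", 500, 45),
]

report = []
for query, impressions, clicks in rows:
    ctr = clicks / impressions       # click-through rate
    missed = impressions - clicks    # searches that saw the site but didn't click
    report.append((query, round(ctr, 3), missed))
    print(f"{query}: CTR {ctr:.1%}, {missed} missed impressions")
```

Queries with high impressions but a low click-through rate are exactly the missed opportunities Analytics alone cannot show you.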
Through GSC, you can discover how Google Search sees your site:
- Which queries caused your site to appear in search results?
- What landing pages appear most on organic search?
- Which sites are linking to your website?
What’s New: New Features for Actionable Insights
Through 2018, Search Console is bringing users a completely redesigned product to help manage your site’s presence on Google Search. The new Google Search Console is rebuilt from the ground up to provide the tools and insights that webmasters and SEOs have been asking for.
In the new platform, you can confirm which of your pages are indexed and get guidance on how to fix indexing errors. Through GSC, you can also monitor site performance on Google Search with 16 months of data (enabling year-over-year comparisons).
Google’s update to GSC is not done yet, so over the course of 2018, the new tool will continue to add functionality from the classic Search Console. Until the new GSC is complete, both versions will live side-by-side and will be easily interconnected via links in the navigation bar, so we can use both.
The new Search Console surfaces the most actionable insights through an interaction model that guides SEOs through fixing pending issues. GSC also added the ability to share reports within an organization, simplifying internal collaboration.
By Popular Demand: More Data!
Google Search Console moves the search query/landing page impression data into the new Search Performance report. In the past, GSC offered only 90 days of data in a report called Search Analytics. Data older than 90 days was wiped from the system with no way to recover it.
With the new report, users now have access to 16 months of data, including history that was previously out of reach. The longer time window makes analyzing long-term trends easier and enables year-over-year comparisons.
More data gives you the big picture, making it easier to draw actionable insights about your SEO campaign and its performance.
Index Coverage Report: Common Errors, What They Mean & How to Fix Them
Why is my site not ranking for important keywords? Before your content can rank on Google, Google must be able to crawl and index it appropriately.
The new Index Coverage report provides insight into how Google indexes URLs on your website. If your site has errors, Google Search Console may have already emailed your webmaster. If you’ve recently received an Index Coverage report, you may be wondering how to resolve the errors on your site.
Note: The Index Coverage report works best for sites that submit sitemap files. Sitemap files inform search engines about new and updated URLs. Once you’ve submitted a sitemap file, you can filter the Index Coverage data by sitemap to focus on an exact list of URLs.
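For reference, a minimal sitemap file is just an XML list of URLs following the sitemaps.org protocol; the example.com URLs and dates below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/services/</loc>
    <lastmod>2018-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/new-post/</loc>
    <lastmod>2018-05-10</lastmod>
  </url>
</urlset>
```

Submit the file’s URL in the Sitemaps section of Search Console, and Google will use it as the baseline for the Index Coverage report.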
Don’t panic! Many of the following errors found in the Index Coverage Report are harmless to your bottom line, and all are fixable with the support of a web developer and/or hosting provider. Save time and money by prioritizing your site’s errors with the breakdown below.
Server error (5xx):
- Definition: Your server returned a 500-level error when the page was requested.
- What it means: When visited, the reported URL shows “Internal Server Error”.
- How to fix: Contact your hosting provider to ensure all URLs load without timing out (timeouts are caused by slow server response times). If the reported URL does not contain site copy (i.e., it is not a home page, service/product page, or blog post), the error does not carry significant weight with Google.
Redirect error:
- Definition: The URL redirects users to a missing page or path.
- What it means: The reported URL had one of the following problems: a redirect chain that was too long, a redirect loop, a redirect that eventually exceeded the maximum URL length, or a bad or empty URL in the redirect chain.
- How to fix: Determine the appropriate destination for the redirect, then rewrite your .htaccess file to point the URL there directly.
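As a sketch, a single mod_alias rule in an Apache .htaccess file can replace a broken chain with one clean hop; the paths below are hypothetical:

```apache
# Hypothetical .htaccess rule: send a retired URL straight to its
# current replacement with a single 301, avoiding chains and loops.
Redirect 301 /old-services/ https://www.example.com/services/
```

One direct 301 per retired URL keeps chains short and avoids the loop and length errors this report flags.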
Submitted URL blocked by robots.txt:
- Definition: The page is blocked from Google’s crawl.
- What it means: The robots.txt file on your site is blocking Google from crawling the reported URL. Webmasters utilize robots.txt files to control how much content is visible on Google search. Not every URL on your site needs to be crawled by Google, especially if they guide the formatting of a website and do not contribute content for the end user. For example, WordPress sites use /wp-content/ on URLs used by plugins and formatting tools. Most webmasters prefer to block this content from Google’s crawl so that the search engine only crawls unique content developed for the end user.
- How to fix: If you’ve determined that the reported URL must be included in Google’s crawl because it provides content for the user, revise your robots.txt file to allow the URL to be crawled. You can do this through your FTP login or through Yoast SEO (if you use WordPress). After revising, test your page using Google’s robots.txt Tester.
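If you’re unsure whether a robots.txt rule is what’s blocking the reported URL, Python’s standard-library robot parser can check a rule set offline. The rules and URLs below are hypothetical; note that an Allow line listed before the broader Disallow keeps user-facing uploads crawlable:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block plugin/formatting paths,
# but keep user-facing uploads crawlable.
robots_txt = """\
User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A plugin file is blocked from the crawl...
print(parser.can_fetch("Googlebot", "https://example.com/wp-content/plugins/x.js"))
# ...while unique content for the end user remains crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/blog/post/"))
```

Running a check like this before and after editing robots.txt confirms that your revision unblocks only the URLs you intended.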
Submitted URL marked ‘noindex’:
- Definition: Your sitemap contains a page marked for Google indexing, but the page has a ‘noindex’ directive either in a meta tag or HTTP response.
- What it means: Your site is sending conflicting messages to Google: the sitemap says “index this page,” while the ‘noindex’ directive says the opposite.
- How to fix: If you want the reported URL to be indexed, your webmaster must remove the ‘noindex’ meta tag or HTTP response header.
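For reference, the directive usually appears in one of two places; remove whichever one applies if you want the page indexed (both snippets are generic examples):

```html
<!-- In the page's <head>: a robots meta tag -->
<meta name="robots" content="noindex">

<!-- Or sent as an HTTP response header (not in the HTML):
     X-Robots-Tag: noindex -->
```

Check both the page source and the response headers, since either one alone is enough to keep the page out of Google’s index.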
Submitted URL seems to be a Soft 404:
- Definition: Your server returned what seems to be a soft 404 on the reported URL.
- What it means: When Google crawled your sitemap, it found a URL that showed the user an error page but did not return a true “404 Page Not Found” status. This may not always be a real soft 404. Clear your local cache (CTRL + F5) and see if the page loads in your browser. If so, Google should pick it up in its next crawl.
- How to fix: If your page is no longer available, and has no clear replacement, it should return a 404 (not found) or 410 (Gone) response code. If your page has moved or has a clear replacement, return a 301 (permanent redirect) to redirect the user as appropriate.
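The distinction can be sketched in a few lines of Python. This is only an illustrative heuristic in the spirit of soft-404 detection, not Google’s actual algorithm, and the phrase list is invented:

```python
def looks_like_soft_404(status: int, body: str) -> bool:
    """Heuristic: a soft 404 returns HTTP 200 ('OK') while showing
    the user an error page. A real 404/410 status is not a soft 404."""
    error_phrases = ("page not found", "404", "no longer available")
    return status == 200 and any(p in body.lower() for p in error_phrases)

# A 200 response wrapped around an error message reads as a soft 404:
print(looks_like_soft_404(200, "<h1>Page Not Found</h1>"))
# A genuine 404 status is the correct behavior, not a soft 404:
print(looks_like_soft_404(404, "<h1>Page Not Found</h1>"))
```

In other words, the fix is not to hide the error page but to make the HTTP status code (404, 410, or 301) match what the page actually tells the user.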
Submitted URL returns unauthorized request (401):
- Definition: Your sitemap contains a URL for which Google received a 401 (unauthorized) response.
- What it means: Authorization requirements on your site (such as a login) are blocking Googlebot from crawling the reported URL, potentially including images and videos.
- How to fix: Either remove authorization requirements for this page, or else allow Googlebot to access your pages by verifying its identity.
Submitted URL not found (404):
- Definition: You submitted a non-existent URL for indexing.
- What it means: When Google crawled your sitemap, it found a URL that returned a “404 Page Not Found” response.
- How to fix: It is not always appropriate to redirect pages you’ve removed from the site to a different page. For example, if you deleted a page about a product or service you no longer offer, it wouldn’t make sense to redirect users to a different product or service they weren’t searching for. Simply remove the page from your sitemap so that Google can crawl your site appropriately.
Submitted URL has crawl issue:
- Definition: Google encountered an unspecified crawling error that doesn’t fall into any of the other reasons.
- What it means: See above.
- How to fix: Try debugging your page using Fetch as Google. This tool can reveal the specific factors inhibiting Google’s crawl of your URL.
Fixing Crawl Errors: How to Maintain & See SEO Results
I’ve corrected all of my site errors. When can I expect to see results from my SEO?
If your site had significant crawl errors on your end-user content, fixing these issues may quickly swing your site into positive organic keyword growth. For example, if your Products page was blocked by robots.txt file, it was as if you had your “Closed” sign up when you were open for business.
Protect your site from new errors by properly maintaining your SEO. The new Search Console is set up to alert you when the platform detects new issues and to help you monitor your fixes.
Often, fixing Search issues involves multiple teams within a company. Giving the right people access to information about your site’s index status is critical to getting fixes implemented quickly. Within most reports in Search Console, the share button at the top of the report creates a shareable link, so the right person on your team can address and resolve the issue.
The new Search Console can also help you confirm that you’ve resolved an issue, and will inform Google to update their index accordingly. To do this, select a flagged issue, and click “validate fix”. Google will then crawl and reprocess the affected URLs with a higher priority, helping your site to get back on track faster than ever.
More Features to Come!
Google will keep rolling out new features as it makes the new platform Search Console’s permanent home. We expect many more GSC features this year, giving users even more opportunities to boost their SEO.
Originally published on the AIS Media blog.