Site Audit (Details)
Security Risk Specialists London | Your Security Matters
You've done a great job! Your website is free from 4xx errors.
4xx errors often point to a problem on a website. For example, if you have a broken link on a page and visitors click it, they may see a 4xx error. It's important to regularly monitor and fix these errors, because they may have a negative impact and lower your site's authority in users' eyes.
Some of your resources return 5xx status codes.
5xx error messages are sent when the server has a problem or error. It's important to regularly monitor these errors and investigate their causes, because they may have a negative impact and lower the site's authority in search engines' eyes.
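One quick way to spot-check the status code a resource returns is a HEAD request from the command line (a minimal sketch; the URL is a placeholder):

    curl -I https://www.example.com/some-page/

The first line of the output (e.g. HTTP/1.1 500 Internal Server Error) shows the code the server sends back.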
Good job! Your site's 404 error page is set up correctly.
A custom 404 error page can help you keep users on the website. Ideally, it should inform users that the page they are looking for doesn't exist, and feature elements such as your HTML sitemap, the navigation bar and a search field. But more importantly, a 404 error page should return the 404 response code. This may sound obvious, but unfortunately it's often not the case.
According to Google Search Console:
"Returning a code other than 404 or 410 for a non-existent page... can be problematic. Firstly, it tells search engines that there's a real page at that URL. As a result, that URL may be crawled and its content indexed. Because of the time Googlebot spends on non-existent pages, your unique URLs may not be discovered as quickly or visited as frequently and your site's crawl coverage may be impacted. We recommend that you always return a 404 (Not found) or a 410 (Gone) response code in response to a request for a non-existing page."
A robots.txt file is not available on your website. Create a valid robots.txt file with instructions for the search engine bots.
The robots.txt file is automatically checked by robots when they arrive at your website. This file should contain directives for robots, such as which pages should or should not be crawled. If you want to keep some content out of the index (for example, pages with private or duplicate content), just use an appropriate rule in the robots.txt file. For more information on such rules, check out http://www.robotstxt.org/robotstxt.html.
Please note that directives placed in the robots.txt file are more like suggestions than absolute rules for robots to follow. There's no guarantee that a robot will not check the content that you have disallowed.
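A minimal robots.txt might look like this (the disallowed paths and sitemap URL are placeholders):

    User-agent: *
    Disallow: /private/
    Disallow: /search/

    Sitemap: https://www.example.com/sitemap.xml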
No XML sitemap was found on your website. Create a valid sitemap and submit it to the search engines (for example, via Google Search Console).
An XML sitemap should contain all of the website pages that you want indexed, and should be located in the website's root directory (e.g. http://www.site.com/sitemap.xml). In general, it serves to aid indexing, and you should update it each time you add new pages to your website. The sitemap must also follow a specific XML syntax.
The sitemap allows you to set the priority of each page, telling search engines which pages are updated more frequently and should therefore be crawled more often. Learn how to create an .xml sitemap at http://www.sitemaps.org/.
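A minimal valid sitemap follows the sitemaps.org schema; the URLs and dates below are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2017-03-16</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>https://www.example.com/services/</loc>
        <lastmod>2017-03-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>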
Some of your site's resources are restricted from indexing. It is recommended to re-check your robots.txt file and make sure that all of your useful content gets indexed and is not blocked by mistake.
A resource can be restricted from indexing in several ways: with a Disallow rule in the robots.txt file, with a noindex meta tag in the page's <head>, or with a noindex X-Robots-Tag HTTP header.
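The two noindex variants look like this in sketch form (apply them only to pages you genuinely want excluded; the file name below is hypothetical):

    <!-- in the page's <head> -->
    <meta name="robots" content="noindex">

    # as an HTTP header in .htaccess (Apache with mod_headers assumed)
    <Files "private-report.pdf">
        Header set X-Robots-Tag "noindex"
    </Files>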
Good job! The www and non-www versions of your website have been merged.
Usually websites are available both with and without "www" in the domain name. Merging the two versions helps prevent search engines from indexing two copies of the website.
Although the indexing of both versions won't cause a penalty, setting one of them as a priority is a best practice, in part because it helps funnel the SEO value from links to one common version. You can look up or change your current primary version in the .htaccess file. Also, it is recommended to set the preferred domain in Google Search Console.
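On an Apache server with mod_rewrite, for example, the non-www version can be 301-redirected to the www version in .htaccess (a sketch; example.com stands in for your domain):

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]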
Good job! No 302 redirects have been found on your website.
302 redirects are temporary, so they don't pass any link juice. If you use them instead of 301s, search engines may continue to index the old URLs, and disregard the new ones as duplicates. Or they may divide the link popularity between the two versions, thus hurting search rankings. That's why it is not recommended to use 302 redirects if you are permanently moving a page or a website. Stick to a 301 redirect instead to preserve link juice and avoid duplicate content issues.
301 redirects were found on your website. Check all your 301 redirects and make sure they point to relevant pages and are set up correctly.
301 redirects are permanent and are usually used to solve problems with duplicate content or to redirect certain URLs that are no longer necessary. The use of 301 redirects is absolutely legitimate, and it's good for SEO because a 301 redirect will funnel link juice from the old page to the new one. Just make sure you redirect old URLs to the most relevant pages.
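A permanent redirect for a single moved page can be set up in .htaccess like this (a sketch with hypothetical paths):

    Redirect 301 /old-page.html https://www.example.com/new-page.html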
Well done! No Meta refresh redirects were found on your website.
Meta refresh may be seen as a violation of Google's Quality Guidelines and is therefore not recommended from the SEO point of view. As one of Google's representatives points out: "In general, we recommend not using meta-refresh type redirects, as this can cause confusion with users (and search engine crawlers, who might mistake that for an attempted redirect)... This is currently not causing any problems with regards to crawling, indexing, or ranking, but it would still be a good idea to remove that." So stick to a permanent 301 redirect instead.
There are pages on your website with canonical URLs specified for them. Please make sure that your rel="canonical" tags or rel="canonical" HTTP headers are set up correctly.
In most cases duplicate URLs are handled via 301 redirects. Sometimes, however (for example, when the same product appears in two categories under two different URLs and both need to stay live), you can specify which page should be treated as the primary one with the help of rel="canonical" tags. The tag should be implemented within the <head> section of the page and point to the main page version that you want to rank in search engines. Alternatively, if you can configure your server, you can indicate the canonical URL using rel="canonical" HTTP headers.
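Both variants in sketch form (the URLs are placeholders):

    <!-- in the <head> of the duplicate page -->
    <link rel="canonical" href="https://www.example.com/products/widget/">

    # equivalent HTTP header, useful for non-HTML resources such as PDFs
    Link: <https://www.example.com/products/widget/>; rel="canonical"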
Well done! Your site's homepage is mobile-friendly.
According to Google, the mobile-friendly algorithm affects mobile searches in all languages worldwide and has a significant impact on Google's search results. The algorithm works on a page-by-page basis: it is not a matter of degree, but a simple yes-or-no test of whether a page is mobile-friendly.
The test is based on criteria such as font size, the size and spacing of tap targets/links, content readability and viewport configuration.
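The viewport criterion, for instance, is normally satisfied with a single tag in the page's <head> (a standard responsive-design snippet):

    <meta name="viewport" content="width=device-width, initial-scale=1">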
Good job! None of your website's pages have multiple canonical URLs.
If a page carries multiple rel="canonical" declarations, Google will likely ignore all of the rel=canonical hints, so your effort to avoid duplicate content issues may be wasted.
Well done! Your website pages are free from Frames.
Frames allow more than one HTML document to be displayed in the same browser window. As a result, text and hyperlinks (the most important signals for search engines) can appear to be missing from such documents. If you use Frames, search engines may fail to properly index your valuable content.
Some pages on your site are larger than 3 MB. Review these pages and decrease their size if possible.
Pages that are too big can hurt user experience and even search engine rankings, so think about reducing the size of such pages to make them load faster.
Naturally, there's a direct correlation between the size of a page and its loading speed, which, in turn, is one of the many ranking factors: heavy pages simply take longer to load. That's why the general rule of thumb is to keep page size under 3 MB. Of course, that's not always possible; for example, an e-commerce website with a large number of images may exceed this limit, but doing so can significantly increase loading time for users on slow connections.
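A quick way to measure a page's transfer size from the command line (the URL is a placeholder):

    curl -s -o /dev/null -w '%{size_download} bytes\n' https://www.example.com/

Note that this reports the size of the HTML document alone; images, scripts and stylesheets add to the total page weight.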
Dynamic URLs have been found on your website. Please see if you can fix them.
URLs built from query strings and parameters (with characters like "?", "=" and "&") are not user-friendly, because they are not descriptive and are harder to memorize. To increase your pages' chances to rank, it's best to set up URLs that are descriptive and include keywords rather than numbers or parameters. Google's guidelines likewise recommend keeping a site's URL structure as simple and human-readable as possible.
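One common approach on Apache is to expose a descriptive URL and rewrite it internally to the dynamic one (a sketch; the paths and parameter name are hypothetical):

    RewriteEngine On
    # /products/blue-widget/ is served by /product.php?slug=blue-widget
    RewriteRule ^products/([a-z0-9-]+)/?$ /product.php?slug=$1 [L]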
Too long URLs were found on your website. Please review the pages with overly long URLs and consider whether you would like to shorten them.
URLs shorter than 115 characters are easier for end users and search engines to read, and help keep the website user-friendly.
There are some broken outgoing links on your website. This may result in poor user experience and signal to search engines that your site is neglected. Look through those links and fix them.
Broken outgoing links can be a bad quality signal to search engines and users. If a site has many broken links, search engines may conclude that it has not been updated for some time, and the site's rankings may be downgraded as a result.
Although 1-2 broken links won't cause a Google penalty, try to regularly check your website, fix broken links (if any), and make sure their number doesn't go up. Besides, users will like your website more if it doesn't show them broken links pointing to non-existing pages.
Well done! There are no pages on your site with more than 100 outgoing links.
According to Matt Cutts (former head of Google's Webspam team), "...there's still a good reason to recommend keeping to under a hundred links or so: the user experience. If you're showing well over 100 links per page, you could be overwhelming your users and giving them a bad experience. A page might look good to you until you put on your "user hat" and see what it looks like to a new visitor." Although Google frames this in terms of user experience, too many links on a page can also hurt your rankings. The rule is simple: the fewer links on a page, the fewer problems with its rankings. So try to stick to the best practices and keep the number of outgoing links (internal and external) under 100.
Well done! All your site's pages have a <title> tag, and all title tags contain content.
If a page doesn't have a title, or the title tag is empty (i.e. it just looks like this in the code: <title></title>), Google and other search engines will decide for themselves which text to show as your page title in their SERP snippets. Thus, you'll have no control over what people see on Google when they find your page.
Therefore, every time you create a webpage, don't forget to add a meaningful title that will also be attractive to users.
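For example, a descriptive, keyword-bearing homepage title for this site might look something like this (the wording is purely illustrative):

    <title>Security Risk Specialists London | Security Audits and Risk Assessments</title>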
Congratulations! All of your website pages have unique titles.
A page title is often treated as the most important on-page element. It is a strong relevancy signal for search engines, because it tells them what the page is really about. It is, of course, important that the title includes your most relevant keyword. Beyond that, every page should have a unique title so that search engines have no trouble determining which of the website's pages is relevant for a query. Pages with duplicate titles have a lower chance of ranking high; worse, duplicate titles may negatively influence the rankings of your other pages, too.
Some of your titles are longer than 55 characters. Review and rewrite them.
Every page should have a unique, keyword-rich title. At the same time, you should try to keep title tags concise. Titles longer than 55 characters get truncated by search engines and look unappealing in search results. Even if your pages rank on page 1, truncated or incomplete titles won't attract as many clicks as they otherwise would.
Some of your pages do not have meta descriptions. Review those pages and create meta descriptions where necessary.
Although meta descriptions don't have a direct influence on rankings, they are still important because they form the snippet people see in search results. A description should therefore "sell" the webpage to the searcher and encourage them to click through.
If the meta description is empty, search engines will decide for themselves what to include in the snippet.
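A meta description is added in the page's <head>; the copy below is purely illustrative:

    <meta name="description" content="Independent security risk assessments and audits for London businesses. Contact our specialists for a free consultation.">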
Congratulations! All of your website pages have unique descriptions.
According to Matt Cutts, it is better to have unique meta descriptions, or even none at all, than to show duplicate meta descriptions across your pages. Hence, make sure that your most important pages have unique, optimized descriptions.
Good job! All of your meta descriptions are within the required length.
Although meta descriptions don't have a direct effect on rankings, they are still important because they form the snippet people see in search results. Descriptions should therefore "sell" the webpage to searchers and encourage them to click through. If a meta description is too long, it will be truncated by the search engine and may look unappealing to users.
Report created: Mar 16, 2017 by Astutium Ltd