"Issue Name","Issue Type","Issue Priority","URLs","% of Total","Description","How To Fix","Help URL" "H1: Multiple","Warning","Medium","4","0.150","Pages which have multiple
h1s. While this is not strictly an issue because HTML5 standards allow multiple h1s on a page, there are some problems with this modern approach in terms of usability. It's advised to use heading rank (h1-h6) to convey document structure. The classic HTML4 standard defines there should only be a single h1 per page, and this is still generally recommended for users and SEO.","Consider updating the HTML to include a single h1
on each page, and utilising the full heading rank between (h2 - h6) for additional headings.","" "Page Titles: Over 60 Characters","Opportunity","Medium","247","9.540","Pages which have page titles that exceed the configured limit. Characters over this limit might be truncated in Google's search results and carry less weight in scoring.","Write concise page titles to ensure important words are not truncated in the search results, not visible to users and potentially weighted less in scoring.","" "Meta Description: Below 400 Pixels","Opportunity","Low","87","3.360","Pages which have meta descriptions much shorter than Google's estimated pixel length limit. This isn't necessarily an issue, but it does indicate there might be room to communicate benefits, USPs or call to actions.","Consider updating the meta description to take advantage of the space left to include additional benefits, USPs or call to actions to improve click through rates (CTR).","" "Page Titles: Missing","Issue","High","10","0.390","Pages which have a missing page title element, the content is empty, or has a whitespace. Page titles are read and used by both users and the search engines to understand what a page is about. They are important for SEO as page titles are used in rankings, and vital for user experience, as they are displayed in browsers, search engine results and on social networks.","It's essential to write concise, descriptive and unique page titles on every indexable URL to help users, and enable search engines to score and rank the page for relevant search queries.","" "Page Titles: Same as H1","Opportunity","Low","2179","84.200","Page titles which match the h1 on the page exactly. 
This is not necessarily an issue, but may point to a potential opportunity to target alternative keywords, synonyms, or related key phrases.","This is not necessarily an issue, but may point to a potential opportunity to target alternative keywords, synonyms, or related key phrases.","" "Page Titles: Over 561 Pixels","Opportunity","Medium","243","9.390","Pages which have page titles over Google's estimated pixel length limit for titles in search results. Google snippet length is actually based upon pixels limits, rather than a character length. The SEO Spider tries to match the latest pixel truncation points in the SERPs, but it is an approximation and Google adjusts them frequently.","Write concise page titles to ensure important words are not truncated in the search results, not visible to users and potentially weighted less in scoring.","" "Response Codes: Internal Blocked Resource","Warning","High","2302","31.210","Internal resources (such as images, JavaScript and CSS) that are blocked from rendering by robots.txt or an error. This filter will only populate when JavaScript rendering is enabled (blocked resources will appear under 'Blocked by Robots.txt' in default 'text only' crawl mode). This can be an issue as the search engines might not be able to access critical resources to be able to render pages accurately. Blocked resources can be viewed by URL in the 'Rendered Page' tab, and any pages with blocked resources can be viewed under 'JavaScript > Pages with Blocked Resources'.","Update the robots.txt and resolve any errors to allow all critical resources to be crawled and used for rendering of website content.","" "Meta Description: Over 155 Characters","Opportunity","Low","820","31.680","Pages which have meta descriptions over the configured limit. 
Characters over this limit might be truncated in Google's search results.","Write concise meta descriptions to ensure important words are not truncated in the search results, and not visible to users.","" "JavaScript: Page Title Updated by JavaScript","Warning","Medium","34","1.310","Pages that have page titles that are modified by JavaScript. This means the page title in the raw HTML is different to the page title in the rendered HTML.","While Google is able to render pages and see client-side only content, consider including important content server side in the raw HTML.","" "Structured Data: Rich Result Validation Errors","Issue","High","40","1.550","URLs that contain Google rich result feature validation errors. Google rich result feature validation will show errors for missing required properties or problems with the implementation of required properties. Google's 'required properties' must be included and be valid for content to be eligible for display as a rich result.","Resolve validation errors to ensure pages are eligible for special search result features in Google. Review the 'Structured Data Details' tab for more information on specific errors and refer to Google rich result feature docs where necessary. Export unique validation errors using 'Reports > Structured Data > Validation Errors & Warnings Summary' and test in Google's Rich Results tool.","" "Response Codes: Internal Client Error (4xx)","Issue","High","37","0.500","Internal URLs with a client-side error. This indicates a problem occurred with the URL request and can include responses such as 400 bad request, 403 Forbidden, 404 Page Not Found, 410 Removed, 429 Too Many Requests and more. A 404 'Page Not Found' is the most common, and often referred to as a broken link. View URLs that link to errors using the lower 'inlinks' tab and export them in bulk via 'Bulk Export > Response Codes > Internal > Client Error (4xx) inlinks'.","All links on a website should ideally resolve to 200 'OK' URLs. 
Links to errors such as a 404 or 410 should be updated to their correct locations, or removed and redirected where appropriate. A 403 forbidden error occurs when a web server denies access to the SEO Spider's request and can often be resolved by switching the user-agent to Chrome via 'Config > User-Agent' and crawling again.","" "JavaScript: Pages with Blocked Resources","Warning","High","2616","100.000","Pages with resources (such as images, JavaScript and CSS) that are blocked from rendering by robots.txt or an error. This filter will only populate when JavaScript rendering is enabled (blocked resources will appear under 'Blocked by Robots.txt' in default 'text only' crawl mode). This can be an issue as the search engines might not be able to access critical resources to be able to render pages accurately. Blocked resources can be viewed by URL in the 'Rendered Page' tab, or in bulk under 'Response Codes > Blocked Resource'.","Update the robots.txt and resolve any errors to allow all critical resources to be crawled and used for rendering of the website's content. Resources that are not critical (e.g. Google Maps embed) can be ignored.","" "Meta Description: Below 70 Characters","Opportunity","Low","120","4.640","Pages which have meta descriptions below the configured limit. This isn't strictly an issue, but an opportunity. There is additional room to communicate benefits, USPs or call to actions.","Consider updating the meta description to take advantage of the space left to include additional benefits, USPs or call to actions to improve click through rates (CTR).","" "Response Codes: Internal Blocked by Robots.txt","Warning","High","2673","36.240","Internal URLs blocked by the site's robots.txt. This means they cannot be crawled, which is a critical issue if you want the page content to be crawled and indexed by search engines. 
View URLs that link to URLs blocked by robots.txt using the lower 'inlinks' tab and export them in bulk via 'Bulk Export > Response Codes > Internal > Blocked by Robots.txt inlinks'.","Review URLs to ensure they should be disallowed. If they are incorrectly disallowed, then the site's robots.txt should be updated to allow them to be crawled. Consider whether you should be linking internally to these URLs and remove links where appropriate.","" "Content: Low Content Pages","Opportunity","Medium","82","3.170","Pages with a word count that is below the default 200 words. The word count is based upon the content area settings used in the analysis which can be configured via 'Config > Content > Area'. There isn't a minimum word count for pages in reality, but the search engines do require descriptive text to understand the purpose of a page. This filter should only be used as a rough guide to help identify pages that might be improved by adding more descriptive content in the context of the website and page's purpose. Some websites, such as ecommerce, will naturally have lower word counts, which can be acceptable if a product's details can be communicated efficiently.","Consider including additional descriptive content to help the user and search engines better understand the page.","" "Security: Missing Content-Security-Policy Header","Warning","Low","2700","46.980","URLs that are missing the Content-Security-Policy response header. This header allows a website to control which resources are loaded for a page. This policy can help guard against cross-site scripting (XSS) attacks that exploit the browser's trust of the content received from the server. The SEO Spider only checks for the existence of the header, and does not interrogate the policies found within the header to determine whether they are well set-up for the website. 
This should be performed manually.","Set a strict Content-Security-Policy response header across all pages to help mitigate cross-site scripting (XSS) and data injection attacks.","" "URL: Uppercase","Warning","Low","1","0.020","URLs that have uppercase characters within them. URLs are case sensitive, so as best practice generally URLs should be lowercase, to avoid any potential mix-ups and duplicate URLs.","Ideally, only lowercase characters should be used in URLs. However, changing URLs is a big decision, and often it's not worth changing them for SEO purposes alone. If URLs are changed, then appropriate 301 redirects must be implemented.","" "Security: HTTP URLs","Issue","High","72","1.250","HTTP URLs that are encountered in the crawl. All websites should be secure over HTTPS today on the web. Not only is it important for security, but it's now expected by users. Chrome and other browsers display a 'Not Secure' message against any URLs that are HTTP, or have mixed content issues (where they load insecure resources on them). To view how these URLs were discovered, view their 'inlinks' in the lower window tab. You can also export any pages that link to HTTP URLs via 'Bulk Export > Security > HTTP URLs Inlinks'.","All URLs should be to secure HTTPS pages. Pages should be served over HTTPS, any internal links should be updated to HTTPS versions and HTTP URLs should 301 redirect to HTTPS versions. HTTP URLs identified in this filter that are redirecting to HTTPS versions already should be updated to link to the correct HTTPS versions directly.","" "H2: Duplicate","Opportunity","Low","120","4.640","Pages which have duplicate
h2s. It's important to have distinct, unique and useful pages. If every page has the same h2, then it can make it more challenging for users and the search engines to understand one page from another.","Update duplicate h2s as necessary, so important pages contain a unique and descriptive h2
for users and search engines. If these are duplicate pages, then fix the duplicated pages by linking to a single version, and redirect or use canonicals where appropriate.","" "Security: Missing X-Content-Type-Options Header","Warning","Low","2700","46.980","URLs that are missing the 'X-Content-Type-Options' response header with a 'nosniff' value. In the absence of a MIME type, browsers may 'sniff' to guess the content type to interpret it correctly for users. However, this can be exploited by attackers who can try and load malicious code, such as JavaScript via an image they have compromised.","To minimise security issues, the X-Content-Type-Options response header should be supplied and set to 'nosniff'. This instructs browsers to rely only on the Content-Type header and block anything that does not match accurately. This also means the content-type set needs to be accurate.","" "JavaScript: H1 Only in Rendered HTML","Warning","Medium","162","6.260","Pages that contain an h1 only in the rendered HTML after JavaScript execution. This means a search engine must render the page to see it.","While Google is able to render pages and see client-side only content, consider including important content server side in the raw HTML.","" "H1: Missing","Issue","Medium","55","2.130","Pages which have a missing
h1, the content is empty or has a whitespace. The h1
should describe the main title and purpose of the page and are considered to be one of the stronger on-page ranking signals.","Ensure important pages have concise, descriptive and unique headings to help users, and enable search engines to score and rank the page for relevant search queries.","" "Security: Missing Secure Referrer-Policy Header","Warning","Low","2700","46.980","URLs missing 'no-referrer-when-downgrade', 'strict-origin-when-cross-origin', 'no-referrer' or 'strict-origin' policies in the Referrer-Policy header. When using HTTPS, it's important that the URLs do not leak in non-HTTPS requests. This can expose users to 'man in the middle' attacks, as anyone on the network can view them.","Consider setting a referrer policy of strict-origin-when-cross-origin. It retains much of the referrer's usefulness, while mitigating the risk of leaking data cross-origins.","" "Links: Pages With High External Outlinks","Warning","Low","2588","100.000","Pages that have a high number of followed external outlinks on them based upon the 'High External Outlinks' preferences under 'Config > Spider > Preferences'. External outlinks are hyperlinks to another subdomain or domain (depending on your configuration). This might be completely valid, such as linking to another part of the same root domain, or linking to other useful websites. External followed outlinks can be seen in the 'Outlinks' tab, with the 'All Link Types' filter set to 'Hyperlinks' where the 'Follow' column is 'True'.","Review followed external outlinks to ensure they are to credible, trusted and relevant websites that are useful to your users.","" "H1: Duplicate","Opportunity","Low","46","1.780","Pages which have duplicate
h1s. It's important to have distinct, unique and useful main headings. If every page has the same h1, then it can make it more challenging for users and the search engines to understand one page from another.","Update duplicate h1s as necessary, so important pages contain a unique and descriptive h1
for users and search engines. If these are duplicate pages, then fix the duplicated pages by linking to a single version, and redirect or use canonicals where appropriate.","" "Links: Internal Outlinks With No Anchor Text","Opportunity","Low","2588","100.000","Pages that have internal links without anchor text or images that are hyperlinked without alt text. Anchor text is the visible text and words used in hyperlinks that provide users and search engines context about the content of the target page. Internal outlinks without anchor text can be seen in the 'Outlinks' tab, with the 'All Link Types' filter set to 'Hyperlinks', where the 'Anchor Text' column is blank, or if an image, the 'Alt Text' column is also blank. Export in bulk via 'Bulk Export > Links > Internal Outlinks With No Anchor Text'.","Review the missing anchor text outlinks and where appropriate include useful and descriptive anchor text to help users and search engines.","" "H2: Over 70 Characters","Opportunity","Low","1087","42.000","Pages which have
h2s over the configured limit. There is no hard limit for characters in an h2, however they should be clear and concise for users, and long headings might be less helpful.","Write concise h2s for users, including target keywords where natural, without keyword stuffing.","" "Canonicals: Missing","Warning","Medium","52","2.010","Pages that have no canonical URL present either as a link element, or via HTTP header. If a page doesn't indicate a canonical URL, Google will identify what they think is the best version or URL. This can lead to ranking unpredictability when there are multiple versions discovered, and hence generally all URLs should specify a canonical version.","Specify a canonical URL for every page to avoid any potential ranking unpredictability if multiple versions of the same page are discovered on different URLs.","" "Canonicals: Canonicalised","Warning","High","53","2.050","Pages that have a canonical to a different URL. The URL is 'canonicalised' to another location. This means the search engines are being instructed to not index the page, and the indexing and linking properties should be consolidated to the URL in the canonical.","These URLs should be reviewed carefully to ensure the indexing and link signals are being consolidated to the correct URL. In a perfect world, a website wouldn't need to canonicalise any URLs as only canonical versions would be linked to internally on a website, but often they are required due to various circumstances outside of control, and to prevent duplicate content. Update internal links to canonical versions of URLs where possible.","" "H1: Non-Sequential","Warning","Low","9","0.350","Pages with an h1
that is not the first heading on the page. Heading elements should be in a logical sequentially-descending order. The purpose of heading elements is to convey the structure of the page and they should be in logical order from
h1 to h6, which helps with navigating the page, particularly for users that rely on assistive technologies.","Ensure the h1 is the first heading on the page. Headings should be in a logical sequential order from h1 to h6. Review and update page heading levels so they are descending in order, for example the first heading level should be an h1, and this should be followed by an h2.","" "Content: Exact Duplicates","Issue","High","19","0.730","Pages that are identical to each other using the MD5 algorithm which calculates a 'hash' value for each page and can be seen in the 'hash' column. This check is performed against the full HTML of the page. It will show all pages with matching hash values that are exactly the same. Exact duplicate pages can lead to the splitting of PageRank signals and unpredictability in ranking.","There should only be a single canonical version of a URL that exists and is linked to internally. Other versions should not be linked to, and they should be 301 redirected to the canonical version.","" "H1: Over 70 Characters","Opportunity","Low","52","2.010","Pages which have
h1s over the configured length. There is no hard limit for characters in an h1, however they should be clear and concise for users, and long headings might be less helpful.","Write concise h1s for users, including target keywords where natural, without keyword stuffing.","" "Response Codes: Internal Redirection (3xx)","Warning","Low","306","4.150","Internal URLs which redirect to another URL. These will include server-side redirects, such as 301 or 302 redirects (and more). View URLs that link to redirects using the lower 'inlinks' tab and export them in bulk via 'Bulk Export > Response Codes > Internal > Redirection (3xx) inlinks'.","Ideally all internal links would be to canonical resolving URLs, and avoid linking to URLs that redirect. This reduces latency of redirect hops for users, and improves efficiency for search engines.","" "Response Codes: External Blocked Resource","Warning","Medium","14","0.190","External resources (such as images, JavaScript and CSS) that are blocked from rendering by robots.txt or an error. This filter will only populate when JavaScript rendering is enabled (blocked resources will appear under 'Blocked by Robots.txt' in default 'text only' crawl mode). This can be an issue as the search engines might not be able to access critical resources to be able to render pages accurately. Blocked resources can be viewed by URL in the 'Rendered Page' tab, and any pages with blocked resources can be viewed under 'JavaScript > Pages with Blocked Resources'.","If critical to your content, update the external subdomain's robots.txt and resolve any errors to allow resources to be crawled and used for rendering of the website's content.","" "JavaScript: Contains JavaScript Content","Warning","Medium","2001","77.320","Pages that contain body text that's only discovered in the rendered HTML after JavaScript execution.","While Google is able to render pages and see client-side only content, consider including important content server side in the raw HTML.","" "Page Titles: Below 30 Characters","Opportunity","Medium","358","13.830","Pages which have page titles under the configured limit. 
This isn't necessarily an issue, but it does indicate there might be room to target additional keywords or communicate your USPs.","Consider updating the page title to take advantage of the space left to include additional target keywords or USPs.","" "JavaScript: Contains JavaScript Links","Warning","Medium","342","13.210","Pages that contain hyperlinks that are only discovered in the rendered HTML after JavaScript execution. These hyperlinks are not in the raw HTML.","While Google is able to render pages and see client-side only links, consider including important links server side in the raw HTML.","" "Directives: Nofollow","Warning","High","5","0.190","URLs containing a 'nofollow' directive in either a robots meta tag or X-Robots-Tag in the HTTP header. This is a 'hint' which tells the search engines not to follow any links on the page for crawling. This is generally used by mistake in combination with 'noindex', when there is no need to include this directive as it stops 'PageRank' from being passed onwards. To crawl pages with a nofollow directive within the SEO Spider, enable 'Follow Internal Nofollow' via 'Config > Spider'.","URLs with a 'nofollow' should be reviewed carefully to ensure that links shouldn't be crawled and PageRank shouldn't be passed on. If outlinks should be crawled and PageRank should be passed onwards, then the 'nofollow' directive should be removed.","" "Validation: High Carbon Rating","Opportunity","Low","1","0.020","URLs that have a carbon rating of F using the digital carbon ratings system from Sustainable Web Design. This scale equates page weight tracked by the HTTP Archive with CO2 estimates per page view. The CO2 calculation uses the 'The Sustainable Web Design Model' for calculating emissions, which considers datacentres, network transfer and device usage in calculations. 
The CO2 calculation and rating system can be used as a benchmark, as well as a catalyst to contribute to a more sustainable web.","Consider optimisation opportunities to reduce file size and CO2 emissions, and improve carbon footprint and rating. The CO2 calculation and carbon rating can be integrated with analytics data to help prioritise which areas or URLs should be focused upon. Review best practices and opportunities listed under the PageSpeed tab for more specific actions.","" "URL: Underscores","Opportunity","Low","58","1.010","URLs with underscores, which are not always seen as word separators by search engines.","Ideally hyphens should be used as word separators, rather than underscores. However, changing URLs is a big decision, and often it's not worth changing them for SEO purposes alone. If URLs are changed, then appropriate 301 redirects must be implemented.","" "URL: Non ASCII Characters","Warning","Low","11","0.190","URLs with characters outside of the ASCII character-set. Standards outline that URLs can only be sent using the ASCII character-set and some users may have difficulty with subtleties of characters outside this range.","URLs should be converted into a valid ASCII format, by encoding links to the URL with safe characters (made up of % followed by two hexadecimal digits). Today browsers and the search engines are largely able to transform URLs accurately.","" "Directives: Noindex","Warning","High","251","9.300","URLs containing a 'noindex' directive in either a robots meta tag or X-Robots-Tag in the HTTP header. This instructs the search engines not to index the page. The page will still be crawled (to see the directive), but it will then be dropped from the index.","URLs with a 'noindex' should be reviewed carefully to ensure they are correct and shouldn't be indexed. 
If these pages should be indexed, then the 'noindex' directive should be removed.","" "Response Codes: External Client Error (4xx)","Warning","Low","146","1.980","External URLs with a client-side error. This indicates a problem occurred with the URL request and can include responses such as 400 bad request, 403 Forbidden, 404 Page Not Found, 410 Removed, 429 Too Many Requests and more. A 404 'Page Not Found' is the most common, and often referred to as a broken link. View URLs that link to errors using the lower 'inlinks' tab and export them in bulk via 'Bulk Export > Response Codes > External > Client Error (4xx) inlinks'.","All links on a website should ideally resolve to 200 'OK' URLs. Errors such as 404 broken links should be updated so users are taken to the correct URL, or removed. A 403 forbidden error occurs when a web server denies access to the SEO Spider's request and can often be resolved by switching the user-agent to Chrome via 'Config > User-Agent'. If they can be viewed in a browser, then it's often not an issue.","" "Structured Data: Rich Result Validation Warnings","Opportunity","Low","1033","39.910","URLs that contain Google rich result feature validation warnings. These will always be for 'recommended properties', rather than required properties. Recommended properties can be included to add more information about content which could provide a better user experience, but they do not disqualify you from being eligible for rich snippets.","Consider warnings as 'opportunities' to improve the data provided to search engines and users to potentially enhance special search result features in Google. Review the 'Structured Data Details' tab for more information on specific warnings and refer to Google rich result feature docs where necessary. 
Export unique validation errors using 'Reports > Structured Data > Validation Errors & Warnings Summary'.","" "Security: Missing HSTS Header","Warning","Low","71","1.240","URLs that are missing the HSTS response header. The HTTP Strict-Transport-Security response header (HSTS) instructs browsers that it should only be accessed using HTTPS, rather than HTTP. If a website accepts a connection to HTTP, before being redirected to HTTPS, visitors will initially still communicate over HTTP. The HSTS header instructs the browser to never load over HTTP and to automatically convert all requests to HTTPS.","The HSTS header should be used across all pages to instruct the browser that it should always request pages via HTTPS, rather than HTTP.","" "Meta Description: Missing","Opportunity","Low","71","2.740","Pages which have a missing meta description, the content is empty or has a whitespace. This is a missed opportunity to communicate the benefits of your product or service and influence click through rates for important URLs.","It's important to write unique and descriptive meta descriptions on key pages to communicate the purpose of the page to users, and entice them to click on your result over the competition. It can also mean Google use this description for snippets in the search results for some queries, rather than make up their own based upon the content of the page.","" "Security: Unsafe Cross-Origin Links","Warning","Low","728","12.670","URLs that link to external websites using the target=""_blank"" attribute (to open in a new tab), without using rel=""noopener"" (or rel=""noreferrer"") at the same time. Using target=""_blank"" alone leaves those pages exposed to both security and performance issues for some legacy browsers, which are estimated to be below 5% of market share. 
Setting target=""_blank"" on elements implicitly provides the same rel behavior as setting rel=""noopener"" which does not set window.opener for most modern browsers, such as Chrome, Safari, Firefox and Edge. The external links that contain the target=""_blank"" attribute by itself can be viewed in the 'outlinks' tab and 'target' column. They can be exported alongside the pages they are linked from via 'Bulk Export > Security > Unsafe Cross-Origin Links'.","Consider the benefits of including the rel=""noopener"" link attribute on any links that contain the target=""_blank"" attribute to avoid security and performance issues for the users of legacy browsers that may visit the website.","" "Page Titles: Below 200 Pixels","Opportunity","Medium","136","5.260","Pages which have page titles much shorter than Google's estimated pixel length limit. This isn't necessarily an issue, but it does indicate there might be room to target additional keywords or communicate your USPs.","Consider updating the page title to take advantage of the space left to include additional target keywords or USPs.","" "URL: Parameters","Warning","Low","55","0.960","URLs that include parameters such as '?' or '&'. This isn't an issue for Google or other search engines to crawl unless at significant scale, but it's recommended to limit the number of parameters in a URL which can be complicated for users, and can be a sign of low value-add URLs.","Where possible use a static URL structure without parameters for key indexable URLs. However, changing URLs is a big decision, and often it's not worth changing them for SEO purposes alone. If URLs are changed, then appropriate 301 redirects must be implemented.","" "Meta Description: Duplicate","Opportunity","Low","79","3.050","Pages which have duplicate meta descriptions. It's really important to have distinct and unique meta descriptions that communicate the benefits and purpose of each page. 
If they are duplicate or irrelevant, then they will be ignored by search engines in their snippets.","Update duplicate meta descriptions as necessary, so important pages contain a unique and descriptive meta description for users and search engines. If these are duplicate pages, then fix the duplicated pages by linking to a single version, and redirect or use canonicals where appropriate.","" "Response Codes: External Server Error (5xx)","Warning","Low","7","0.090","External URLs where the server failed to fulfill an apparently valid request. This can include common responses such as 500 Internal Server Errors and 503 Service Unavailable. View URLs that link to errors using the lower 'inlinks' tab and export them in bulk via 'Bulk Export > Response Codes > External > Server Error (5xx) inlinks'.","All URLs should respond with a 200 'OK' status and this might indicate a server that struggles under load, or a misconfiguration that requires investigation. If they can be viewed in a browser, then it's often not an issue.","" "Security: Missing X-Frame-Options Header","Warning","Low","2700","46.980","URLs missing an X-Frame-Options response header with a 'DENY' or 'SAMEORIGIN' value. This instructs the browser not to render a page within a frame, iframe, embed or object. This helps avoid 'clickjacking' attacks, where your content is displayed on another web page that is controlled by an attacker.","To minimise security issues, the X-Frame-Options response header should be supplied with a 'DENY' or 'SAMEORIGIN' value.","" "URL: GA Tracking Parameters","Warning","Low","1","0.020","URLs that contain Google Analytics tracking parameters. In addition to creating duplicate pages that must be crawled, using tracking parameters on links internally can overwrite the original session data. utm= parameters strip the original source of traffic and start a new session with the specified attributes. 
_ga= and _gl= parameters are used for cross-domain linking and identify a specific user; including these on internal links prevents a unique user ID from being assigned.","Remove the tracking parameters from links. Event Tracking is recommended in place of utm parameters for tracking additional interactions on a page such as downloads, link clicks, form submissions, and video plays.","" "H2: Multiple","Warning","Low","214","8.270","Pages which have multiple
h2s. This is not an issue as HTML standards allow multiple h2s when used in a logical hierarchical heading structure. However, this filter can help you quickly scan to review if they are used appropriately.","Ensure h2s are used in a logical hierarchical heading structure, and update where appropriate utilising the full heading rank between (h3 - h6) for additional headings.","" "Canonicals: Non-Indexable Canonical","Issue","High","47","1.820","Pages with a canonical URL that is non-indexable. This will include canonicals which are blocked by robots.txt, no response, redirect (3XX), client error (4XX), server error (5XX), or are 'noindex' or 'canonicalised' themselves. This means the search engines are being instructed to consolidate indexing and link signals to a non-indexable page, which often leads to them ignoring the canonical, but may also lead to unpredictability in indexing and ranking. Export pages, their canonicals and status codes via 'Reports > Canonicals > Non-Indexable Canonicals'.","Ensure canonical URLs point to accurate, indexable pages to avoid them being ignored by search engines, and any potential indexing or ranking unpredictability.","" "Structured Data: Missing","Opportunity","Low","52","2.010","URLs that do not contain any structured data. This is a potential opportunity to provide explicit clues about the meaning of pages and enable special search result features and enhancements in Google. Structured data will only be discovered if JSON-LD, microdata and RDFa formats are selected for extraction via 'Config > Spider > Extraction'.","Review pages and consider appropriate types of structured data that provide the search engines with a better understanding of the page, and consult Google's search feature gallery (https://developers.google.com/search/docs/guides/search-gallery) for opportunities to enable special search result features and enhancements.","" "Links: Internal Nofollow Outlinks","Warning","Low","3","0.120","Pages that use rel=""nofollow"" on internal outlinks. Links with nofollow link attributes will generally not be followed by search engines. Remember that the linked pages may be found through other means, such as other followed links, or XML Sitemaps etc.
Nofollow outlinks can be seen in the 'Outlinks' tab with the 'All Link Types' filter set to 'Hyperlinks', where the 'Follow' column is 'False'. Export in bulk via 'Bulk Export > Links > Internal Nofollow Outlinks'.","Review the use of rel=""nofollow"" on internal links. These might be valid on links to URLs that ideally wouldn't be crawled, or they might be there by mistake. Remove the nofollow link attribute from links to important URLs you wish to be crawled, indexed and to receive PageRank.","" "Response Codes: External No Response","Warning","Low","41","0.560","External URLs with no response returned from the server. Usually due to a malformed URL, connection timeout, connection error, or connection refused. View URLs that link to no responses using the lower 'inlinks' tab and export them in bulk via 'Bulk Export > Response Codes > External > No Response inlinks'.","Malformed URLs should be updated to the correct location and other connection issues can often be resolved by using different user-agents ('Config > User-Agent'), adjusting the crawl speed ('Config > Speed') or disabling firewalls & proxies. If they can be viewed in a browser, then it's often not an issue.","" "Page Titles: Duplicate","Opportunity","Medium","66","2.550","Pages which have duplicate page titles. It's really important to have distinct and unique page titles for every page. If every page has the same page title, then it can make it more challenging for users and the search engines to understand one page from another.","Update duplicate page titles as necessary, so each page contains a unique and descriptive title for users and search engines. If these are duplicate pages, then fix the duplicated pages by linking to a single version, and redirect or use canonicals where appropriate.","" "Response Codes: Internal No Response","Issue","High","31","0.420","Internal URLs with no response returned from the server. Usually due to a malformed URL, connection timeout, connection error, or connection refused.
View URLs that link to no responses using the lower 'inlinks' tab and export them in bulk via 'Bulk Export > Response Codes > Internal > No Response inlinks'.","Malformed URLs should be updated to the correct location and other connection issues can often be resolved by using different user-agents ('Config > User-Agent'), adjusting the crawl speed ('Config > Speed') or disabling firewalls & proxies.","" "Meta Description: Over 985 Pixels","Opportunity","Low","750","28.980","Pages which have meta descriptions over Google's estimated pixel length limit for snippets. Google snippet length is actually based upon pixel limits, rather than a character length. The SEO Spider tries to match the latest pixel truncation points in the SERPs, but it is an approximation and Google adjusts them frequently.","Write concise meta descriptions to ensure important words are not truncated in the search results and hidden from users.","" "Security: Protocol-Relative Resource Links","Warning","Low","2533","44.080","URLs that load resources such as images, JavaScript and CSS using protocol-relative links. A protocol-relative link is simply a link to a URL without specifying the scheme (for example, //screamingfrog.co.uk). It saves developers from having to specify the protocol and lets the browser determine it based upon the current connection to the resource. However, this technique is now an anti-pattern with HTTPS everywhere, and can expose some sites to 'man in the middle' compromises and performance issues.","Update any resource links to be absolute links including the scheme (HTTPS) to avoid security and performance issues.","" "H2: Missing","Warning","Low","35","1.350","Pages which have a missing
h2, the content is empty, or has a whitespace. The h2 heading is often used to describe sections or topics within a document. They act as signposts for the user, and can help search engines understand the page.","Consider using logical and descriptive h2s on important pages that help the user and search engines better understand the page.","" "URL: Over 115 Characters","Opportunity","Low","36","0.630","URLs that are longer than the configured length. This is generally not an issue, however research has shown that users prefer shorter, concise URL strings.","Where possible, use logical and concise URLs for users and search engines. However, changing URLs is a big decision, and often it's not worth changing them for SEO purposes alone. If URLs are changed, then appropriate 301 redirects must be implemented.","" "URL: Internal Search","Warning","Low","1","0.020","URLs that might be part of the website's internal search function. Google and other search engines recommend blocking internal search pages from being crawled to limit the crawling and indexing of often duplicate, low-quality pages.","Most internal site searches are built for users, rather than search engines, who may needlessly crawl and index them. As best practice, the search sections of a site should not be linked to internally, and should be disallowed in robots.txt.",""
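Several of the checks in the rows above can be reproduced outside the SEO Spider. As a rough illustration of the 'Unsafe Cross-Origin Links' check, the sketch below scans an HTML document for anchor elements that set target="_blank" without a rel="noopener" (or rel="noreferrer") attribute, using only the Python standard library. This is not the SEO Spider's implementation; the class and function names and the sample markup are illustrative only.

```python
from html.parser import HTMLParser


class UnsafeTargetBlankFinder(HTMLParser):
    """Collects hrefs of <a> tags with target="_blank" but no
    rel="noopener" or rel="noreferrer" token."""

    def __init__(self):
        super().__init__()
        self.unsafe_links = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        if attrs.get("target") != "_blank":
            return
        # rel is a space-separated token list; it may also be absent (None).
        rel_tokens = (attrs.get("rel") or "").lower().split()
        if "noopener" not in rel_tokens and "noreferrer" not in rel_tokens:
            self.unsafe_links.append(attrs.get("href"))


def find_unsafe_cross_origin_links(html):
    """Return hrefs of links that would be flagged by this check."""
    parser = UnsafeTargetBlankFinder()
    parser.feed(html)
    return parser.unsafe_links
```

For example, `find_unsafe_cross_origin_links('<a href="https://a.example" target="_blank">x</a>')` flags the link, while the same markup with `rel="noopener"` added does not.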
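The heading checks ('H1: Multiple', 'H2: Missing') can be sketched the same way: count h1-h6 occurrences and flag pages with more than one h1 or no h2. This simplified version only checks presence, not the empty/whitespace-content case the report also covers, and the function names are illustrative.

```python
from collections import Counter
from html.parser import HTMLParser

HEADING_TAGS = {"h1", "h2", "h3", "h4", "h5", "h6"}


class HeadingCounter(HTMLParser):
    """Counts occurrences of each heading tag (h1-h6) in a document."""

    def __init__(self):
        super().__init__()
        self.counts = Counter()

    def handle_starttag(self, tag, attrs):
        if tag in HEADING_TAGS:
            self.counts[tag] += 1


def heading_issues(html):
    """Flag two of the heading checks described above for a single page."""
    parser = HeadingCounter()
    parser.feed(html)
    issues = []
    if parser.counts["h1"] > 1:
        issues.append("H1: Multiple")
    if parser.counts["h2"] == 0:
        issues.append("H2: Missing")
    return issues
```

A page with two h1s and one h2 would be flagged only as "H1: Multiple"; a page with a single h1 and no h2 only as "H2: Missing".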
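Finally, the 'URL: GA Tracking Parameters' check amounts to inspecting the query string for Google Analytics parameters. The sketch below uses urllib.parse from the standard library; the exact set of prefixes the SEO Spider matches is an assumption on my part (utm_* campaign parameters plus the _ga/_gl cross-domain linker parameters mentioned in the row).

```python
from urllib.parse import parse_qsl, urlsplit

# Assumed prefixes for this check: utm_* campaign parameters and the
# _ga/_gl cross-domain linker parameters described in the report row.
TRACKING_PREFIXES = ("utm_", "_ga", "_gl")


def ga_tracking_params(url):
    """Return the names of GA tracking parameters present in a URL's query string."""
    query = urlsplit(url).query
    return [name for name, _ in parse_qsl(query, keep_blank_values=True)
            if name.startswith(TRACKING_PREFIXES)]
```

For example, `ga_tracking_params("https://example.com/?utm_source=x&id=1")` returns `["utm_source"]`, so any URL with a non-empty result would be flagged by this check.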