Anyone who has built a website, or had one built for them, wants to drive as much ‘free’ traffic from search engines as possible. Improving a site’s ranking with search engines such as Google and Bing can make the clicks from the organic search results very lucrative, but it is not easy to gain traction without plenty of work up front. The starting point should always be to check whether the website is search engine friendly. In technical terms: can the search engines crawl, read and index the resources on your website? Here, resources include text, images and videos.
How can you ensure that your website can be crawled?
Step one is to check whether your website is already indexed. You can do this by running a Google search with the following search operator: site:yourdomain.com. If the results include pages from your website then the site has been indexed, although this doesn’t mean every page has been.
If your website hasn’t been indexed, the search will return no results.
Implementing a robots.txt file is not a requirement for indexing, but the information in the file tells search engines which pages (and other resources) can be crawled. A basic set of directives looks like this:

User-agent: [the crawler the rules apply to]
Disallow: [URL path not to be crawled]
Sitemap: [your XML sitemap URL]
You can also use the robots.txt file to tell search engine crawlers which pages not to crawl. Be careful not to block important/key pages, which can happen if the rules are set up incorrectly.
The robots.txt file should always be placed in the top-level directory, so the URL will be along the lines of yourdomain.com/robots.txt.
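Putting those directives together, a minimal robots.txt might look like the following sketch. The blocked paths and sitemap URL here are placeholders for illustration, not rules every site needs:

```txt
# Rules apply to all crawlers
User-agent: *

# Example paths kept out of the crawl (placeholders)
Disallow: /admin/
Disallow: /internal-search/

# Point crawlers at the XML sitemap
Sitemap: https://yourdomain.com/sitemap.xml
```

Note that `Disallow:` with no path means nothing is blocked, so a site with no pages to hide can simply list the User-agent, an empty Disallow and the Sitemap line.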
Implementing an XML sitemap is like giving the search engine crawlers an A to Z of your website, pointing out the key pages. The XML file should only include pages that return a 200 status code, meaning the page neither redirects to another page nor returns a broken page (more on this below).
There are a number of online tools that create XML sitemaps including XML-Sitemap.com.
Once the XML file has been created it ideally should be added to the top-level directory, to create the following URL mydomain.com/sitemap.xml. This URL should be included in the robots.txt file and submitted to Google’s Search Console and Bing Webmaster Tools. We will discuss Google Search Console and Bing Webmaster Tools in more detail later.
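As a sketch, a minimal sitemap.xml covering two pages could look like this, following the standard sitemap protocol (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Each <url> entry should be a page returning a 200 status -->
    <loc>https://mydomain.com/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
  <url>
    <loc>https://mydomain.com/contact/</loc>
    <lastmod>2020-01-01</lastmod>
  </url>
</urlset>
```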
Broken web pages/images
One issue that can affect how search engine friendly your website is, is the number of broken pages or images. If a search engine’s crawler hits a broken link it cannot navigate any further along that path, and could miss key pages that should be crawled and indexed.
Broken web pages and images are also bad for UX (User Experience), resulting in lost conversions and revenue.
To check your website for broken links, try out our site audit tool below.
Meta tags are used on every page of a website to indicate to both users and search engines what the main topic of the page is. Not all of the tags are visible on the page; however, some are included in the organic listings shown on the search engine results pages (SERPs), which is a key factor in which result users decide to click on.
Meta tags include the following:
Page Title Tag
This tag is key to which keywords the search engines rank a particular page for. Users can see this tag in the browser tab; the screenshot below is an example from RDM’s website.
The page title tag also appears as the headline in the organic results, which, as already mentioned, affects the number of clicks from the results page.
When creating this tag, the focus keyword or topic should be included, ideally as close to the start of the title as possible. There should be no duplicate page titles across your website, as duplicates confuse the search engines as to which page should rank for a given topic/keyword. The character limit for this tag is around 60; longer titles are usually truncated in the results.
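As an illustration, the title tag sits in the page’s head section; the page name and site name below are placeholders:

```html
<head>
  <!-- Focus keyword near the start, unique per page, roughly 60 characters -->
  <title>Technical SEO Basics | Your Site Name</title>
</head>
```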
Page Description Tag
Although the page description is not a ranking factor, the content placed in this tag makes up the body of the organic listing on the SERPs. A relevant and enticing page description will help increase clicks from the results, driving more free traffic.
Meta Robots Attribute
This tag is used to instruct the search engine crawler how to crawl and index a particular page. There are a number of parameters that can be used within this tag, but the main examples are ‘Noindex’ and ‘Index’. Using the ‘Index’ parameter (<meta name="robots" content="index, follow">) indicates to the crawler that your page should be indexed, and any links followed (passing ‘link juice’). If the parameter is set to ‘noindex’ then the page will be ignored by the search engines.
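For example, to keep a page out of the index while still letting the crawler follow its links (an internal search results page is one common, though here purely illustrative, candidate), the tag goes in the page’s head:

```html
<head>
  <!-- Page will not be indexed, but links on it will still be followed -->
  <meta name="robots" content="noindex, follow">
</head>
```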
Image Alt tags
The alt tag, also known as the ‘alt attribute’, is applied to images on a page to describe the content of the image, providing a text alternative for search engines. The alt tag is added within the HTML of the page and is not normally visible to users, although browsers display it if the image fails to load. As this tag is a ranking factor, you should ensure that its content is relevant to the topic/focus keyword of the page.
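As a sketch, with a placeholder filename and description:

```html
<!-- Descriptive alt text, relevant to the page's focus keyword -->
<img src="blue-widget.jpg" alt="Blue widget with chrome finish">
```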
Header tags are used on a page to mark the main heading (h1) and sub-headings (h2 to h6) within the content. The h1 tag should be as close to the top of the page as possible, with the sub-headings cascading down the content: h1 > h2 > h3. There should only be one h1 tag, but you can use multiple h2 to h6 tags as long as they are kept in order.
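The hierarchy described above could look like this in a page’s HTML (the heading texts are placeholders):

```html
<h1>Technical SEO Basics</h1>             <!-- one h1 per page, near the top -->
<h2>Robots.txt</h2>                        <!-- sub-topics as h2 -->
<h3>Blocking pages from crawlers</h3>      <!-- details nested in order as h3 -->
<h2>XML Sitemaps</h2>                      <!-- next sub-topic returns to h2 -->
```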
Google Search Console and Bing Webmaster Tools
Implementing both Google Search Console and Bing Webmaster Tools gives you the opportunity to submit the XML sitemap directly to the search engines. Both tools flag up issues with the XML sitemap and the pages on your website. The screenshot below shows that the RDM site has 78 valid URLs, with no errors.
We will write a guide on using Search Console and Bing Webmaster Tools in the near future.
Get a Free Audit of Your Website