Having recently launched a new website, you can be forgiven for sitting back and taking a well-deserved break. All your hard work creating unique content, gathering relevant imagery, and liaising with your web designer has paid off, and now you have a great-looking website that showcases your business to the world.
Or does it?
Getting a new website found in search results can prove a little tricky. That’s why we’ve created this article to help ensure that you and your website get off to the best start. Below are the four most important steps to take after your website is launched; they give you the best chance of being indexed quickly and correctly.
1 – Google Analytics
By far the most important tool for website management in 2018.
Google Analytics is a powerful website tracking tool with a range of uses, from determining how many visitors your website gets to figuring out which page they most frequently leave your website from. It is a free tool from Google which you can use simply by signing up at analytics.google.com and providing your website developer with the relevant tracking information.
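For illustration, the standard gtag.js tracking snippet that Google Analytics generates looks something like the sketch below. The UA-XXXXXXX-X tracking ID is a placeholder; Google issues a unique ID for your property when you sign up.

```html
<!-- Global Site Tag (gtag.js) - Google Analytics -->
<!-- Paste into the <head> of every page; replace UA-XXXXXXX-X with your own tracking ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=UA-XXXXXXX-X"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'UA-XXXXXXX-X');
</script>
```

Once this snippet is in place on every page, visitor data starts flowing into your Google Analytics reports, usually within 24 hours.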
2 – Search Console
Formerly known as Google Webmaster Tools,
Search Console is a vital tool that can help you see exactly what information Google knows about your website. It can also help new websites get indexed (found in Google search), as well as highlighting any potential issues affecting the website. Like Google Analytics, this free tool can be used simply by signing up at search.google.com/search-console and having your website developer verify your website with a meta tag.
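As a rough guide, the verification meta tag Search Console gives you follows the format below; the content value shown here is a placeholder for the unique token generated for your site.

```html
<!-- Google Search Console verification tag - placed inside the <head> of your home page -->
<!-- The content value is a placeholder; use the token Search Console generates for you -->
<meta name="google-site-verification" content="your-unique-verification-token" />
```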
3 – Sitemap.xml
A sitemap.xml is like a roadmap for search engines like Google. Essentially, when a search engine bot visits your website, your sitemap helps direct it to pages across the website so it can find the information on them. To see if your website has a sitemap.xml, simply put /sitemap.xml after your URL in the address bar and press enter. If your website returns something like the example below, then you are ready to submit it to Google through Search Console.
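A minimal sitemap.xml following the sitemaps.org protocol looks like this sketch; the example.com URLs are placeholders for your own pages.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to crawl -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2018-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2018-05-01</lastmod>
  </url>
</urlset>
```

Most modern content management systems can generate a file like this for you automatically, so check with your website developer before building one by hand.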
4 – Robots.txt
If a sitemap.xml is a roadmap, then a robots.txt is a stop sign that tells search engine bots where not to visit. It is a very simple and useful website addition that helps stop search engine bots from getting lost in sections of your website that you don’t want to be indexed (your administration area, for example). Checking if your website has a robots.txt is the same as checking for your sitemap.xml: simply put /robots.txt at the end of your website URL in your address bar, and if it looks something like the example below, then you’re ready to submit it through Google Search Console.
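For reference, a typical robots.txt looks like the sketch below; the /admin/ path is a placeholder for whichever sections of your site you want to keep bots out of.

```
# Apply the rules below to all search engine bots
User-agent: *
# Keep bots out of the administration area (placeholder path)
Disallow: /admin/
# Point bots at your sitemap so they can find every page you do want indexed
Sitemap: https://www.example.com/sitemap.xml
```

Note that Disallow lines only stop well-behaved bots from crawling those sections; they are not a security measure, so sensitive areas should still be protected with a login.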