Search Engine Optimisation

Why every website needs search engine marketing

In this day and age, both existing and brand-new websites are expected to pay far more attention to every detail of online marketing. One key aspect of making sure your website gets noticed is a search engine marketing campaign.

Search Engine Optimization (SEO) is a strategy for optimizing your website's content and structure, along with building external links to your most important pages. By focusing on relevant keywords and creating pages in a search-engine-friendly way, you can greatly increase the natural (organic) traffic coming from search engines like Google, Yahoo and Bing.

SEO Healthcheck

First things first: to establish a baseline, it's important to do a basic review of your website.

An SEO Healthcheck is not only a review of your website but also a report of the issues found, along with optimization steps for your SEO team to follow.

If you don’t have a technical resource to apply all the recommendations, I’ll happily work with you to make a plan and agree on a delivery timeline. Naturally, you will get a discount thanks to the SEO Healthcheck.

On-site Optimization

If you’re serious about online marketing and promoting your content through search engines, you’ll want all your pages optimized as well as possible. While a content promotion campaign usually involves a complex, multi-part approach, on-site SEO improvements are one of the first and most necessary steps.

Keyword seeding

In addition to the SEO Healthcheck, I offer keyword seeding: this is effectively altering your existing copy to include specific keywords that may help you attract more relevant search engine traffic.

Google Analytics

Google Analytics is a very powerful platform for monitoring your website’s popularity.

Among many other things, Google Analytics gives you insight into the number of daily visitors to your website and the ways they arrived at your specific pages. The most obvious ways to use Google Analytics include reviewing the most popular pages and analysing the search terms that visitors used to reach your content. Such knowledge helps you decide whether it’s time to update content or add more pages that target the keywords you want.

Standalone Google Analytics Setup

If all you want is to create a Google Analytics account and have your websites monitored, the procedure is quite easy and I’ll guide you through every step.
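
To give you a sense of what the setup involves: monitoring usually boils down to placing Google’s tracking snippet on every page of your website. The snippet below is only a rough illustration; the exact code and the measurement ID (shown here as a placeholder) are generated inside your own Analytics account.

    <!-- Google Analytics tracking snippet (illustrative only; use the one from your own account) -->
    <script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
    <script>
      window.dataLayer = window.dataLayer || [];
      function gtag(){dataLayer.push(arguments);}
      gtag('js', new Date());
      gtag('config', 'G-XXXXXXXXXX'); /* placeholder measurement ID */
    </script>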

Once setup is complete, you’ll retain full and sole access to your Google Analytics account, and therefore to all the insights about your websites.

Should you later decide to get my help with optimizing pages, I can guide you through a shared Analytics setup, whereby you’ll give me access to only those sections of Google Analytics that you believe are relevant to the SEO project we’ll be working on.

Shared Google Analytics Setup

Should you decide it would help more to keep me engaged in your SEO campaign for a few months, I’ll help you set up Google Analytics in a shared way.

There are two possible arrangements: either I retain full visibility into your website’s analytics data and provide you with similar access under your own credentials (login details), or you share only specific aspects of the analytics with me while keeping ownership of, and access to, the full set of analytics metrics available.

Google Webmaster Tools

Google Webmaster Tools is the number-one stop for your website’s technical maintainer if you plan on maximising the benefit you get from Google’s search results.

Webmaster Tools is particularly useful for getting new websites indexed: although it does not guarantee speedy insertion into the index, it lets you submit a sitemap file for your website, essentially a structured description of every page along with an indication of how important each page is.
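
To give you an idea, here is roughly what a small sitemap might look like. This example follows the sitemaps.org format, with example.com standing in for your own domain and made-up page addresses; the priority value is the “importance” hint mentioned above.

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>https://www.example.com/services/seo/</loc>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>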

I’ll help you get Webmaster Tools set up and your sitemap created and submitted. Bear in mind that although it’s quite common for blogging software and various Content Management Systems (CMS) to build sitemap files for you automatically, sometimes that is not the case. I will still be able to assist and get your sitemap created, but the process will be a tad more tedious.

Site Indexation with Robots.txt

Search engines use special automated programs called web spiders (or web robots) to browse through your website, index all of its content and make it searchable. The process of going through your website’s pages and identifying indexable information is called crawling; the crawled pages are then added to the search engine’s index. Web robots crawl through your website quite regularly, some many times a day.

Although most search engine robots are fairly smart these days, they do need guidance from time to time, and a special file called robots.txt exists for exactly that purpose.

By updating robots.txt, it is possible to block certain parts of your website from being crawled and indexed (for example, if you want to keep some data out of public search results or to avoid duplicate content on your website). It is also possible to make certain robots ignore your website completely (essentially blocking them off) or simply to slow down the most aggressive robots (some can crawl your website at a rate of hundreds of pages a minute).
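
To illustrate, a simple robots.txt could look something like the example below. The robot names and paths are made up for the example, and note that the Crawl-delay directive is honoured by some crawlers (such as Bing) but ignored by others, including Google.

    # Rules for all crawlers
    User-agent: *
    Disallow: /private/      # keep this area out of search results
    Disallow: /search/       # avoid duplicate content from internal search pages

    # Slow down a particularly aggressive crawler (hypothetical name)
    User-agent: AggressiveBot
    Crawl-delay: 10

    # Block another robot from the site entirely
    User-agent: BadBot
    Disallow: /

    # Tell crawlers where the sitemap lives
    Sitemap: https://www.example.com/sitemap.xml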