SEO (Search Engine Optimization) can be a little confusing when you hear the term for the first time. Oftentimes, someone may come into your office and tell you that your site needs SEO, or you might receive a cold email promising to “SEO your website” for $99.00 (which I wouldn’t recommend signing up for). SEO is the process of optimizing your website to make it more user-friendly for visitors and to help it rank higher in the search engines. We want to optimize our websites so that we get more traffic and, as a result, more conversions. So, how do we do this? Let’s go over the basics of search engine optimization and how we can implement these changes on our site.

Keywords

The way keywords are used in SEO has changed over the years. Keywords used to be the primary metric by which site performance was judged (how many keywords did you rank for, and how high were those rankings?). You’ll still see people in the industry using this metric. However, keywords mean nothing without an optimized site that can convert a visitor into a customer. In addition, keyword targeting has shifted toward long-tail phrases (see this post on keyword research).

Keywords are how people will find your site. You should be analyzing how people search for your site, niche, or industry, and then creating content that answers those questions. By doing this, you’ll be more likely to keep a visitor on your site.

On-Page SEO

On-page SEO refers to optimizing title tags, meta descriptions, images, header tags, and robots directives. Title tags should be under 72 characters, describe the topic of the page, and include your brand name once. Meta descriptions can be up to 320 characters in length and should describe what someone can expect to find if they click on your link. Images should have descriptive, keyword-relevant alt text (which also helps screen readers), custom file names, and appropriate dimensions and file sizes.
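
As a rough sketch, those on-page elements live in the HTML of the page. The title, description, and image file below are made-up placeholders, not a prescription for your site:

    <head>
      <!-- Title tag: under ~72 characters, topic first, brand name once -->
      <title>Beginner's Guide to SEO | Example Brand</title>
      <!-- Meta description: a concise summary of what the visitor will find -->
      <meta name="description" content="Learn the basics of SEO: keywords, on-page optimization, sitemaps, robots.txt, site speed, and the tools to measure it all.">
    </head>
    <body>
      <!-- Image: custom file name, descriptive alt text, sized for where it's displayed -->
      <img src="/images/seo-basics-checklist.jpg" alt="Checklist of on-page SEO elements" width="800" height="450">
    </body>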

Header tags divide your page into sections, and each section gives the search engines a clue about the structure of the page. These header tags should use your targeted keywords for the topic of the post or page.

Robots directives include canonical, nofollow, and noindex (to name a few). A canonical tag is a best practice because it tells the search engines which version of a page is the original. This is particularly helpful when you have pages that display results in a sorted or filtered fashion: you don’t need every single variation of a page indexed, because each one is low value and doesn’t really serve the end user. A canonical is also helpful with plagiarized content. If someone copies all of the code off your site (links and all) and pastes it on their site, the canonical will still say that your site is the original version of the post, and you will still get the credit for the content. A nofollow attribute tells the search engines not to follow a particular link on your website; it’s common to nofollow external links because you want the search engines to stay on your site. A noindex tag is used for pages you don’t want the search engines to index, such as a login page.
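
To make those directives concrete, here is a hedged illustration of how they commonly appear in HTML. The URLs and headings are placeholders, and in practice these tags would be spread across different pages rather than all appearing on one:

    <head>
      <!-- Canonical: points the search engines to the original version of this page -->
      <link rel="canonical" href="http://www.yoursite.com/blog/seo-basics/">
      <!-- Noindex: keeps a page (a login page, for example) out of the index -->
      <meta name="robots" content="noindex">
    </head>
    <body>
      <!-- Header tags: one H1 for the topic, H2s for the sections beneath it -->
      <h1>SEO Basics for Beginners</h1>
      <h2>On-Page SEO</h2>
      <!-- Nofollow: tells the search engines not to follow this external link -->
      <a href="https://external-example.com/resource/" rel="nofollow">An external resource</a>
    </body>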

Off-Page SEO

The two most common types of off-page SEO are guest posting and acquiring backlinks (the latter is often part of guest posting). Guest posting involves contacting other websites and blogs in your niche and offering to write a free post for them (usually with a backlink to your site). There is a two-fold reason for this approach: the first is to get exposure to a different community, and the second is the backlink. Sometimes the link offered for a guest blog is a nofollow link (meaning it doesn’t pass any “link juice”), so you’ll need to weigh for yourself whether the publicity or the link is more important.

Backlinks usually come from some sort of targeted outreach to individuals who you think would be interested in your content. The structure of the pitch varies from person to person, but every pitch acknowledges what the other person is doing, explains what you’re doing, and describes how you think your resource would benefit their community (which is why you’re asking for the link). In any link building you do, just remember: links should always, first and foremost, help visitors find related content and visit your site.

Some of the links pointing to your site may be spammy in nature (with hundreds or thousands coming from the same source, keyword-stuffed anchor text, or links from blacklisted domains). These links may come to light during a backlink review of your site. All websites have some level of spammy links pointing to them, so don’t assume you’re about to be penalized. Unless spammy links make up the majority of your link portfolio, work to “replace” them by acquiring better links. Then, once you’ve acquired the better link, you can disavow the old one.

Sitemaps

There are two primary types of sitemaps, both of which are important for SEO. The first, and most important, is the XML sitemap. An XML sitemap lists the pages on your website and indicates each URL, when it was last updated, and its relative importance. It is used primarily by the search engines and acts as a guide to the pages you want indexed and findable from the search engine results page. The HTML sitemap is primarily for users who are having difficulty finding content on your website: it shows the structure of the site and where content can be found. It also helps search engines that are trying to find pages on your site and understand the site structure.
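
Here is a minimal sketch of what an XML sitemap looks like, following the sitemaps.org protocol; the URLs, dates, and priority values are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.yoursite.com/</loc>
        <lastmod>2018-06-01</lastmod>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.yoursite.com/blog/seo-basics/</loc>
        <lastmod>2018-05-15</lastmod>
        <priority>0.8</priority>
      </url>
    </urlset>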

Robots.txt

The robots.txt file is one of the most important aspects of a website. It is a plain text file that is read mainly by the search engines rather than by human visitors (unless the visitor is your SEO consultant or web developer). You can find it on any website by appending /robots.txt to the root URL (http://www.yoursite.com/robots.txt). The file contains instructions telling the search engines which folders or individual files on the site they should not crawl, and which crawlers are allowed to visit the site. You should also include the URL of your XML sitemap, so the crawlers can more easily find and visit the pages on your site.
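
A hedged example of what a simple robots.txt file might contain (the folder paths and the bot name are placeholders, not a recommendation for your site):

    # Rules for all crawlers
    User-agent: *
    Disallow: /wp-admin/
    Disallow: /sorted-results/

    # Block one specific crawler from the whole site
    User-agent: BadBot
    Disallow: /

    # Point the crawlers at the XML sitemap
    Sitemap: http://www.yoursite.com/sitemap.xml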

Site Speed

As web traffic continues to grow from mobile devices, having a fast website is more important than ever. Your site should load and be usable within 2-3 seconds, if at all possible. Many factors go into site speed, including time to first byte, image size, minification of JavaScript, HTML, and CSS, server location, browser caching, redirects, and render-blocking JavaScript.

Time to first byte is the time from when the site is requested until the server responds and sends the first byte of information. Images are often uploaded to a site at a larger size than they will actually be displayed. That extra data, which doesn’t improve the visible quality of the image, slows the site down; by compressing the image, you send less information, which means the page loads faster. Minifying JavaScript, HTML, and CSS removes unnecessary characters from the code so it can be sent and executed faster, without changing the job the code does for your site.

Server location can impact your site speed if your primary service area is in a different country than your server. Browser caching matters because, rather than requesting certain assets over and over again, the browser can store them on the user’s device. A redirect occurs when you move a resource from one location to another. If a resource has been moved several times and each move adds another redirect instead of updating the original link, the resulting chain of redirects will also slow your site down. Render-blocking JavaScript refers to scripts that must be downloaded and executed before the page can render.
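
As one hedged example of dealing with render-blocking JavaScript, non-critical scripts can often be loaded with the defer (or async) attribute so the HTML can render first; the file name below is a placeholder:

    <!-- Downloaded in parallel, but executed only after the HTML has been parsed -->
    <script src="/js/non-critical-widgets.js" defer></script>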

 

Schema

Schema is structured data markup that takes the content on your page and breaks it down into a format that is easily understood by the search engines. While there is no guarantee that the search engines will use your markup and reward your site with richer search results (like stars, ratings, and images), it is still helpful to set up. There are many different types of schema, including:

  • Organization
  • Local Business
  • Article
  • Review
  • Video
  • Blogpost

The easiest way to add schema markup to a website is to write it in JSON-LD and execute the code through Google Tag Manager. A full list of the different types of schema markup available can be found at Schema.org. Some helpful user guides, including implementation information and examples, can be found on Google’s developer website.
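
As a minimal sketch, an Article marked up in JSON-LD might look like the following; the headline, author name, date, and image URL are placeholders, and Schema.org and Google’s guides list the required and recommended properties for each type:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "SEO Basics for Beginners",
      "author": {
        "@type": "Person",
        "name": "Your Name"
      },
      "datePublished": "2018-06-01",
      "image": "http://www.yoursite.com/images/seo-basics.jpg"
    }
    </script>

If you go the Google Tag Manager route mentioned above, this block is typically placed in a Custom HTML tag.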

404 Errors

A 404 error on a website occurs when a web page is deleted from a site without being redirected (that is, without the old URL pointing to a new page), when a URL is typed incorrectly into the code, or when a user enters a wrong URL. While 404 pages won’t hurt the health of your site or make the search engines look unfavorably on it, they do create a negative experience for visitors who are looking for content that no longer exists. There are two ways to check your site for 404 errors: the first is to check Google Search Console for 404 pages under the crawl errors section; the second is to crawl your site with one of the crawling tools mentioned below and find your 404 pages.

To resolve your 404 error pages, you’ll need to map the old URL to a new URL. You’ll want to redirect it to the nearest, most relevant page. Don’t just do a blanket redirect to your homepage. Once you know where the old URL needs to go on your current site, you’ll need to set up the redirects either through the server or through a plugin.
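
For example, if your site runs on Apache with mod_alias enabled, a 301 redirect added to the .htaccess file might look like the sketch below (the paths are placeholders); redirect plugins accomplish the same thing from your CMS dashboard:

    # Permanently redirect the deleted page to its closest relevant replacement
    Redirect 301 /old-deleted-page/ http://www.yoursite.com/new-relevant-page/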

Tools and Analytics

There are several tools that should be set up and configured to give you information about how the search engines crawl your site, as well as how human visitors use it. These tools are Google Analytics, Bing Webmaster Tools, and Google Search Console (formerly known as Google Webmaster Tools). Let’s look at each of these and the benefits they provide.

Google Analytics

Google Analytics is one of the most accurate ways of learning more about the visitors to your website. Google Analytics (GA), once added to a site, will start collecting data right away; however, it can’t show any historical data from before it was installed. GA will show you which pages are getting traffic, where those visits are coming from (organic, social, paid, email, direct, etc.), the ratio of new to returning visitors, exit rate, conversions, behavior and demographic data, and much more! Google Analytics allows you to analyze what is and isn’t working on your website.
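
Adding GA comes down to placing the tracking snippet Google gives you on every page. The exact code depends on which version of the tag you set up; a gtag.js-style snippet looks roughly like this, with G-XXXXXXX standing in for your own ID:

    <!-- Global site tag (gtag.js) - Google Analytics -->
    <script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXX"></script>
    <script>
      window.dataLayer = window.dataLayer || [];
      function gtag(){dataLayer.push(arguments);}
      gtag('js', new Date());
      gtag('config', 'G-XXXXXXX');  // replace with your own ID
    </script>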

 

Google Search Console

Google Search Console gives you insights into what Google finds as it crawls your website as well as website performance. You can view backlinks pointing to your site, anchor text of those links, any structured data/schema present on the site, international targeting, search analytics (what queries and pages have appeared in the Google Search results), and crawl/indexation information. It’s a quick and easy process to set up and the information it provides you is invaluable.

Bing Webmaster Tools

Bing Webmaster Tools gives you information about how the Bing search spiders interact with your website. Bing will give you many of the same metrics that Google Search Console does. The benefit of running Bing Webmaster Tools in conjunction with Google Search Console is that Bing will sometimes pick up on a pattern that Search Console misses. Additionally, rather than having information about your Bing traffic reported through Google, you can get it straight from the source.

Other Tools

There are many other tools that will be helpful in doing SEO for your site:

  • Broken Link Checker is a website that can be used to find broken links on your site; it will tell you the URL of the bad link and show where it appears in the source code.
  • Screaming Frog is a crawler that mimics the search engine spiders. It will also tell you information about title tags, meta-description tags, H-tags, redirects, 404 pages, and much more. The free version will crawl up to 500 pages on your site.
  • Website Auditor is another crawler that you should have in your toolbox. Website Auditor analyzes your website and finds errors, keyword information, content optimization recommendations, and helps to improve your onsite SEO.
  • Semrush is a keyword research tool that is helpful when you’re trying to understand how people are searching for your website, how to optimize your content, and what your competitors are doing.

What types of tactics and tools do you find helpful? Let me know in the comments below. Did you enjoy this article? Subscribe to my newsletter to be notified when I publish new content.

 
