The goal of SEO is to increase your search visibility, which in turn will increase your site traffic.
Google (and other search engines) ranks sites based on a combination of two broad categories: relevance and authority.
Relevance is how closely the content of a page is going to meet a user’s needs and expectations (known as matching search intent).
Authority is a measure of how trustworthy or authoritative the source of the content is.
Your search engine optimization (SEO) tactics will usually involve building your authority, increasing your relevance for targeted queries, or both, across three main areas of optimization:
Don’t worry. I’m going to make this as painless as possible.
In this section, I’m going to cover most of the technical SEO elements that you’ll need to consider for your campaign.
These SEO resources are presented not only to increase your SEO knowledge, but also to provide detailed, actionable steps so you can make appropriate changes to your site, understand the search engine ranking factors you’ll need to consider or monitor, and shore up any technical issues that could come up during your SEO efforts.
I’m going to cover these SEO basics as simply and as thoroughly as possible—so you can understand them and use them, no matter how much technical experience you have.
When you go to a library for information, a librarian can probably help you find a book.
But no matter how relevant a book may be to your interests, it won’t matter if the book isn’t currently on the shelves.
Libraries must acquire books as they’re released, updating old copies and adding new copies, to keep the most recent information on the shelves.
To provide quality organic search results, Google (and other search engines) work to maintain shelves of “books,” in this case, a running archive of websites and web pages that are available on the web.
Google uses automated search engine bots, sometimes known as “crawlers” or “spiders” to continually search the web for new web page entries, which it then logs in its central system.
How is this relevant for you?
If you want to be listed in search engines, and be listed accurately, you need to make sure your site is indexed correctly.
There are three main approaches you can take for search indexing:
You’ll also need to consider creating a robots.txt file for your site, which is essentially an instruction manual that tells web crawlers what they may look at on your site. You can create this file using Notepad, or any program on your computer that can save plain .txt files, even if you have no coding experience.
On the first line, you’ll specify an agent by typing “User-agent: ____”, filling in the blank with a bot name (like “Googlebot”) or using an * symbol to target all bots. Then, on each successive line, you can type “Allow:” or “Disallow:” followed by specific URL paths to instruct bots which pages they should or should not crawl. There are various reasons why you wouldn’t want a bot to crawl a page on your site, which I’ll get into later. However, if you’re happy for bots to crawl all pages of your site by default, you don’t need a robots.txt file at all.
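To make this concrete, here’s a minimal robots.txt sketch (the /private/ path is purely illustrative) that lets every bot crawl the site except one directory:

```
User-agent: *
Disallow: /private/
```

Each Allow/Disallow rule applies to the bot named in the most recent User-agent line; a Disallow line with an empty value permits everything.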
If and when your robots.txt file is ready, you can upload it to your site’s root directory like any other file.
Improving website speed for SEO is an oft-repeated topic in digital marketing, though its importance has been somewhat overblown.
While page speed is a staple of SEO basics and SEO 101 (and is now a confirmed ranking factor), the loading time of your web pages won’t make or break your rankings in search engine results pages (SERPs); shaving a second off your load time won’t magically boost a low-authority site to the top of the search results.
However, site speed is still an important consideration, both for your domain authority and for the user experience of your site. Google (and other search engines) rewards sites that serve content faster, since speed is conducive to a better overall user experience, but it only penalizes about one percent of sites for being insufficiently fast. On the user experience side, every one-second improvement in site speed has been correlated with a roughly two percent increase in conversions.
In short, whether you’re after higher rankings in search results or more conversions, it’s a good idea to improve your site speed. Search engines do punish slow page speed, but in practice that punishment only comes in the most egregious cases.
You can check your speed using a site like Google’s own Page Speed Insights, and start improving your site with the following strategies:
Mobile optimization is a broad category that includes both technical and non-technical elements. Mobile searches now outnumber desktop searches, so search engines have taken extensive efforts in recent years to reward sites that optimize for mobile devices and penalize sites that don’t.
Put simply, if your site is “friendly” to mobile devices, capable of loading and presenting content in a way that works well for mobile users, you’re going to see an increase in authority and rankings in search results. Incidentally, you’ll also become more appealing to your target demographics, possibly increasing customer loyalty and/or conversions.
So what is it that makes sites “optimized” for mobile devices? There are a few main criteria:
If all this sounds complex to you, don’t worry. There are some simple ways to test your site to see if it’s counted as “mobile friendly,” and simple fixes you can make if it’s not. The easiest way to make your site mobile friendly is to make your site responsive; this means that your site will detect what device is attempting to view it, and automatically adjust based on those parameters.
This way, you can keep managing only one site, and have it work for both mobile and desktop devices simultaneously. You can also create a separate mobile version of your site, but this isn’t recommended, especially now that Google is beginning to switch to mobile-first indexing.
How can you make your site responsive? The easiest way is to use a website builder and choose a responsive template. Most mainstream website builders these days have responsive templates by default, so you’ll be hard-pressed to find one that doesn’t offer what you need.
If you’re building a site from scratch, you’ll need to work with your designers and developers to ensure they’re using responsive criteria.
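As a rough sketch of what “responsive criteria” look like in practice (the class name here is hypothetical), responsiveness mostly comes down to a viewport meta tag plus CSS media queries that adapt the layout to the screen width:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  .columns { display: flex; }            /* side-by-side columns on desktop */
  @media (max-width: 600px) {
    .columns { flex-direction: column; } /* stacked on narrow mobile screens */
  }
</style>
```

The media query means the same page serves every device, which is exactly why responsive design beats maintaining a separate mobile site.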
As long as your site is responsive, you should be in good shape. If you’re in doubt, you can use Google’s mobile-friendly tool to evaluate your domain and see if there are any mistakes interfering with your mobile optimization. All you have to do is enter your domain, and the tool will tell you if any of your pages are not up to snuff, pinpointing problem areas so you can correct them if necessary.
I’ve mentioned the importance of sitemaps in multiple areas of this guide so far. Now I’m going to get into the technical details of what sitemaps are, why they’re important for your site, and how to create them.
There are actually two different types of sitemaps you can build and use for your site: HTML and XML. I’ll start with HTML sitemaps, since they’re a little easier to create and understand. As I mentioned before, HTML sitemaps exist as a page on your site, visible to both human visitors and search engine crawlers.
Here, you’ll list a hierarchy of all the pages on your site, starting with the “main” pages, and splitting down into categories and subcategories. Ideally, you’ll include the name of the page along with the accurate link to it, and every page on your site will link to your HTML sitemap in the footer.
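As an illustration (the page names are made up), an HTML sitemap page is essentially a nested list of links mirroring your site hierarchy:

```html
<ul>
  <li><a href="/">Home</a></li>
  <li><a href="/services/">Services</a>
    <ul>
      <li><a href="/services/seo/">SEO</a></li>
      <li><a href="/services/content/">Content Marketing</a></li>
    </ul>
  </li>
  <li><a href="/about/">About</a></li>
</ul>
```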
Search engines won’t be using an HTML sitemap to index your pages, so it’s not strictly necessary to have one. However, it does give search engine crawlers a readily available guidebook of how your pages relate to one another. It can also be useful for your visitors, giving them an overall vision of your site.
XML sitemaps are far more important. Rather than existing as a page on your site, XML sitemaps are code-based files that you can “feed” to Google directly in Google Search Console. They look a little like this:
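A minimal example, using hypothetical URLs and following the sitemaps.org protocol:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/about/</loc>
  </url>
</urlset>
```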
As you can imagine, they’re a nightmare to produce manually, but there are lots of free and paid tools you can use to generate one.
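If you’d rather script one yourself, the structure is simple enough to build with a few lines of code. Here’s a minimal Python sketch (the URLs are hypothetical) that generates a valid sitemap using only the standard library:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        loc = ET.SubElement(url_el, "loc")  # <loc> holds the page URL
        loc.text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/about/",
])
print(sitemap)
```

In practice, a real generator would also crawl your site (or query your CMS) for the URL list and add optional fields like lastmod, which is why automated tools are the usual choice.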
Before I explain XML sitemap generation, you need to know what these sitemaps are used for. Again, they aren’t going to determine whether or not Google indexes your site; Google is going to crawl your site anyway. Instead, uploading your XML sitemap to Google tells Google which pages on your site you find most valuable, and how those pages relate to one another.
For example, you could exclude technical pages of your site that contain fewer than 200 words, so the overall perceived quality of your site isn’t dragged down by your worst content.
Google explains that XML sitemaps are especially useful for the following types of sites:
Note that excluding a page from your XML sitemap doesn’t mean that page won’t be indexed; to keep crawlers away from a page, use your robots.txt file (as I described earlier), and to block indexation altogether, add a noindex meta tag to the page itself.
Does this all sound too complex? Don’t worry; the actual process you use to create a sitemap is fairly simple. Most CMSs have built-in features that allow you to automatically generate both HTML and XML sitemaps; for example, Yoast’s SEO plugin gives you the ability to create dynamic sitemaps, which automatically update as you make changes to your site.
For example, you could exclude pages of your site that fall short of a given word-count threshold, and if you add content, they’ll automatically begin to reappear.
It’s helpful to know how sitemaps work and why they’re important, but for your own sanity, it’s best to leave their generation in the hands of automated apps.
What I’m going to refer to as “meta data” is a blanket category that includes page titles, meta descriptions, and alt text.
These are sections of text that describe your pages (or specific pieces of content within those pages).
They exist in the code of your site, and are visible to Google search crawlers, but aren’t always visible to visitors (at least not in a straightforward way).
Google’s crawlers review this information and use it to categorize certain features of your site, including pages as a whole and individual pieces of content within those pages.
This makes it helpful for optimizing your site for specific and relevant keywords and phrases.
It’s also used to produce the entries in search engine results pages (SERPs) that users will come across.
Accordingly, it’s important to optimize your meta data to ensure that prospective visitors are encouraged to click through to your site.
The title tag of your page will appear first, followed by your page URL in green, followed by your meta description.
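Under the hood, those SERP elements are drawn from tags in your page’s HTML head; a hypothetical example:

```html
<head>
  <title>Blue Widgets for Small Businesses | Example Co.</title>
  <meta name="description" content="Affordable, durable blue widgets with free shipping. See our full catalog and pricing.">
</head>
```

As a rule of thumb, titles around 50–60 characters and descriptions around 150–160 characters are commonly recommended so they aren’t truncated in the SERP.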
Your goals in optimizing your site’s meta information, then, are first to ensure that Google gets an accurate description of your content, and second to entice users to click through to your site.
Thankfully, optimizing your meta data is fairly simple. Most CMS platforms will, for each page of your site, offer blank, clearly labeled boxes that let you edit the corresponding meta data for that page.
Remember, it’s a good idea to include at least one keyword in each of your titles and descriptions, but you’ll want to avoid keyword stuffing, and focus on writing meta data that makes sense to your users.
The last component of technical search engine optimization (SEO) I want to cover is the possibility for technical errors; these are common things that can (and probably will) go wrong with your site, causing a hiccup in your rankings and interfering with your plans.
If you notice your site isn’t ranking the way it should in search results, or if something has dramatically changed without your notice (and no immediately clear underlying cause), your first troubleshooting step should be checking for the following technical errors:
<link rel="canonical" href="https://seo.co/examplepage/">
If you have an SEO plugin, you may be able to enter the canonical link manually, like you did with titles and meta descriptions.
Alternatively, you could use 301 redirects to clarify duplicate content discrepancies, but it’s arguably easier to set up canonical tags.
There are some other technical issues you may encounter, such as images not loading properly, but many of them are preventable if you follow best practices, and are easily resolvable with a quick Google search. Even if you don’t understand exactly what’s happening or why, following step-by-step instructions written by experts is a fast way for even amateurs to solve complex SEO problems.
In this section, I want to cover some of the “non-technical” tactics you’ll need to have a successful search engine optimization (SEO) campaign. None of these strategies requires much technical expertise, but it’s important to understand that the technical ranking factors I listed above aren’t the only thing you’ll need to grow your search results rankings over time.
Keep in mind that each of these categories is rich in depth, and requires months to years to fully master, and these entries are mere introductions to their respective topics.
Without high-quality content writing, your Google SEO campaign will fail. You need at least 300 words of highly descriptive, concisely written content on every page of your site, and you’ll want to update your on-site blog at least two or three times a week with dense, informative, practical content—preferably of 700 words or more. This content will give search engines more pages and more content to crawl and index.
Collectively, they’ll add to the domain authority and individual page authorities of your site pages, and they’ll provide more opportunities for your site visitors to interact with your brand and your site. Here are some resources to help you create and publish high-quality content:
All that on-site content also gives you the opportunity to optimize for specific target keywords. Initially, you’ll select a number of “head” keywords (usually limited in length, and highly competitive) and “long-tail” keywords (longer in length, usually representing a conversational phrase, and less competitive) to optimize for.
When performing your keyword research, you’ll choose terms with high potential organic traffic (typically measured by search volume) and low competition, then you’ll include those terms throughout your site, especially favoring your page titles and descriptions. You’ll want to be careful not to over-optimize here, as including too many keywords on a given page (or your site in general) could trigger a content quality-related penalty from Google.
Keyword research is best performed using your favorite keyword research tool. We recommend Moz, Ahrefs, Cora, etc.
The best keyword research includes a deep analysis of your competitors’ websites, showing you what they are doing and how they are targeting the keywords that currently allow them to rank above you in search results.
Your SEO strategy should involve diving deep into keyword research to find the keywords with the search volume and value that matter to conversions.
Some keywords matter much more than others; even a keyword with low search volume can be valuable if it targets high commercial search intent.
Those are the keywords you should focus on in your extensive keyword research.
While keyword research isn’t technical SEO, it’s certainly one of the best methods for sourcing what terms you should be targeting for improved organic search results.
Authority is partially calculated based on the quality and appearance of your site, but the bigger ranking factor is the quantity and quality of links you have pointing to your site. Link building is an SEO strategy that enables you to create more of these links, and therefore generate more authority for your brand.
Old-school link building tactics are now considered spammy, so modern link builders use a combination of guest posting on external authority publishers and naturally attracting links by writing high-quality content and distributing it to attract shares and inbound links. In any case, you’ll need to invest in your link building tactics if you want your campaign to grow. For help, see our SEO link building guide.
Finally, none of your tactics are going to be worthwhile unless you can measure and interpret the search engine results they’re generating. At least monthly, you’ll want to run an analysis of your work, measuring things like inbound organic traffic, ranking for your target keywords, and of course, checking for any technical errors that have arisen.
By interpreting these results and comparing them to the amount of money you’ve invested in your campaign, you’ll get a clear picture of your return on investment—your ROI—and can then make adjustments to improve your profitability. For help, see The Ultimate Guide to Measuring and Analyzing ROI On Your Content Marketing Campaign, our SEO ROI Guide or our SEO pricing guide.
Hopefully, after reading this guide to SEO basics, all the technical SEO details should seem a lot less technical and you’re ready to start improving your site’s SEO.
If you’ve followed this SEO beginners guide step-by-step, you should have been able to tackle tasks like building robots.txt files and improving your site’s speed even if you don’t have experience in creating or managing websites.
In this guide, we’ve covered some of the most important fundamentals of SEO.
And while these basic SEO 101 tips can help you through the SEO basics of technical SEO, it’s important to realize that search engine optimization is a deep and complex strategy with far more considerations than an SEO guide like this can comprehensively cover.
If you’re interested in further help in your SEO efforts or would like us to help implement a customized SEO strategy, be sure to contact us for more guidance and expertise so we can help you toward your SEO success!