
SEO 101: An SEO Beginner’s Guide to On-Site & Off-Site SEO

    This SEO guide for beginners is meant to provide high-level SEO basics for SEO newbies.

    The goal of SEO is to increase your search visibility, which in turn will increase your site traffic.

    Google (and other search engines) ranks sites based on a combination of two broad categories: relevance and authority.

    Relevance is how closely the content of a page is going to meet a user’s needs and expectations (known as matching search intent).

Authority is a measure of how trustworthy or authoritative the source of the content is.

Google’s search quality guidelines formalize this notion of authority as the E-E-A-T framework:

    • Experience
    • Expertise
    • Authoritativeness
    • Trustworthiness

    Your search engine optimization (SEO) tactics will usually involve building your authority (aka E-E-A-T), increasing your relevance for targeted queries, or both, across three main areas of optimization:

    • On-site optimization. On-site optimization is the process of making your site more visible, more authoritative, and easier for Google’s web crawlers to parse and understand. Many of these tweaks and strategies involve technical changes to your site, including adjustments to your backend code and other structural site changes.
    • Ongoing content marketing. Content and search engine marketing is the best way to build your authority and relevance on-site over time; you’ll have the chance to choose topics and optimize for keyword phrases your target audience will use, and simultaneously create content that proves your authoritativeness on the subject.
• Off-site optimization (link building). Off-page optimization is a collection of tactics designed to promote your on-site content and improve your authority by building links yourself or hiring a link building service to build them for you. The quantity and quality of the links pointing to your site have a direct influence on how much authority your site is perceived to have and, eventually, how you rank in search results.

    Technical SEO for Beginners

    Don’t worry. I’m going to make this as painless as possible.

    In this section, I’m going to cover most of the technical SEO elements that you’ll need to consider for your campaign.

These SEO resources are presented not only to increase your SEO knowledge, but also to give you detailed, actionable steps so you can make appropriate changes to your site, understand the search engine ranking factors you’ll need to consider or monitor, and potentially shore up any technical issues that could come up during your SEO efforts.

    I’m going to cover these SEO basics as simply and as thoroughly as possible—so you can understand them and use them, no matter how much technical experience you have.

    Search Engine Indexing

When you go to a library for information, a librarian can probably help you find a book.

    But no matter how relevant a book may be to your interests, it won’t matter if the book isn’t currently on the shelves.

    Libraries must acquire books as they’re released, updating old copies and adding new copies, to keep the most recent information on the shelves.

    Search indexing works similarly.

To provide quality organic search results, Google (and other search engines) works to maintain shelves of “books”: in this case, a running archive of the websites and web pages available on the web.

Google uses automated search engine bots, sometimes known as “crawlers” or “spiders,” to continually search the web for new web pages, which it then logs in its central index.

    Why is Indexing Relevant for Ranking in the Search Engines? 

    If you want to be listed in search engines, and be listed accurately, you need to make sure your site is indexed correctly.

    There are three main approaches you can take for search indexing:

    Passive SEO & Website Management 

    The first approach is the easiest, and probably the best for SEO beginners.

Search engines want to keep the books on their shelves updated, so they make an effort to crawl sites completely on their own.

In the passive approach, you’ll simply wait for search engines to index your site, and trust their best judgment when it comes to canonicalizing and optimizing your URL structures.

    For this method, you don’t have to do anything; you simply pass the reins to Google and let it take care of the indexing work.

The only potential disadvantage here, other than forfeiting some degree of control, is that search engines sometimes take longer to update their indexes this way: up to a few weeks for new sites and new material.

    Active SEO & Website Management 

The active approach lets you communicate the URL structure and hierarchy of your site using an on-site sitemap.

    Known as an HTML sitemap, this is easy to create (so long as you’re familiar with the process of creating new pages on your site).

    Create a web page called “Sitemap” and list all the pages on your site you want search engines to index, separated into categories and subcategories as appropriate, to provide hints to bots as to how your links interact with one another.

You should also include brief descriptions identifying what each link is for.

    This doesn’t guarantee SEO rankings or indexation, but can help clarify confusion and speed up the indexing process.

    The major disadvantage here is that you’ll need to adjust it every time you make changes to your site, unless you use an automatic sitemap solution which updates itself any time you publish new pages.

    There are WordPress plugins which offer this functionality.
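
To make this concrete, an HTML sitemap is just an ordinary page of links. Here’s a minimal sketch, with hypothetical section and page names:

    <h1>Sitemap</h1>
    <ul>
      <li><a href="/services/">Services</a> - what we offer
        <ul>
          <li><a href="/services/seo-audits/">SEO Audits</a> - site health reviews</li>
        </ul>
      </li>
      <li><a href="/blog/">Blog</a> - articles and guides</li>
    </ul>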


A clean, effective sitemap is critical to optimizing on-page and on-site SEO.

    Direct Website Management 

    In the direct route, you’ll create an XML sitemap—which is different from an HTML sitemap.

It’s an XML-formatted text file that contains a list of your site’s URLs, with tags that tell search engines how to consider and index your links in relation to one another.

Once it’s done, you’ll submit it directly to Google through Google Search Console.

    This is a fair bit more complex than an HTML sitemap, but is manageable if you take the time to read Google’s instructions properly.

This isn’t necessary, but could be useful in speeding up the initial indexing process and clarifying canonical confusion (which I’ll talk about more later in this guide).

    You’ll also need to consider creating a robots.txt file for your site, which is essentially an instruction manual that tells Google’s web crawlers what to look at on your site.

    You can create this file using Notepad, or any program on your computer that allows you to create txt files—even if you have no coding experience.

On the first line, you’ll specify an agent by typing “User-Agent: ____”, filling in the blank with a bot name (like “Googlebot”) or with an asterisk (*) to address all bots.

    Then, on each successive line, you can type “Allow:” or “Disallow:” followed by specific URLs to instruct bots which pages should or should not be indexed.

    There are various reasons why you wouldn’t want a bot to index a page on your site, which I’ll get into later.

    However, you may want bots to index all pages of your site by default. If this is the case, you do not need a robots.txt file.

For example, here’s a minimal robots.txt that keeps all bots out of a hypothetical /private/ directory while leaving the rest of the site crawlable:

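    # Apply these rules to every crawler
    User-Agent: *
    # Keep bots out of this (illustrative) directory
    Disallow: /private/
    # Everything else may be crawled
    Allow: /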

    If and when your robots.txt file is ready, you can upload it to your site’s root directory like any other file.

    Site Speed and Core Web Vitals 

Improving website speed for SEO is an oft-repeated topic in digital marketing, though its importance has been somewhat overblown.

While it’s part of SEO basics (and page speed is now a confirmed ranking factor), the loading time of your web pages won’t make or break your rankings in search engine results pages (SERPs); reducing your load time by a second won’t magically boost a low-authority site to the top of the search results.

    However, site speed is still an important consideration—both for your domain authority and for the user experience of your site.

    Google (and other search engine algorithms) rewards sites that provide content faster, as it is conducive to a better overall user experience, but it only penalizes about one percent of sites for having insufficient speed.

When it comes to user experience, each one-second improvement in site speed has been correlated with a two percent increase in conversions.

In short, whether you’re after higher rankings in search results or more conversions, it’s a good idea to improve your site speed. Search engines do punish slow page speeds, but the punishment really only comes in the most egregious cases.


You can check your speed using Google’s own PageSpeed Insights, and start improving your site with the following strategies:

    • Use a good caching plugin. Your first job is to make sure there’s a good caching plugin on your site. You only need one, and unless you have technical experience and unique needs, it’s best to leave your plugin unaltered (i.e., leave the default settings as they are). The caching plugin allows users to store certain pieces of information about your site on their respective browsers. This won’t do much for first-time visitors, but repeat visitors will be able to load your site much more quickly.
    • Limit the number of plugins you use. Your caching plugin is a must, and you’ll need a handful of other plugins (including a search engine optimization plugin), but try to limit the number of plugins you have on your site. Every additional plugin will represent an increase in the amount of time it takes users to load your site.
• Compress what you can. You can use an automated compression program like GZip to reduce the size of the files on your site, so they load faster (see the sketch just after this list). It’s not an intensive process, but it can shave a few milliseconds off your page loading times.
    • Limit your redirects. Redirects are sometimes essential for correcting site indexing errors and other issues—and I’ll talk about redirects in more detail later on—but only use them if you know what you’re doing. Every new redirect you create is another piece of information that can bog down the speed of your site.
• Consider your server choice. Your choice of server can also have a bearing on your website loading speed. Most modern hosting is adequate, especially from big-name providers like WordPress.com or GoDaddy. However, choosing an inferior, low-cost server could have a negative impact on your average loading speed. A dedicated server may be worth the investment if site speed is a big priority for you. In any case, once you’ve made a decision, your server won’t need much ongoing technical maintenance (unless you’re hosting one in-house).
    • Optimize your images. Images are some of the biggest content files you’ll have on your site, so you’ll need to make sure they’re optimized to provide the fastest possible site speed. You can optimize images by making sure they’re the proper format (JPG, GIF, PNG, etc.), and by reducing their size as much as possible before uploading them. This isn’t a technically demanding process; in fact, there are many free image resizing tools available online, including Pic Resize.
    • Clear unnecessary site data. Do you have a bunch of old content drafts for your blog that haven’t been published? Get rid of them. Every piece of information on your site that doesn’t have a relevant purpose should be cleared.
    • Consider a content delivery network (CDN). A content delivery network is an automatic service you can sign up for that allows you to serve, or distribute, your site content from multiple different locations simultaneously, rather than from one central server. It’s an additional investment, but doesn’t require any technical knowledge, and could help you achieve a faster loading time if you’re struggling to hit your goals with other tactics.
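
To make the compression tip concrete: if your site runs on an Apache server (an assumption; Nginx and other servers use different directives), enabling GZip can be as simple as adding a few lines to your .htaccess file:

    # Compress text-based assets before sending them (requires mod_deflate)
    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>

Most caching plugins and managed hosts can enable this for you, so treat the snippet as a sketch rather than a required manual step.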

    Mobile Site Optimization

    Mobile optimization is a broad category that includes both technical and non-technical elements. Mobile searches now outnumber desktop searches, so search engines have taken extensive efforts in recent years to reward sites that optimize for mobile devices and penalize sites that don’t.

    Put simply, if your site is “friendly” to mobile devices, capable of loading and presenting content in a way that works well for mobile users, you’re going to see an increase in authority and rankings in search results. Incidentally, you’ll also become more appealing to your target demographics, possibly increasing customer loyalty and/or conversions.

    So what is it that makes sites “optimized” for mobile devices? There are a few main criteria:

    • Content visibility. First, you’ll need to make sure that all your site’s content is visible to a user—without the need to scroll or zoom. On a non-optimized website, written text will often bleed to the right, forcing users to scroll to read the rest of it. On a mobile optimized site, that text would be constrained by the edges of the screen.
• Content readability. Your content should also be readable. Oftentimes, that means choosing a bigger, cleaner font. Mobile devices have smaller screens, and you don’t want visitors to have to squint or zoom to read your text.
    • Finger-friendly interactions. Instead of using a mouse with a fairly precise pointer to engage with your site, users are going to be using their fingers to tap buttons and fill out forms. Accordingly, your buttons, tabs, and menus should grow to be more prominent and “tappable.”
    • Image and video visibility. There are some types of content that simply don’t load on mobile devices (such as Flash). Obviously, you’ll want your visitors to see all your cool images and videos, so mobile optimization demands that those features are visible on mobile devices.
    • Loading speed. Remember what I talked about in the site speed section? It matters even more for mobile devices. Generally, mobile devices load sites much slower than desktop devices, so a fraction of a second delay on a desktop device could cost you multiple seconds on a mobile device. Fortunately, mobile speed improvements are mostly the same as desktop speed improvements.

    If all this sounds complex to you, don’t worry. There are some simple ways to test your site to see if it’s counted as “mobile friendly,” and simple fixes you can make if it’s not. The easiest way to make your site mobile friendly is to make your site responsive; this means that your site will detect what device is attempting to view it, and automatically adjust based on those parameters.

This way, you can keep managing only one site, and have it work for both mobile and desktop devices simultaneously. You can also create a separate mobile version of your site, but this isn’t recommended, especially now that Google has moved to mobile-first indexing.

    How can you make your site responsive? The easiest way is to use a website builder and choose a responsive template. Most mainstream website builders these days have responsive templates by default, so you’ll be hard-pressed to find one that doesn’t offer what you need.

    If you’re building a site from scratch, you’ll need to work with your designers and developers to ensure they’re using responsive criteria.
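
Under the hood, responsiveness usually comes down to a viewport meta tag plus CSS media queries. Here’s a minimal sketch (the 600px breakpoint and .site-nav class are illustrative):

    <!-- Tell mobile browsers to render at device width instead of a zoomed-out desktop width -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <style>
      /* On screens 600px wide or narrower, stack the navigation vertically */
      @media (max-width: 600px) {
        .site-nav { flex-direction: column; }
      }
    </style>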

As long as your site is responsive, you should be in good shape. If you’re in doubt, you can use Google’s mobile-friendly tool to evaluate your domain and see if there are any mistakes interfering with your mobile optimization. All you have to do is enter your domain, and the tool will tell you if any of your pages are not up to snuff, pinpointing problem areas so you can correct them if necessary.


    XML Sitemaps

    I’ve mentioned the importance of sitemaps in multiple areas of this guide so far.

    Now I’m going to get into the technical details of what sitemaps are, why they’re important for your site, and how to create them.

    There are actually two different types of sitemaps you can build and use for your site: HTML and XML. I’ll start with HTML sitemaps, since they’re a little easier to create and understand. As I mentioned before, HTML sitemaps exist as a page on your site, visible to both human visitors and search engine crawlers.

    Here, you’ll list a hierarchy of all the pages on your site, starting with the “main” pages, and splitting down into categories and subcategories. Ideally, you’ll include the name of the page along with the accurate link to it, and every page on your site will link to your HTML sitemap in the footer.

    Search engines won’t be using an HTML sitemap to index your pages, so it’s not explicitly necessary to have one. However, it does give Google search engine crawlers a readily available guidebook of how your pages relate to one another. It can also be useful for your visitors, giving them an overall vision of your site.

    XML sitemaps are far more important. Rather than existing as a page on your site, XML sitemaps are code-based files that you can “feed” to Google directly in Google Search Console. They look a little like this:

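A hand-written, minimal version (the URL and date below are illustrative) looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- One <url> entry per page you want search engines to prioritize -->
      <url>
        <loc>https://seo.co/examplepage/</loc>
        <lastmod>2024-01-15</lastmod>
      </url>
    </urlset>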

    As you can imagine, they’re a nightmare to produce manually, but there are lots of free and paid tools you can use to generate one.

Before I explain XML sitemap generation, you need to know what these files are used for. Again, they aren’t going to determine whether or not Google indexes your site; Google is going to crawl your site anyway. Instead, uploading your XML sitemap tells Google which pages on your site you find most valuable, and how those pages relate to one another.

    For example, you could exclude technical pages of your site that contain fewer than 200 words, so the overall perceived quality of your site isn’t dragged down by your worst content.

    Google explains that XML sitemaps are especially useful for the following types of sites:

    • Large sites, with thousands of pages.
    • Sites with archived, poorly linked content, which makes it difficult for Google to understand how all your pages link to one another.
    • New sites, which have few external links pointing to them.
    • Sites using specific types of rich media, such as special annotations or visual media.

Note that excluding a page from your XML sitemap doesn’t mean that page won’t be indexed; your robots.txt file (as described earlier) only blocks crawling, while a noindex meta tag on the page itself is what blocks indexing outright.
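
That tag is a single line in the <head> of any page you want kept out of the index:

    <meta name="robots" content="noindex">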

    Does this all sound too complex? Don’t worry; the actual process you use to create a sitemap is fairly simple. Most CMSs have built-in features that allow you to automatically generate both HTML and XML sitemaps; for example, Yoast’s SEO plugin gives you the ability to create dynamic sitemaps, which automatically update as you make changes to your site.

    For example, you could exclude pages of your site that fall short of a given word-count threshold, and if you add content, they’ll automatically begin to reappear.

    It’s helpful to know how sitemaps work and why they’re important, but for your own sanity, it’s best to leave their generation in the hands of automated apps.

    Meta Data and Alt Text

    What I’m going to refer to as “meta data” is a blanket category that includes page titles, meta descriptions, and alt text.

    These are sections of text that describe your pages (or specific pieces of content within those pages).

    They exist in the code of your site, and are visible to Google search crawlers, but aren’t always visible to visitors (at least not in a straightforward way).

Google’s crawlers review this information and use it to categorize certain features of your site, including pages as a whole and individual pieces of content within each page.

    This makes it helpful for optimizing your site for specific and relevant keywords and phrases.

    It’s also used to produce the entries in search engine results pages (SERPs) that users will come across.

    Accordingly, it’s important to optimize your meta data to ensure that prospective visitors are encouraged to click through to your site.

In a SERP entry, the title tag of your page appears first, followed by your page URL in green, followed by your meta description.


Your goal in optimizing your site’s meta information, then, is twofold: first, ensure that Google gets an accurate description of your content; second, entice users to click through to your site.

    Titles

    Title tags are the first and most important description of your site’s individual pages.

They should include at least one target keyword relevant to that page (and your site), plus your brand name at the end, and should make logical sense to your visitors. They should also run fewer than 60 characters, roughly the maximum that SERPs display. For blog articles, title tags usually correspond with the title of the blog post.

    Descriptions

Descriptions are secondary ways of describing your pages, and generally give you more wiggle room to include secondary keywords, long-tail phrases, and more conversational phrasing. The limit here is about 160 characters.
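
In a page’s HTML, both of these live in the <head>. Here’s a hypothetical example (the wording is illustrative, and most CMS platforms write these tags for you):

    <head>
      <!-- Shown as the clickable headline in SERPs; keep it under ~60 characters -->
      <title>SEO 101: A Beginner’s Guide | SEO.co</title>
      <!-- Shown as the snippet below the title; keep it under ~160 characters -->
      <meta name="description" content="Learn on-site and off-site SEO basics, from indexing and sitemaps to meta data and link building.">
    </head>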

    Alt text

    Alt text is specific to images, and is important for both search engines and visitors.

    When uploading an image, you’ll need to make sure your image file name reflects what the image actually contains—this will function as the image’s title in search engines.

    You may also include a caption to correspond with that image.

    Beyond that, you’ll need some descriptive text, which helps Google “understand” what’s happening in your image; this is the alt text, and it’s usually editable directly within your CMS.

The alt text will also appear in place of the image any time a user’s browser is unable to load it.
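
In raw HTML (your CMS normally writes this for you), alt text is just an attribute on the image tag; the file name and description here are illustrative:

    <img src="seo-sitemap-example.png" alt="HTML sitemap page listing site sections by category">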

    Thankfully, optimizing your meta data is fairly simple. Most CMS platforms will, for each page of your site, offer blank, clearly labeled boxes that let you edit the corresponding meta data for that page.

    Structured Data Markup

    Structured data markup is a way of providing explicit information about the content on a webpage to search engines.

    It uses a standardized format to organize and classify data, making it easier for search engines to understand the context and meaning of the content.

    This structured data is typically added to the HTML code of a webpage.

The most common structured data vocabulary is Schema.org, a collaborative effort by major search engines including Google, Bing, and Yahoo.

    Schema.org provides a vocabulary of tags that can be added to HTML to mark up different types of content, such as articles, reviews, events, products, and more.
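
As a sketch, here’s what Schema.org markup for an article might look like in JSON-LD, the format Google recommends (all values are illustrative):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "SEO 101: A Beginner’s Guide to On-Site & Off-Site SEO",
      "author": {
        "@type": "Person",
        "name": "Samuel Edwards"
      },
      "datePublished": "2024-01-15"
    }
    </script>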

Remember, it’s a good idea to include critical keywords (including LSI and entity keywords) in your titles, descriptions, and structured data markup.

    But you’ll want to avoid keyword stuffing, and focus on writing meta data that makes sense to your users.

    Technical SEO Errors

    The last component of technical search engine optimization (SEO) I want to cover is the possibility for technical errors; these are common things that can (and probably will) go wrong with your site, causing a hiccup in your rankings and interfering with your plans.

    If you notice your site isn’t ranking the way it should in search results, or if something has dramatically changed without your notice (and no immediately clear underlying cause), your first troubleshooting step should be checking for the following technical errors:

    Crawl errors

    Crawl errors happen when Google attempts to crawl your site but is somehow unsuccessful.

    There are a variety of potential culprits here, but thankfully, Google makes it easy to figure out what’s happening.

    Within Google Search Console, you can run a “crawl error” report that plainly states what’s going on with your site and why.

    There are a handful of potential crawl errors that could happen here; for example, there could be a DNS error that doesn’t prevent bots from accessing your site, but could cause latency problems.

    In this case, you’ll need to repair any problems with your DNS server and make sure Google can access your site as intended.

    You may also have a server problem, which is probably the most complex problem you’ll face in technical search engine optimization, since there could be so many root causes (and so many potential fixes).

    The potential solutions here extend beyond the scope of this guide, but usually involve diagnosing issues with your hosting provider.

    Fortunately, they should be few and far between.

    Robots.txt errors will also appear in this report.

    404 errors

In Google Search Console, you’ll also be able to scan for 404 errors. These won’t seriously harm your search rankings, but they may indicate a bigger problem, and they can irritate your visitors.

The biggest root cause of 404 errors is page deletion, but they may also be a symptom of a hosting problem.

    You can correct 404 errors easily by restoring a page that was deleted, diagnosing any problems with your hosting provider, or creating a 301 redirect. 301 redirects take incoming referral and organic traffic to a page and forward it to a different, more relevant page.

    Even if you aren’t an experienced programmer, you should be able to follow the basic step-by-step instructions necessary to set a redirect up.
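
As a sketch, on an Apache server (an assumption; other servers and most SEO plugins offer their own redirect mechanisms), a 301 redirect can be a single line in your .htaccess file, with both paths below being illustrative:

    # Permanently forward the deleted page to its closest replacement
    Redirect 301 /old-page/ https://seo.co/new-page/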

    Broken links

    Broken or “dead” links come in several varieties.

    They might be internal or external, and they might be due to a typographical error in the site link, or due to a 404 error for the intended page.

    In any case, they no longer take users to a functional page.

    If these links exist on your own site, you can remove them or fix them by replacing them with a new destination URL.

If they exist on an external site (i.e., an external site links to a page on your site that returns a 404 error), you can set up a 301 redirect to a better, functional page, or reach out to the webmaster to ask that the link be updated.

You can use Google’s internal links report to check for broken links on your own site, or a backlink tool like Moz’s Link Explorer (formerly Open Site Explorer) to check for broken links on external sites.

    Duplicate content

Duplicate content is an often misunderstood technical error. It isn’t necessarily an instance of intentionally duplicated or plagiarized content; instead, it’s usually due to a single page of content being indexed under multiple URLs, such as both the https:// and https://www versions of the same page. Google Search Console can help you track down these instances.

    They won’t necessarily hurt your search rankings, but it’s better to clean these up to avoid misunderstandings by users or search bots.

The way to fix this is with a canonical tag, which is simple to implement. All you have to do is choose a primary or “canonical” version (flip a coin if you can’t decide), and add a canonical tag on the non-canonical version pointing to the canonical one.

    A canonical tag looks like this:

<link rel="canonical" href="https://seo.co/examplepage/">

    If you have an SEO plugin, you may be able to enter the canonical link manually, like you did with titles and meta descriptions.

    Alternatively, you could use 301 redirects to clarify duplicate content discrepancies, but it’s arguably easier to set up canonical tags.

    There are some other technical issues you may encounter, such as images not loading properly, but many of them are preventable if you follow best practices, and are easily resolvable with a quick Google search. Even if you don’t understand exactly what’s happening or why, following step-by-step instructions written by experts is a fast way for even amateurs to solve complex SEO problems.

    Non-Technical, Basic SEO for Beginners

    In this section, I want to cover some of the “non-technical” tactics you’ll need to have a successful search engine optimization (SEO) campaign. None of these strategies requires much technical expertise, but it’s important to understand that the technical ranking factors I listed above aren’t the only thing you’ll need to grow your search results rankings over time.

    Keep in mind that each of these categories is rich in depth, and requires months to years to fully master, and these entries are mere introductions to their respective topics.

    High-Quality Content

    Without high-quality content writing, your Google SEO campaign will fail. You need at least 300 words of highly descriptive, concisely written content on every page of your site, and you’ll want to update your on-site blog at least two or three times a week with dense, informative, practical content—preferably of 700 words or more. This content will give search engines more pages and more content to crawl and index.

Collectively, these pages will add to the domain authority and individual page authorities of your site, and they’ll provide more opportunities for your site visitors to interact with your brand and your site.

    Keyword Optimization

    All that on-site content also gives you the opportunity to optimize for specific target keywords. Initially, you’ll select a number of “head” keywords (usually limited in length, and highly competitive) and “long-tail” keywords (longer in length, usually representing a conversational phrase, and less competitive) to optimize for.

    When performing your keyword research, you’ll choose terms with high potential organic traffic (typically measured by search volume) and low competition, then you’ll include those terms throughout your site, especially favoring your page titles and descriptions. You’ll want to be careful not to over-optimize here, as including too many keywords on a given page (or your site in general) could trigger a content quality-related penalty from Google.

    Keyword research is best performed using your favorite keyword research tool. We recommend Moz, Ahrefs, Cora, etc.

The best keyword research includes a deep analysis of your competitors’ websites, giving you data on what they’re doing and how they’re targeting the keywords that let them rank above you in search results.

    Your SEO strategy should involve diving deep into keyword research to find the keywords with the search volume and value that matter to conversions.

Some keywords matter much more than others; even terms with low search volume can be highly valuable if they target high commercial search intent.

These are the keywords you should focus on in your extensive keyword research.

    While keyword research isn’t technical SEO, it’s certainly one of the best methods for sourcing what terms you should be targeting for improved organic search results.

    Link Building Services 

    Authority is partially calculated based on the quality and appearance of your site, but the bigger ranking factor is the quantity and quality of links you have pointing to your site. Link building is an SEO strategy that enables you to create more of these links, and therefore generate more authority for your brand.

    Old-school link building tactics are now considered spammy, so modern link builders use a combination of guest posting on external authority publishers and naturally attracting links by writing high-quality content and distributing it to attract shares and inbound links. In any case, you’ll need to invest in your link building tactics if you want your campaign to grow. For help, see our SEO link building guide.

    Search Engine Ranking Analysis and Reporting

    Finally, none of your tactics are going to be worthwhile unless you can measure and interpret the search engine results they’re generating. At least monthly, you’ll want to run an analysis of your work, measuring things like inbound organic traffic, ranking for your target keywords, and of course, checking for any technical errors that have arisen.

    By interpreting these results and comparing them to the amount of money you’ve invested in your campaign, you’ll get a clear picture of your return on investment—your ROI—and can then make adjustments to improve your profitability. For help, see The Ultimate Guide to Measuring and Analyzing ROI On Your Content Marketing Campaign, our SEO ROI Guide or our SEO pricing guide.

    Conclusion

Hopefully, after reading this guide to SEO basics, the technical details of SEO seem a lot less intimidating, and you’re ready to start improving your site’s SEO.

    If you’ve followed this SEO beginners guide step-by-step, you should have been able to tackle tasks like building robots.txt files and improving your site’s speed even if you don’t have experience in creating or managing websites.

Here we’ve covered some of the most important fundamentals of SEO.

    And while these basic SEO 101 tips can help you through the SEO basics of technical SEO, it’s important to realize that search engine optimization is a deep and complex strategy with far more considerations than an SEO guide like this can comprehensively cover.

    If you’re interested in further help in your SEO efforts or would like us to help implement a customized SEO strategy, be sure to contact us for more guidance and expertise so we can help you toward your SEO success!

Samuel Edwards
Chief Marketing Officer at SEO Company
In his 9+ years as a digital marketer, Sam has worked with countless small businesses and enterprise Fortune 500 companies and organizations, including NASDAQ OMX, eBay, Duncan Hines, Drew Barrymore, Washington, DC-based law firm Price Benowitz LLP, and human rights organization Amnesty International. As a technical SEO strategist, Sam leads all paid and organic operations teams for client SEO services, link building services, and white label SEO partnerships. He is a recurring speaker at the Search Marketing Expo conference series and a TEDx speaker. Today he works directly with high-end clients across all verticals to maximize on- and off-site SEO ROI through content marketing and link building. Connect with Sam on LinkedIn.