This is an ambiguous statement, and it might not make sense to someone who isn’t intimately familiar with web development. The basic idea is this: just as there are an infinite number of paths from point A to point B but only one “optimal” path, there are an infinite number of ways to code any function, but some are more efficient than others. Unnecessarily complicated code has a number of disadvantages, including slower site loading times and more legwork for search engine crawlers, so take the time to “clean up” your code.
This step may or may not apply to you, depending on your intentions and goals for the indexation of your site. Ordinarily, search crawlers will track down and index every page of your website, but you can change this based on instructions you give those crawlers in what’s known as the robots.txt file of your site. Here, you can block crawlers from indexing certain pages—which is ideal if you have intentionally duplicate pages or other content you don’t want search engines to see. Just don’t use this to try to cover up black hat tactics—Google will find out.
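As a simple sketch, a robots.txt file might block a duplicate or private directory while leaving the rest of the site crawlable; the paths below are hypothetical illustrations, not a recommendation for any specific site:

```
# robots.txt, served from the root of your domain (example.com/robots.txt)
User-agent: *            # these rules apply to all crawlers
Disallow: /print/        # hypothetical directory of printer-friendly duplicate pages
Disallow: /admin/        # hypothetical back-end area with no search value

# Optional: point crawlers to your XML sitemap
Sitemap: https://example.com/sitemap.xml
```

Keep in mind that Disallow is a crawl directive, not a guarantee of secrecy; truly sensitive pages need real access controls.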
(Image Source: Robotstxt.org)
Your site isn’t going to be up 100 percent of the time. You’re going to have server crashes, and your pages will occasionally be prone to individual errors. This is a reality of modern web development. All you can do is keep a close eye on the status of your servers, and respond to errors as quickly as possible to keep your domain up and running.
If you’re not familiar with dynamic versus static URLs, this terminology may seem strange to you. It’s easier to describe dynamic URLs first; these are URLs that provide different content depending on the nature of the query to the site’s database. Static URLs, by contrast, only change if someone manually makes a change to the site’s backend code. With very few exceptions, your site’s URLs should all be static, only changing when you push manual changes to them. This is generally a more trustworthy practice, and will help keep the authority of your domain and individual pages high.
You should also keep your URLs logically organized by using a breadcrumb trail. In the realm of website development, breadcrumb trails are strings of sectioned-off extensions at the end of your URL that list the categories and subcategories where a page is located. For example, you might have example.com/maincategory/subcategory/page instead of just example.com/page. This gives you the opportunity to optimize for more keywords, provide a more convenient user experience for your customers, and give Google more information about how your site is organized. There’s no reason not to do this (and it happens automatically in most template-based CMSs like WordPress).
For the same reasons that you shortened your domain name, you should shorten your URLs. This is as much for your own benefit as it is your users’, as it’s going to make organizing your site much easier. For example, if you have a “products and services” subcategory page, consider shortening it to just “products” or “services.” If you have a long blog title like “how to recover from an embarrassing situation at work,” consider shortening it to “embarrassing-work-situation” as an extension of your URL. Remove any unnecessary additions or extensions whenever possible and focus on what really matters. I realize this seems to counteract my advice from #12 (adding a breadcrumb trail increases the length of the URL), so to be clear: use breadcrumb trails, keep them short and concise, and make an effort to keep each URL short after the breadcrumbs are included.
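As a rough sketch of this shortening idea (the stopword list and function name are my own invention, not any particular CMS’s implementation), a few lines of Python can trim a long title down to a concise, hyphenated slug:

```python
import re

# An illustrative (not exhaustive) list of filler words to drop from slugs
STOP_WORDS = {"a", "an", "and", "at", "from", "how", "in", "of", "on", "the", "to"}

def shorten_slug(title: str, max_words: int = 4) -> str:
    """Lowercase the title, strip punctuation, drop filler words,
    and join the first few remaining keywords with hyphens."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    keywords = [w for w in words if w not in STOP_WORDS]
    return "-".join(keywords[:max_words])

print(shorten_slug("How to Recover from an Embarrassing Situation at Work"))
# → recover-embarrassing-situation-work
```

In practice you would still review each slug by hand, since the best short URL (like “embarrassing-work-situation” above) often reorders or drops keywords editorially.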
An HTML sitemap is a way to organize your site easily for users—not to be confused with an XML sitemap for SEO, which I’ll cover in the next bulleted tactic. Here, your goal is to make a comprehensive list of all the pages of your site, organized logically so users can follow it—and follow its links to those specific named pages. Generally, webmasters include a link to the HTML sitemap in the footer, where users intuitively seek to access it.
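As a sketch, an HTML sitemap can be as simple as a nested list of links; the page names below are hypothetical:

```html
<!-- Hypothetical HTML sitemap page, typically linked from the site footer -->
<h1>Sitemap</h1>
<ul>
  <li><a href="/products/">Products</a>
    <ul>
      <li><a href="/products/widgets/">Widgets</a></li>
      <li><a href="/products/gadgets/">Gadgets</a></li>
    </ul>
  </li>
  <li><a href="/blog/">Blog</a></li>
  <li><a href="/contact/">Contact</a></li>
</ul>
```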
An XML sitemap is a more technical version of the HTML sitemap, marked up with code so that search crawlers can make sense of your data. Creating one is easier than it seems, and some WordPress plugins do it automatically for you. When you have your XML sitemap complete, you can upload it to Google Search Console to instruct Google about the exact layout and structure of your website. Note that Google will crawl and interpret your website without this sitemap, but this can accelerate and increase the accuracy of the process.
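For illustration, a minimal XML sitemap following the sitemaps.org protocol looks like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/products/widgets/</loc>
    <lastmod>2020-01-10</lastmod>
  </url>
</urlset>
```

The protocol also defines optional changefreq and priority fields per URL, though search engines treat them mostly as hints.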
(Image Source: Sitemaps.org)
Your site is going to go through changes, whether you currently know what those changes are or not. You’re going to add pages, remove pages, and possibly restructure entire swaths of your site. When this happens, it’s easy to forget about updating your sitemaps—so establish a reminder to keep your sitemaps up-to-date. Forgetting this won’t crush your rankings—Google will eventually catch up with what you’ve done—but it’s a way to help your web strategy run smoother.
This is a major step of the process: make sure all of your content loads correctly and fully on every possible device and browser. Most web developers go through a testing process to see how a site looks, but are they using older versions of their browsers? Different browsers? Different devices? An image that doesn’t load on Internet Explorer could make your page less authoritative due to “broken content.” You can use a service like BrowserStack to help you out here.
You also need to optimize for mobile devices. The majority of web traffic now happens on mobile devices, so mobile optimization makes sense from a pure user experience perspective, but it’s also important for Google’s consideration of your site (thanks to the Mobilegeddon update and several algorithm changes before it). Thankfully, Google offers a free test that will tell you not only whether your site is mobile-friendly, but also what’s wrong with it if it isn’t. Just keep in mind that mobile optimization is about more than meeting Google’s minimum requirements; it’s about giving the best possible experience to your mobile users. For more information, please check our comprehensive guide on Mobile SEO.
You can also improve your navigation bar to improve your search rankings. Google takes user experience seriously; the search giant rose to dominance because of its commitment to connecting users to the best possible content for their queries. Google wants users to have a convenient, straightforward, interpretable experience, and part of that includes being able to navigate the site easily. Organize your site into categories and subcategories, and make your menus accessible and easy to click. This may seem like a simple feature, but it’s one that’s commonly neglected and much more important than most people realize, because of the way PageRank ‘flows’ throughout a site. Try to put only your most important pages in your navigation; they’ll be the ones that get a significant ranking boost.
Google’s Knowledge Graph is continuously growing in size, able to answer more user queries with short, concise answers pulled from sites across the web. How can you get your information featured in these boxes, which automatically take visibility priority over organic search results? The key is to use structured markup, organizing your site’s content in a way that makes sense to search engines. Schema.org has plentiful tutorials to help you figure out exactly what to implement and how to implement it; you just have to take the step of committing it to the back end of your site.
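As one example of what that markup can look like, here is a hypothetical FAQ snippet in Schema.org’s JSON-LD format (the question and answer text are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is a robots.txt file?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "A plain-text file that tells search engine crawlers which parts of a site they may crawl."
    }
  }]
}
</script>
```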
The navigation of your site is partially dependent on how your internal pages link to one another. For example, it might be easy for users on your homepage to jump to whatever page is most relevant to them, but can they quickly and easily jump between pages to explore your site further? Try to include at least one link to another page on your site within every page you develop; some of your blog posts might have several or even many links to other pages on your site. Internal linking won’t just increase your search rankings; it will keep your users engaged on your site for longer, which increases the likelihood of a conversion.
Internal links are just the beginning—it’s also a good idea to link out to other external sources to back up the information you present. For example, if you’re referencing a statistic, fact, or other piece of specific data, it’s important to cite the source you got it from. Doing this also adds to the trustworthiness of your site; it shows that you’re not just making information up, and that you have verifiable primary and secondary sources to vouch for you. Just make sure you’re choosing high-authority sites, as linking out to low-authority sites could have the opposite effect.
It’s good to have images throughout your site, whether they’re entries in a photo gallery or supplementary material for one of your blog posts. However, not just any images will do; some images are better than others when it comes to suitability for the web. For example, some formats may not load properly on some devices, and others may drag down your loading speed. As a general rule, formats like JPG, PNG, and GIF are reliable choices. Beyond that, you’ll want to make sure your images are reduced to a smaller size to keep your site speed as fast as possible. For more information, please visit our comprehensive guide on images & SEO.
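To make the resizing step concrete, here is a small sketch (the 1200-pixel budget is an arbitrary assumption, and the function name is my own) that computes web-friendly dimensions while preserving aspect ratio; the actual scaling would then be done with your image editor or library of choice:

```python
def fit_within(width: int, height: int, max_side: int = 1200) -> tuple:
    """Scale (width, height) down so the longest side is at most max_side,
    preserving aspect ratio; images already small enough are left unchanged."""
    longest = max(width, height)
    if longest <= max_side:
        return (width, height)
    scale = max_side / longest
    return (round(width * scale), round(height * scale))

print(fit_within(4000, 3000))  # a 4000x3000 photo becomes (1200, 900)
```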
Going beyond the simple formatting of your images, you can also optimize them with text and descriptors to increase their chances of appearing in Google Image search. This won’t have a direct bearing on your domain authority or general SERP rankings, but can give you another outlet for search optimization. First, give your image an appropriate title; keep it short and simple, but relevant to what’s happening in the image. Then, include an alt tag (which isn’t a literal “tag”) that describes the image in more detail. Think about what a user would search for to find this image.
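Putting those pieces together, a well-described image might look like this (the filename and text are hypothetical):

```html
<!-- Hypothetical example: descriptive filename, title, and alt text -->
<img src="/images/golden-retriever-puppy.jpg"
     title="Golden retriever puppy"
     alt="A golden retriever puppy playing fetch in a grassy backyard">
```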
(Image Source: Yoast)
Next up, you’ll want to improve the performance of your site. The shorter your page loading time, the better; even a fraction of a second can make a significant difference. This isn’t as big of a ranking signal as some of the other factors on this list, but it is worth optimizing for, especially because of its peripheral benefits. When users click through to your site, they decide whether to stay within seconds of arriving. According to KissMetrics, 47% of consumers expect a web page to load in 2 seconds or less, 40% of people abandon a website that takes more than 3 seconds to load, and just a 1 second delay in page response can result in a 7% reduction in conversions. Make sure your content loads within that timeframe, or your bounce and exit rates will suffer—even if you’re sitting on a top search position. You can conduct a site speed test on your website using this tool from Pingdom.
This is a small ranking signal, but it’s worth optimizing for in part due to its surprising simplicity. Google introduced SSL encryption, a way of securing the information on your site, as a ranking signal back in 2014, and it may increase in significance as the years go on. Contact your hosting provider, and you can apply this encryption for a small additional fee, earning you the “HTTPS” designation and making your site more secure. Even if you don’t do this for the search rankings, it can keep your customers’ information safer. Note that I haven’t experimented with switching established domains/websites to HTTPS, as I’ve seen anecdotal reports of websites doing so and losing significant ground in the rankings. That’s why SEO.co hasn’t been switched over (our search traffic is great, and I don’t want to imperil it for a shot at marginal improvement). With that said, I would recommend that any new website or domain utilize SSL encryption. It may also be more important for websites that transmit data frequently, such as e-commerce sites where users must log in and input personal or credit card information to complete a transaction.
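If you do set up SSL on a new site, you’ll typically also redirect plain HTTP requests to the HTTPS version at the server level. As a sketch (assuming an nginx server and a hypothetical domain; your host’s setup may differ), that can look like this:

```
# Hypothetical nginx sketch: send all HTTP traffic to the HTTPS version
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://example.com$request_uri;   # permanent redirect
}
```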
Google hates to see duplicate content, for somewhat obvious reasons. If a chunk of text already appears somewhere on the Internet, why does it need to exist again somewhere else? Plus, it’s sometimes an indication of plagiarism. However, it’s possible (and, in fact, quite common) to have duplicate content on your site even if you’ve never plagiarized a word; sometimes Google indexes two separate versions of a single webpage, such as the HTTP and HTTPS version, leading it to “see” duplicate content where there isn’t any. You can use Google Search Console or a third-party tool such as SiteLiner to quickly and easily check for these errors and correct them by eliminating one version of the page.
Sometimes, there’s actually a justification for having duplicate content on your site. For example, you might be running two distinctly designed versions of a page that has identical content between those versions. If this is the case and you don’t want to be brought down by any duplicate content issues, your best bet is to use rel=canonical tags to resolve the issue with Google. These tags instruct Google which page should be categorized as the “canonical” or official version of the page and which one should be ignored; note that this is distinct from using the robots.txt file to ignore one page completely.
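In practice, the tag is a single line placed in the head of the duplicate version, pointing at the URL you want treated as official (the URL below is a placeholder):

```html
<!-- On the non-canonical version of the page -->
<link rel="canonical" href="https://example.com/products/widgets/">
```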
Next, you’ll want to make sure all of your content is well-organized in categories and subcategories. Create an ongoing list of your main blog topics, and assign at least one of those categories to each blog. Google is able to see this information and use it to figure out what your content is about; it’s also a valuable opportunity to showcase some of your target keywords and phrases.
This isn’t a huge ranking factor, but it’s something Google takes into consideration—plus, it’s a general best practice for optimizing a user experience. You should offer prominent contact information throughout your website, preferably with at least one obvious means of contacting you (such as a phone number in the header of your site). You’ll also want to create a designated contact page, with your company name, address, phone number, social media information, and a contact form at a minimum.
Google Search Console is a goldmine of information about how your site is performing and how it looks in search engines. It’s a Swiss army knife of diagnostic tools you can use to proactively identify any issues with your site that could interfere with your other ranking efforts. For example, Search Console can send you an alert when your site goes down, or you can get a first-hand look at how Google is currently indexing your site, making note of any erroneously indexed pages. Check this information often to stay on top of your site’s development.
This is especially important if you’re an e-Commerce platform selling products online. Give your users a voice by offering up customer reviews on various pages of your site. You can offer them the ability to give you a star or number rating, but the big draw here is giving them a platform to write their thoughts. This is a way of capitalizing on user-generated content (which will naturally be optimized for the types of products you sell), but you can also use microformatting to increase the chances that these reviews could be featured in SERPs directly.
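As a hypothetical sketch of that review markup in Schema.org’s JSON-LD format (the product name and numbers are invented for illustration):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "87"
  }
}
</script>
```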
On the surface, bounce and exit rates may seem like the same metric, but as explained by Google below, they’re actually distinct. Neither is a good sign for user experience; both imply that a user has left the site after visiting that particular page. A high bounce or exit rate could imply that the content on the page is unsatisfactory, and could play into how Google measures the relevance or authority of that page. Try to improve these rates by offering more unique, valuable content and by keeping users engaged for longer, such as with longer, more in-depth material.
(Image Source: Google)
The good news is, by decreasing your exit and bounce rates, you’ll likely increase the time a user spends on that page of your site by proxy. You won’t have to do much else to increase the time spent on each page of your site. Google takes time duration as an indirect measure of the value of the content of a page; for example, if you have a blog post that averages 30 seconds of visit time versus one that averages 10 minutes of visit time, the latter is clearly a superior piece.
For the most part, SEO is about attracting people to your site who have never heard of your brand before; optimizing for commonly searched queries is a way of getting in front of people who have otherwise never heard of you. However, it’s in your best interest to optimize for repeat visitors as well; publishing new updates frequently, encouraging users to come back for daily or weekly specials, and rewarding repeat customers with accumulating incentives can all help your strategy thrive.
Not all companies will want or need to pursue a local SEO campaign; however, it’s crucial for businesses that have a brick-and-mortar presence and rely on customer foot traffic. Google’s local algorithm works differently and separately from its national algorithm: when it detects a local-specific indicator in a query, it identifies the three most relevant and authoritative local businesses and displays them in what’s called the “Local 3-pack.” Chances are, Google will already know your location based on your business’s address and your presence in local citations (more on those later), but it could also be advantageous to optimize various pages and content entries of your site with local-specific keywords, such as the name of your city, state, or region. For help getting your business into the Local 3-pack, see our local SEO guide.