Technical SEO is split into two main worlds: onsite and offsite optimization.
Onsite optimization refers to all the structures, techniques, and strategies you need to implement on your website itself, including all your individual landing pages. Offsite optimization refers to everything that happens outside that world, including links pointing to your site, the social media activity of your brand and your users, guest posts, and so on.
This guide is exclusively about onsite optimization and improving SEO in areas where you have more direct control. I’ll be writing more about offsite optimization in future guides, but first, I want to explain the essence of onsite optimization, why it’s important, and what you need to do to be successful.
As I mentioned above, onsite optimization is about what you do on your own site. Onsite optimization can be broken down into individual tactics, all of which cumulatively impact your site’s visibility in search engines. Unfortunately, it’s not as simple as flipping a switch, or adopting one new habit—there are many different tactics you’ll need to adopt, in many different areas.
For starters, there are onsite optimization tactics that can be done only once—once you commit the change, you won't have to worry about it again (at least until something changes or breaks). Others require ongoing attention. Some are structural, impacting the design and layout of your site, while others are qualitative, requiring you to subjectively evaluate certain elements of your site.
I’ll be posting a quick-reference checklist of all the onsite optimization factors you’ll need to consider at the end of this article, but because these tactics are so diverse in nature, I want to make sure you understand the theory behind them as much as their raw implementation.
This guide is broken down into three main categories of tactics and techniques:
Let’s work on exploring each of these categories individually.
Think of Google as a massive library that offers books for people searching for various topics. The first step to getting your book found is making sure your book is on the shelf—so let’s get it there. Luckily, we’ve put together a complete guide on Google indexation.
Your first job is to make sure that Google’s web crawlers can access your site. Think of these bots as scouts that work on Google’s behalf to scour the web and index information. If these crawlers can’t see your site or can’t access it, Google won’t be able to index it.
There are a handful of reasons why this might be the case:
It’s also worth mentioning that there are multiple web crawlers out there—several specific to Google, and several belonging to other major search engines and tech companies like Bing and Apple. Here are some of Google’s most relevant ones:
Unless there’s something inherently wrong with your site or server, your site should be crawlable. In fact, it’s harder to stop Googlebot (and other search engine bots) from finding your site than it is to let them in. If your site is new, it might take a few days to a few weeks to make it into Google’s index, so don’t be alarmed if you aren’t popping up in search results right away.
Your robots.txt file is like an instruction manual for search engine bots, posted in your site’s top-level directory. It tells them which pages they should crawl and index, and which ones to avoid. By default, web crawlers index the entirety of your site, but there may be certain pages you don’t want indexed (e.g., pages with duplicate content).
Before doing anything on your site, a bot will check the reference:
www.yoursite.com/robots.txt
This file specifies a User-agent and, beneath it, the pages to block with Disallow directives. With the User-agent line, you can target specific bots (see the table in the previous section) or reference all bots, and each Disallow line then excludes a page or directory you don’t want crawled.
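For example, a bare-bones robots.txt might look something like this (the /private-drafts/ directory and the BadBot name are hypothetical, used purely for illustration):

# Let every bot crawl everything except one directory
User-agent: *
Disallow: /private-drafts/

# Block one specific bot from the entire site
User-agent: BadBot
Disallow: /

An empty Disallow: line under a User-agent means that bot is allowed to crawl everything.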
As a general rule, you only need to worry about this if you have canonical issues to resolve, or if there’s a page that might interfere with your primary SEO goals. Otherwise, you can leave your robots.txt file blank. Either way, double check your work to make sure you haven’t accidentally precluded all search bots from seeing your entire site—it happens more often than you might think.
One word of advice: don’t try to be sneaky by hiding bad or damaging material. Robots.txt instructions are publicly available information. You can see ours at SEO.co here:
If you’re concerned about the formatting or function of your robots.txt file, you can use Google’s free tester to check it out for possible errors.
Your URL structure can influence how your site is seen and how your pages are evaluated. Google favors sites that have clear, straightforward URLs that make it easier for users to navigate, along with descriptive text that tells Google what the page is about.
A good URL is static, concise, offers a breadcrumb trail for the blog, and accurately describes the content of the page with relevant keywords.
For example, our site structure used to include long URLs like this: https://seo.co/7-features-you-will-need-succeed-using-images-seo/
instead of something cleaner like this: https://seo.co/images/
There are actually two kinds of sitemaps you can offer for your site, and both are important for SEO. How important? That’s somewhat debatable, but it’s almost certainly worth the effort to create them.
A clean, effective sitemap is critical to optimizing on-page and on-site SEO.
HTML sitemaps exist for users and search engine crawlers, and they can usually be found in the footer of a website. It’s a good idea to make sure every page of your site links to your HTML sitemap, and placing it in the footer is the fastest and most reliable way to establish that.
XML sitemaps are a bit more technical, and you can upload them directly to Google via Webmaster Tools. Just head to the “Sitemaps” section and click “add/test sitemap” in the upper right-hand corner.
If there are any specific issues with your sitemap, Google will let you know.
Here’s a great example of one given by Sitemaps.org (an ideal resource for understanding more about XML sitemaps):
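If you build yours in that format, a minimal single-URL sitemap looks roughly like this (example.com is a placeholder for your own domain, and the lastmod, changefreq, and priority tags are optional):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

Each additional page on your site simply gets its own <url> entry.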
Keep in mind that your site is always changing—you’re almost constantly adding pages, removing pages, or changing pages, so work to keep your site maps up-to-date. If you need some additional help, there are many popular site crawlers available online—one of the most popular is Screaming Frog, which is free for up to 500 URLs.
Do you see anything like this on your website?
That’s bad. All your content should be able to load properly on any device, with any browser, on practically any speed of Internet connection. Your content should load directly from HTML (you don’t have to avoid AJAX or iFrames altogether, but the bulk of your content should come from HTML directly), and it should return no errors when users attempt to access it.
The reason for this should be obvious. Google wants to give people actual content—not blank spaces where content should be. Even if it wasn’t a search ranking factor, it would be an important user experience factor, so don’t neglect it.
If you’ve done any significant searching in the past few years, you’ve probably come across something like this:
Note the phrasing of the question and the purported answer, sectioned off from the rest of the search results. This is known as a “rich answer,” and it’s a part of Google’s Knowledge Graph. The Knowledge Graph isn’t a bank of information so much as it’s a network that taps into information on other websites. In this case, my query “how many US citizens are there” prompted Google to find the answer on the Wikipedia page “Demography of the United States.”
Unfortunately, Google can’t do this all on its own—it needs help from webmasters to properly categorize and submit information. For webmasters, this presents a valuable ranking opportunity—it won’t increase your domain authority, but it will give you the chance to have your information posted prominently above traditional results.
The way to categorize your information is through microformatting, sometimes referred to as structured data or structured markup. Basically, it’s a coding format you can use on your site to tell Google how to read information like events, people, organizations, actions, reviews, and many other archetypes. Since it gets technically complex (and warrants an article of its own), I won’t get into the details here, but Schema.org is a leading authority on microformatting and offers detailed information on how to apply it to your site.
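To give you a hedged sense of what it looks like in practice, here is a small JSON-LD sketch describing an organization; every name and URL below is a placeholder, not a required value:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com",
  "logo": "https://www.example.com/logo.png",
  "sameAs": [
    "https://www.facebook.com/examplecompany",
    "https://twitter.com/examplecompany"
  ]
}
</script>

This block sits in the page’s HTML and is invisible to users, but it tells search engines exactly what kind of entity the page describes.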
This technically isn’t going to help your ranks—at least not directly—but signing up for Google Analytics and Webmaster Tools is essential if you want to gain more knowledge about your site, proactively respond to pressing issues, and learn how your strategies are working. If you have a Google account, you’re already halfway there. Google Analytics will prompt you to create a new site and place a tracking script in your code, and Webmaster Tools will require you to verify your ownership by putting a short verification script in your code or verifying your webmaster’s email address.
I’ve already mentioned some of the onsite insights these tools can offer you, such as crawling your site and submitting a sitemap, and I’ll mention more, such as scouting for duplicate content and evaluating your meta data, but know that there are many more features to explore to improve your site.
Now that you know your site is properly indexed, let’s work on optimizing the individual pages of your site. You’ll have to apply these changes to each page of your site, so be sure to implement them for every new page as you add them.
Let’s talk about titles and descriptions. Check this out:
The above example is a search for “SEO.co” and naturally, we’re the first to appear. Take a look at the sections of the entry highlighted above. The headline, with the embedded link, is the title of this page, while the short description below it is the description or “meta description.”
Titles and descriptions play two main roles in the SEO world:
Accordingly, your titles and descriptions should both exhibit the following qualities for all pages:
With all that in mind, what makes titles and descriptions different?
Just a few things:
While we’re talking about titles and descriptions, don’t forget your header tags. Numbered in sequence (H1, H2, H3, etc.), header tags indicate the main points of your body content—almost like a table of contents. This can help search engines properly understand and index your content, as headers are weighted more heavily than standard body copy.
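To make this concrete, here is a hedged sketch of how a title, meta description, and header tags fit together in a page’s HTML; the topic and wording are purely illustrative:

<head>
  <!-- The title becomes the clickable headline in search results -->
  <title>Onsite SEO Basics: Titles, Descriptions, and Headers</title>
  <!-- The meta description becomes the short blurb beneath that headline -->
  <meta name="description" content="Learn how to write unique, descriptive titles and compelling meta descriptions for every page of your site.">
</head>
<body>
  <!-- Header tags outline the body content like a table of contents -->
  <h1>Onsite SEO Basics</h1>
  <h2>Writing Effective Page Titles</h2>
  <h2>Writing Compelling Meta Descriptions</h2>
</body>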
I’ve already gone over what makes a good URL (in the Indexation section above), so I won’t repeat myself. However, it’s important to remind you that each page should have a properly formatted URL, preferably under 90 characters. Keep this in mind whenever adding a new page.
Your on-page content tells Google much about your page. Though it usually serves as supplementary information to the more important titles, descriptions, and headers (see the two sections above), you shouldn’t neglect the on-page content for any page of your site. At a minimum, you should have 100 words of highly descriptive content. If you can’t offer that, you probably shouldn’t have a page there at all.
Content gives you three opportunities:
There are many factors for what’s considered “quality” content—far too many to list here, but these basics should get you started in the right direction.
One quick note—all content on your site should be unique (meaning it doesn’t appear anywhere else on your site). Sometimes, alternative URL forms (like http:// vs. https://) can cause Google to index one page twice and register that as duplicate content. This is bad news. Fortunately, it’s easy to detect and correct—take a look in Google WMT under Search Appearance > HTML Improvements and you can generate a list of duplicate content instances. From there, you can either use your robots.txt file (see above) to block one instance of each offender, or set up 301 redirects to properly canonicalize your links.
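Another common way to consolidate duplicate versions of a page (mentioned here as an additional option) is a canonical tag placed in the head of each alternate version, pointing at the URL you want indexed; the address below is a placeholder:

<!-- Placed in the <head> of every duplicate or alternate version of the page -->
<link rel="canonical" href="https://www.example.com/preferred-page/">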
It’s a good idea to include images wherever you can on your site. In combination with high-quality written content, these help convey to Google that you’re a high-authority site dedicated to bringing great content to your users. However, you can’t just stuff images all over your site and expect to rank higher.
There are two main ways images can increase your search visibility:
There are two main ways to optimize your images: through their title text and their alt text.
For example, take this picture of the Washington Monument:
A good title might be: “This Washington Monument photo illustrates how to optimize an image for SEO”
While good alt text might be: “Washington Monument against sky”
Notice how I’m not stuffing either of these with keywords, nor am I describing something that isn’t there. Something like “Monument SEO best practices and onsite optimization” wouldn’t serve me well (and probably doesn’t serve me well in the body of this paragraph, either).
In addition, your titles and alt tags should follow most of the general best practices I outlined for page titles and descriptions (namely unique, concise, descriptive, and compelling).
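In the page’s HTML, those two attributes sit directly on the image element, roughly like this (the file path is a placeholder):

<img src="/images/washington-monument.jpg"
     title="This Washington Monument photo illustrates how to optimize an image for SEO"
     alt="Washington Monument against sky">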
You can also optimize your images by making them a proper format (.jpg and .gif are popular standbys) and by making them smaller and easier to download (maximizing site speed—more on that later).
Most of your onsite content should include links to other pages, both internal and external.
Internal links are important because they establish connections between different areas of your site and make it easier for your users to navigate. The more tightly linked your site is, the happier Google will be. As a general rule, no single page of your site should ever be more than four clicks away from any other page.
External links are important because they show you aren’t just making things up—they’re your callouts to outside authorities.
For both types of links, it’s important that your anchor text is accurate and descriptive; don’t just name the page you’re linking to, and don’t try stuffing your anchor text with keywords.
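For instance, a descriptive internal link (with a placeholder URL and wording) might read like this, rather than a bare “click here” or a keyword-stuffed phrase:

<!-- Anchor text that tells users and crawlers what the destination page covers -->
Read <a href="https://www.example.com/blog/xml-sitemap-guide/">our guide to building an XML sitemap</a> for the full walkthrough.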
Shameless plug: at heart, we are a link building agency that provides direct link building for off-site SEO. It’s one of our expert SEO services.
How your site performs can also factor into your domain authority, which in turn influences how your pages rank in searches. These are generally secondary to factors like your site structure and onsite content, but can influence your final ranks.
Mobile optimization isn’t optional. Last year, mobile traffic overtook desktop traffic for the first time, and it still hasn’t stopped growing. If that wasn’t enough incentive, Google then released its “Mobilegeddon” update to reward every site with a fully functioning mobile site and punish those without one.
Optimized for mobile means your site:
There are a few ways to achieve this, but by far the easiest and most popular is through responsive design. Responsive design automatically flexes a site to accommodate any device that accesses it.
A good example is CNN’s site:
Look how the content is basically the same, but “stacked” on the mobile version to make it easier for mobile users to access. If you’re in doubt about whether your site is considered “mobile-friendly,” Google offers a free test you can use to find out.
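Under the hood, responsive design usually starts with a viewport meta tag and a few CSS media queries. Here’s a bare-bones sketch, with class names and breakpoints chosen purely for illustration:

<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .content { width: 70%; float: left; }
  .sidebar { width: 30%; float: left; }
  /* On narrow screens, stack the columns instead of placing them side by side */
  @media (max-width: 600px) {
    .content, .sidebar { width: 100%; float: none; }
  }
</style>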
Your site should be up most of the time. If it ever goes down, due to a server issue or maintenance, you should be aware of it and work quickly to restore it to normal. This should go without saying.
404 errors offer a bit more flexibility; these come into existence when one of your pages no longer exists (usually because it was deleted, renamed, or moved). 404 errors don’t hurt your ranks directly, but they can cause you some user experience woes—for example, if a user follows an old link or sees an old indexed page but only finds a 404 error, they may leave and never return.
There are two easy ways to “fix” a 404 error, and both are welcomed by Google:
That being said, there are some instances where leaving a 404 error alone is the best option, such as when the page is simply no longer relevant to your brand.
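One of the most common fixes is a 301 redirect that points the dead URL to the closest living equivalent. If your site happens to run on an Apache server (an assumption here, not a requirement), that can be a single line in your .htaccess file, with both paths below as placeholders:

# Permanently redirect a deleted or moved page to its closest living equivalent
Redirect 301 /old-page/ https://www.example.com/new-page/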
Site speed wasn’t a major ranking factor in the past, but that is quickly changing; today, site speed is critical. The faster your site loads, the happier your users will be, giving you a ranking bonus as well as a brand reputation bonus.
There are several ways to speed up the performance of your site, including:
Site speed is especially important for mobile users, as most mobile devices offer slower loading times than comparable desktop connections. Mobile users also tend to be more demanding, so every second here counts.
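As one concrete, hedged example: on an Apache server you can often enable text compression and browser caching with a few lines in .htaccess (your host’s configuration may differ, so treat this as a sketch rather than a drop-in fix):

# Compress text-based resources before sending them to the browser
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# Let browsers cache images for a month so repeat visits load faster
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
</IfModule>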
Keeping your site secure won’t give you much of an extra ranking boost, but it will be valuable to your users. Opt for SSL encryption (you can tell you have this by the “s” in https://), and your users’ data will be more secure. Still, https is a ranking signal, and it may grow in power as the years go on. You can purchase an SSL certificate through your hosting provider.
Before concluding this article, I want to mention something about CMSs. Most modern CMSs, including the ever-more-popular WordPress, offer built-in SEO features, some of which claim to optimize your site on your behalf or “automatically,” and others of which present these options in easier interfaces, such as allowing you to type in your titles and descriptions rather than embedding them in code.
Most of these tools are valuable time-savers, helping you reduce your margin of error and get your work done faster and more efficiently. However, don’t make any assumptions. It’s not enough to assume that your CMS “took care” of something for you. Run the tests yourself and don’t be afraid to dig into the code of your site.
I’ve given you a lot of information, so to make things easier, here’s an “ultimate” checklist you can use to make sure you’re optimizing your site effectively (split into sitewide and page-level sections).
Print it out and keep it handy:
Onsite searches can be highly valuable for improving your user experience, helping you understand user needs and behavior, and ultimately facilitating conversions on your site. While potential customers may be able to find your site easily, if they can’t find what they’re looking for on your site quickly, they may leave before they have a chance to interact with you.
Setting up an onsite search is relatively simple, and you might already have one prepared. However, perfecting your onsite search approach is a different, more intensive issue. Like with any marketing or user experience initiative, there is always room for improvement, and improving your onsite search function could significantly increase your onsite conversions.
The central premise is identical—to use an onsite search function to find something—but there are actually three different types of onsite search. Each one caters to a different type of user with a different intention, and you’ll have to change your search function to best suit its ideal type.
Destination-Oriented
Destination-oriented searches are all about getting somewhere. It could be a returning user looking for a specific page, or a social follower looking for a recent post. In any case, this search function needs to display the most relevant result as quickly as possible.
Information-Oriented
Information-oriented searches don’t have a destination in mind. Instead, they’re focused on getting to a page that discusses a certain subject. These types of searches require a function that displays lots of results throughout your site, starting with the most relevant to the user’s query. Finding the “perfect” match isn’t the primary goal; finding multiple viable options is.
Product-Oriented
Product-oriented searches are exclusive to e-commerce platforms, and incidentally, e-commerce platforms have the most to gain from improving their onsite search functionality. These searches involve a customer searching for a specific product or service, and the type of search results you display in response could dictate whether or not the customer eventually purchases from you.
The placement of your search bar will dictate how many people use it and how easy it is for them to find. The ideal flow is simple: a user wants to search, looks for your search bar, finds it immediately, uses it, and gets to their intended destination. Any break in this cycle could compromise your ability to ultimately convert that user. To make sure your search bar can be found easily, place it somewhere in the upper right-hand corner of your site; this is where most users look first. Also be sure that the search function shows up on every page of your site.
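At its simplest, the search bar itself is just a small HTML form pointed at whatever endpoint handles your search; the /search path and the q parameter below are assumptions, not a universal standard:

<form action="/search" method="get" role="search">
  <label for="site-search">Search this site</label>
  <input type="search" id="site-search" name="q" placeholder="Search...">
  <button type="submit">Search</button>
</form>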
Predictive search features considerably improve the functionality of your search bar. If users aren’t sure what they’re searching for, or only know a piece of the information necessary to perform a search, a predictive, auto-populating function can fill in the rest of the puzzle. This is also extremely useful for handling typos and misspellings. If a mistyped query returns no results, users may leave, but if your search bar corrects the query, you can avoid the problem altogether.
Filters are great for product-oriented searches, but you may not find them useful for other types of onsite search. Filters are essentially options that your users can toggle on and off when searching for a product. For example, if a user searches for “shirts,” pop-up filters can allow the user to refine that search based on shirt size, price range, gender, style, and color. The type of filters you include will vary based on your industry and the types of products that are most popular on your site.
This is especially useful for product-oriented searches, but any search that features pages or products in categories and subcategories can benefit from it. For example, if a user starts by searching for a category of products, then drills down into a subcategory, a breadcrumbs-style mini navigation at the top of the search results page can help the user get back to the beginning of the process easily. It can decrease bounce rates and recover possible sales when the user doesn’t immediately find what he/she is looking for.
Semantic search is a sophisticated search function that analyzes the intent behind a user’s query rather than taking the keywords at face value. This is becoming increasingly important as fewer people rely on keyword-based searches and more users rely on full phrases. If you can build semantic search functionality into your onsite search, you’ll be able to return more accurate, relevant results for long-tail user queries.
Once you’ve got your search functionality near-perfected, you can start reviewing the fruits of your labor. In Google Analytics, you can easily set up monitoring for your onsite search history. Once that’s in place, you can review onsite search trends, including popular user queries, bounce rates post-search, and how many people ended up converting after finding what they were looking for. You can use this data to further enhance your search features. Your onsite search function is more important than you might have realized, especially if you’re running an e-commerce platform. Make whatever improvements you can, whenever you can, and keep a close eye on your data to determine what changes you’ll need to make in the future.
Blogging remains an indispensable online marketing tool. Leaving a blog out of your online marketing is nearly out of the question at this point. With a blog, you provide continuous value to your users, offer solutions to their problems, and display your expertise to a potentially broader audience.
Blogging is also an excellent tool for SEO.
Google and most of the other search engines love blogs. When you properly optimize a blog post for search, it tends to rank almost immediately. The search engines especially love blogs that are constantly updated with high-quality, relevant posts. Recall that quality and content freshness play huge roles in search engine rankings.
Fortunately, it’s quite simple to rank a blog post. By following and implementing best practices for on-site optimization, you can boost your blog’s rankings. The key is to place keywords in the proper areas of each of your blog posts.
After determining the right keywords for your next post, keep in mind the key blog post sections below. They describe the areas where keywords should be placed.
The title (H1) is arguably the most important section to optimize for keywords. The title alerts not only users to the content of the post; more importantly, it alerts the search engine spiders. A rule of thumb for optimizing titles is to use your keywords early in the title and avoid repeating the same keywords within it.
Also, make sure the keywords read naturally. Don’t be too concerned about exact-match keywords: the search engines are working to deliver accurate search results based on latent semantic indexing. Consider the title “Roofing Los Angeles Tips for Homeowners,” which is optimized for the keyword phrase “Roofing Los Angeles.” It doesn’t read nearly as well as “Roofing Tips for Los Angeles Homeowners,” and ultimately, the former stands a good chance of getting slapped by Google for spam and quality issues.
Meta descriptions are no longer used to determine a site’s ranking, but they’re still useful for web surfers. Google and other search engines use the meta description as the text displayed in SERPs to offer users a sort of preview (not to be confused with rich snippets) of what the page is about.
So take advantage of the meta description tags by including the keywords your post is optimizing for. Also, it’s highly recommended that you use copywriting elements when writing meta descriptions to entice users to visit the site.
Assuming the focus of your blog post is just one general topic, you should stick to just one or two keywords, and identify the second keyword as a secondary keyword. This allows you to maximize traffic that is funneled to your post via the search engines. Focusing on several keywords diminishes the value of your post, and could raise spam flags.
It’s recommended to use keywords early in the post, in the middle, and in the final paragraph. Be careful not to over-optimize your post with the same keywords. Try to diversify the use of keywords by also utilizing related terms.
Also, when using keywords as anchor text to point readers to a related post within your blog, try to avoid overusing exact-match keywords. Diversify your anchor text by using related terms in your hyperlinks.
Adding images to blog posts can make them a lot more engaging. It’s also good for SEO. Images in posts can be keyword-optimized by adding alt text, which is also useful for those times when an image doesn’t load in a visitor’s browser.
More importantly, alt text is embedded in your page’s HTML code, where it’s picked up by Google’s spiders and added to the index. Most blogging platforms provide tools that make it easy for webmasters to add alt text to images. However, if your blogging platform doesn’t offer such a feature, check with your web developer.
Google loves to use rich answers—the bits of information you sometimes see immediately in search results, above and apart from traditional SERP entries. Rich answers are growing in prominence and importance, but Google relies on others to get the job done. Microformatting is a backend markup that feeds Google this information in digestible chunks, establishing a universal language that all webmasters and bots can follow. If you aren’t using it, you’re missing out on some serious potential search visibility (and leaving your users with less information, accordingly).
Historically, there have been some significant changes to how onsite optimization works—for example, a decade ago, it was neither imperative nor even appropriate to optimize your site for mobile devices. Today, having a non-optimized mobile site is archaic, and can significantly stifle your potential growth. However, by and large, most onsite optimization factors have remained consistent.
(Image Source: SearchEngineWatch)
The bottom line for onsite optimization is that it sets your site up for the search engine rankings you want.
So why are we on the verge of a potential disruption in the world of onsite optimization? There are three factors working together here:
With that being said, let’s explore some of the potential game-changers in the onsite optimization world, some of which could start having a massive effect on how we optimize websites as early as this year.
The first and potentially most significant trend I want to explore is the development of app-based SEO. Obviously, apps have permeated our society thanks to the popularity of mobile devices and the convenience of app functionality. Since apps don’t require the intermediary step of firing up a web browser, they’re becoming a more popular means of discovering online content and using online-specific functionality.
What does this have to do with onsite SEO? Everything.
First, it’s important to acknowledge the amount of app SEO already relevant to today’s users. Apps are starting to serve as an alternative to traditional websites, occasionally offering what websites can’t, but more often offering what websites already do in a more convenient, device-specific package.
The fundamental crux of app SEO is optimizing your app to be indexed by Google (and other search engines), much in the same way that onsite optimization ensures your website is indexed. For most apps, this involves setting up communication between your app listing and Google’s search bots, so Google can draw in information like your app name, a simple description, an icon associated with your app, and any reviews. Google can then provide your app (along with an “install” button) in SERPs whenever a user types in a relevant query.
(Image Source: Google)
There’s also an app SEO feature known as “app deep linking,” but I’m hoping there’s a catchier name for it in the near future. This functionality allows you to structure links that point to interior pages or screens of your app, giving Google the ability to link to those pages or screens directly in search results.
(Image Source: Google)
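To give a rough idea of the mechanics: Google’s app indexing documentation at the time described annotating a web page with a link to the matching screen inside the app, along these lines (the package name and path are placeholders, and the exact format may have changed since this was written):

<link rel="alternate" href="android-app://com.example.app/https/www.example.com/some-page/">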
There’s one limitation to this process: users must have the app already installed to see these deep links in their search results. But there’s a solution in beta!
Google’s latest brainchild is a functionality called “app streaming,” which allows users to access deep linked content within apps, and sometimes entire app functions themselves, without ever downloading the app to their devices. The premise is somewhat simple; Google hosts these apps, and allows users to use only the relevant portions of them, much in the same way that Netflix streams movies and shows as you’re watching them.
(Image Source: TechCrunch)
The concept is even expanding to advertising, which is great for companies that revolve around the use of mobile apps. Companies may allow for an in-results “trial” offer of their apps, giving users a chance to stream the app before they buy it:
(Image Source: SearchEngineLand)
So what does all this mean? It means that apps are developing their own “kind” of onsite optimization, unique from what we’re used to in traditional websites. For now, it might seem like a gimmick, but there’s reason to believe this change could be coming to all of us, sooner than we might think.
The most important factor to remember here is the way consumer trends are developing. Mobile traffic has rocketed past desktop traffic, and there are no signs of its momentum stopping anytime soon.
(Image Source: SearchEngineWatch)
App adoption is also on an upward trend, correlating strongly with mobile traffic data (as you might have predicted). Because of this, users will demand more app functionality in their search results (however those results might be generated), and search engines will do more to favor apps.
The most important question for this section is whether all these fancy app SEO features and rising app use could eventually replace traditional websites altogether. Conceptually, apps are just “better” versions of websites. They’re locally hosted, so they’re somewhat more reliable; they offer more unique, customizable experiences; they can be accessed directly from your device, sparing you the intermediary step of using a browser; and there’s nothing a website offers that an app can’t.
But just because apps “can” replace traditional websites, it doesn’t mean they inevitably will, especially with older generations who might be reluctant to adopt apps over the traditional websites they’ve known throughout the entire digital age. Still, even if apps don’t replace traditional sites entirely, they’ll still be significant players in how SEO develops in the future.
As a related note to this discussion, you may be wondering if your business “needs” to adopt an app, since they’re becoming so popular and influential in the SEO realm. The answer, currently, is no. Traditional websites are still used by the vast majority of users, and the cost of developing an app is often only worth it if you have a specific need for one as part of your business model, or if there’s significant consumer demand.
On another front of development are rich answers, sometimes referred to as instant answers, or Knowledge Graph entries. These are concise answers that Google provides users who search for a simple, answerable query, and they come in a variety of forms. They may be a few lines of explanatory text describing the solution to a problem, or a complex chart, calendar, or graphical depiction, depending on the nature of the query.
Take a look at these examples:
Note how the answer in the bottom example contains a citation, with a link pointing to the source of the information. Google draws all its Knowledge Graph information from external sources, and if yours is one of the contributors, you’re going to earn this visibility. Since users are getting the answers they’re looking for, you may not get as much traffic as an ordinary top position, but you will be the most visible in the results.
The most important optimization influencer here is the sheer increase in how many rich answers are provided. Google is developing rich snippet functionality for SEO at a fast rate because it understands the sheer value to users—getting the answer you wanted, immediately, without ever having to click a link, is the next generation of search engines. Just in the past year, there’s been a massive surge in the number of queries that are answered with rich answers, corresponding with Google’s increasing ability to decipher and address complicated user queries.
(Image Source: StoneTemple)
Personal digital assistants, too, are capable of providing more direct answers to users. So what does this increased ability to provide direct information mean for onsite optimization?
The first possibility is that structured data might become a ranking signal. Google and other search engines depend on websites to use a specific architecture, a structured markup, to provide information that can be used for such answers. Schema.org is a great resource for this, and even amateur coders can implement the markup on a site in relatively little time. Accordingly, Google may start rewarding sites whose pages adhere to it more completely, or that offer better information.
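For reference, that structured markup can be written either as JSON-LD (as shown earlier) or inline as microdata; here’s a hedged microdata sketch for an event, with every detail below a placeholder:

<div itemscope itemtype="https://schema.org/Event">
  <span itemprop="name">Example SEO Workshop</span>
  <time itemprop="startDate" datetime="2025-06-01">June 1, 2025</time>
  <span itemprop="location" itemscope itemtype="https://schema.org/Place">
    <span itemprop="name">Example Conference Center</span>
  </span>
</div>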
John Mueller addressed this recently:
(Image Source: SearchEngineLand)
There are a handful of factors to consider here that complicate the relationship of onsite optimization to rich answers:
The bottom line here is that directly provided answers are morphing the traditional SERP and the average user experience, and they’re changing what it takes for your site to be perceived as an authority.
The bottom line for search engines is making users happy, and they’re going to evolve as they learn more about what users want and need. Technologies are becoming advanced enough to draw in big data about huge swaths of users; this will soon make it possible for Google and other search engines to learn even more about how their users interact with sites. This, in turn, will force webmasters to adopt more onsite changes that favor beneficial user experiences.
Currently, user behavior serves as a peripheral ranking factor; longer time spent on a page is a general indicator of a high-authority or otherwise high-value site, while a higher bounce rate is an indicator of much lower authority. In the near future, Google may be able to treat even more specific usability factors as ranking signals, such as how quickly users scroll through a site, whether it appears they’re actually reading the content, and in what order they click your links.
User engagement factors may similarly come into play. For example, how quickly a user moves to leave a comment on your blog, or what other apps the user connects to may indicate how authoritative your site is.
These new features, combined with other applications of big data, will make onsite optimization more qualitative in nature. In addition to hitting the mark with the “fundamentals” (some of which are described in the next section), your site will be required to qualitatively please your user base, which will require significant testing and adjustment. For some webmasters, this is nothing new; it’s what’s required for conversion optimization, but soon, search engines may demand it.
So far, I’ve mostly been exploring how new technologies and trends will influence the development of new additions to the onsite optimization world. But what about the onsite optimization strategies that already exist? How are they going to be affected over the next few years? Will they remain the same? Disappear? Evolve? I want to take a quick look at some of the most important factors, and how they might develop with the times:
In an effort to stay ahead of the competition, you need to remain vigilant and keep watch for how these on-page trends develop. Overall, the changes in onsite optimization will reflect a change in the role of traditional websites in general. In the next few years, this change will manifest in three key areas:
It’s hard to say exactly when or how these changes will develop—app-based SEO is already alive and well, and companies are starting to take advantage of it for their businesses, but we’re not in any immediate danger of traditional websites going extinct yet. Technology tends to develop faster than most consumers and business owners anticipate, and you certainly don’t want to get left behind, so err on the side of caution by hedging your bets.
Invest in select new strategies you feel are pertinent for your site’s visibility, but don’t be too quick to abandon your old techniques. If I had to guess, these changes will probably manifest gradually over the next five years, so you have plenty of time to make your evaluations.
With that, you should have everything you need to get started with onsite SEO (or check to ensure you’re doing everything properly). If you can cross off all the items on this checklist for all of your pages (and keep that quality consistent as your site grows and changes), you can consider your onsite work nearly complete. Beyond these factors, your greatest concerns should be user experience and ongoing content quality—but those are topics for another post.
If you operate your own SEO agency and are looking for assistance with your next client project, please contact us about our white label SEO service where we provide everything from audits to content.