Do you want higher rankings in the search engines? If you’re managing your own search engine optimization (SEO) campaign, you need a checklist to make sure you don’t skip any components. SEO is a highly complex art that involves strategizing, tracking, in-depth research, testing, and more testing.
The checklist items outlined in this article will help you establish a strong SEO foundation: the right applications, the four pillars of SEO, and the essential on-site SEO tasks you need to perform.
The first thing you need to do is set yourself up with the right SEO software applications. While some of the following applications overlap in functionality, each serves a unique and specific purpose for SEO.
Formerly known as Google Webmaster Tools, Google Search Console is a free tool that helps webmasters monitor Google search rankings and troubleshoot certain technical problems. For example, you can see your web pages that rank high, but aren’t getting clicks. From there, you can use strategies to identify why you’re not getting clicks.
If you’re not familiar with the tricks used to identify and troubleshoot ranking issues, Ahrefs published a phenomenal, detailed explanation of how to use Google Search Console to identify and fix ranking issues.
Google Analytics will give you a plethora of marketing insights you can use in your SEO campaign. In general, Google Analytics can tell you which marketing channels will serve you best and which to avoid. You can also use this tool to narrow down further details about your target market.
The SEO-specific insights you’ll get from Google Analytics include:
Google Analytics is a free tool, and you can sign up for it on Google’s website.
Although Google dominates the industry, Bing is still a popular search engine. Bing holds 36% of the desktop search market in the United States, and it powers the voice search features behind assistants like Alexa and Cortana.
You need to make sure your websites are optimized for Bing. Like Google, Bing has its own free toolset for webmasters called Bing Webmaster Tools.
There are several key features provided by Bing Webmaster Tools, including the ability to submit URLs to be indexed in Bing and to request that Bing ignore certain URL parameters.
You can adjust your site’s crawl rate, which is great if you have heavy traffic during certain hours and want to schedule Bing’s crawl bot to crawl your site during off-peak hours.
There are a ton of other features pertinent to your SEO efforts, including:
For a full overview of what you can do with Bing Webmaster Tools, read this thorough guide published by Search Engine Land.
Even if you’ve already installed Google Analytics on your website, you need an independent analytics platform you can control.
Matomo offers several key features that you can’t get with Google Analytics, including:
Google Analytics is important to use, but don’t rely solely on GA for all of your SEO tracking needs. Matomo will fill in the gaps.
Semrush is great for finding low competition, high traffic keywords, which are the ideal keywords you want to target. If you don’t have any keyword ideas to start with, you can use the Semrush Organic Research tool to find your competitors’ keywords.
Filter the list of keywords in various ways to find low competition keywords. You can save up to 1,000 keywords to track at any given time.
Semrush offers plenty of other useful tools and reports that you’ll need on your quest to dominate the search engines.
WordPress is a great content management system, but sometimes it’s hard to control certain elements without additional applications (plugins). For example, with Yoast, you can control how your home page title appears, add meta descriptions, and specify post category taxonomies.
Yoast is easy to install and easy to use. The options for each page and post appear at the bottom of the post editing area, so you can apply settings globally or individually.
Before getting started with your SEO tasks, make sure you have the four pillars of SEO covered. Those pillars are:
All of these pillars will form the foundation of how you implement your SEO tasks. For example, you can’t optimize your page titles and descriptions without knowing what your market wants. Likewise, you can’t write SEO-optimized content without a list of targeted keywords.
Once you have these four pillars covered, you can perform the SEO tasks outlined in the second half of this article.
Keyword research is the process of discovering search terms and phrases people type into search engines related to your product or service. The goal is to come up with a list of keywords to target with your SEO campaign. While most people have gotten used to using Google’s Keyword Planner, it’s not a good keyword research tool.
Effective keyword research requires digging beyond what you’ll get from Google’s Keyword Planner (GKP). GKP is a great tool for generating keyword and phrase ideas, but it won’t give you accurate search data; the data it provides is often general and vague.
There are 5 basic categories of data that make keyword data useful:
These five types of data are hard to obtain from Google’s Keyword Planner, if you can get them at all. For example, GKP search volumes are estimates, not exact counts. On top of that, you won’t get any keyword data for shopping queries, since advertisers aren’t able to bid on product keywords to be visible in Google’s shopping results. In other words, you can only research keywords used on the advertising platform.
One eCommerce business owner did some experimenting and discovered the harsh truth about Google’s Keyword Planner tool. The business owner ran text-based ads and a shopping campaign for the same product, and there was a huge discrepancy between the actual and estimated search volume for certain keywords between the two campaign reports.
Furthermore, the AdWords report listed an 8-word keyword phrase getting nearly 500 impressions per month, yet that phrase wasn’t even listed in the Google Keyword Planner tool. If that business owner had relied on GKP to generate keyword phrases, they would have missed out on that highly successful 8-word phrase.
The most important keyword research data you could possibly get is a list of low competition keywords. Unfortunately, as the eCommerce business owner discovered, you can’t get that information from Google’s Keyword Planner.
If you want to get results in the search engines, you need to go after the low-hanging fruit. The low-hanging fruit often includes long tail keywords like the 8-word phrase that never appeared in Google’s Keyword Planner tool.
Low-hanging fruit is more valuable than you think. It’s even more valuable than keywords with a high search volume. Keywords with a high search volume tend to have more competition. In other words, it’s generally harder to rank for keywords with a high search volume. Although you can rank for highly competitive keywords with a large marketing budget, most people can’t.
Keep in mind that SEO advice varies and you’ll get different numbers and ranges from other people, but in general, a keyword is worth ranking for when it gets a minimum of 50 searches per day. Now, that may not seem like enough searches, but if the particular keyword is highly targeted or is a buying keyword, those 50 daily searches have the potential to turn into 50 daily sales.
Not all low-hanging fruit keywords are worth ranking for. Knowing the difference between the 3 basic types of keywords is the secret to choosing the right low-hanging fruit.
There are three categories of keywords:
General keywords. These are extremely basic keywords like a product name, a product function, brand names, and general nouns. For example, cars, hair removal, Kleenex, and electric bicycle are all general keywords.
Buying keywords. Buying keywords are what people type into the search engines when they’re actively looking to buy a product. For example, San Francisco chiropractor, cheap deck screws, buy potatoes online, and used cars under $5,000 are all buying keywords.
Research keywords. Research keywords are keywords people search for when they’re still researching a product or brand and haven’t decided to buy. For example, research keywords include phrases like:
“Buying” keywords are the most lucrative in the immediate term because people are already looking to buy. If you can rank for buying keywords, and you have effective sales copy, your ranking will be profitable.
Ideally, you want to target the low-hanging fruit among buying keywords because they have the most potential to produce a sale. Don’t go after equally obscure research or general keywords unless you plan on creating a ton of content for educational purposes and using remarketing to keep drawing visitors back to your website.
You can’t get started with SEO until you complete a decent amount of market research. Market research tells you who your target market is in terms of demographics along with what motivates them. No matter how basic your product is – like a frying pan – you need to target a specific market or niche.
Once you have enough information about your market, then you can choose the best keywords to target. Remember, not all of your keywords and phrases will be related to your product features. Many of your keywords will be related to the problems people want to solve with your product. That’s why having a specific target market is important.
You won’t reach the correct market without market research. Market research will help you narrow down specifics about your market. For example, your target market might be 20-40-year-old females with a college degree and at least one child under the age of 18. Once you know these details about your market, you can research what’s important to them and where they struggle. This will help you find those long tail, low-hanging fruit keyword phrases.
The best way to perform market research is to hand the task off to a professional marketing agency, although you can perform your own market research if you’re up for the task.
Once you’ve completed extensive market research to find out more about your target market, you’ll want to dive into researching your competitors. Who are your competitors targeting? Are they targeting a different segment of your market? Where are they generating backlinks? What type of ads are they running in print and online?
Here are some ways to research and analyze your competition:
You don’t need to copy your competitors; you can’t know if their methods are effective. However, they might be utilizing a marketing tactic you haven’t tried.
There are several paid tools that are worth the money, but if you’re looking for a free tool, use the backlink checker from SEO.co. Just type in a competitor’s domain name and you’ll be given a substantial amount of free results.
Visit the pages where your competitors’ links are published and see if you might be able to get a link from the same source. If it’s a blog, you can probably talk to the webmaster or editor to see about submitting an article for publication.
If you notice the link is on a professional site, like for a business, pay attention to what content is linked. If it’s statistics, data, reports, or any other type of valuable data, you might be able to get a link from the same (or similar) source.
You’d need to create content similar to the data published by your competitor without copying their content. For example, say you’re in the food industry and your competitor published a white paper on the health benefits of eating potatoes. You could publish a white paper on the benefits of eating quinoa. The idea is that if your competitors are creating valuable resources that are getting linked, you’ll want to start creating those valuable resources, too.
Ranking high in the search engine results pages (SERPs) only matters if users click on your link. You could own the top spot for every keyword on the planet, but if users don’t click, your rankings are meaningless.
Search engines like Google, Bing, and Yahoo display the title and description of each search result. This information helps users decide to click (or not). Writing an effective title and description will get you more clicks.
Ideally, you want each page description to accurately describe the page content as briefly as possible. However, describe the content in terms of what the user will get out of their visit.
For example, say you run a historical website documenting the history of China with a page about the 13 dynasties that existed prior to the shift in rulership. There are two ways you can write your webpage description:
“This page contains a list of all Chinese dynasties.”
“Learn about the 13 dynasties that ruled China from 2070 B.C. to 1912.”
The first option is informative and accurate, but it’s not engaging or enticing. The second option is still basic, but would generate more clicks in the SERPs.
Google truncates meta descriptions after 160 characters, so keep your descriptions short and to the point. However, make sure your descriptions are long enough to tell the reader what your content is about. Your meta description is an opportunity to generate clicks, so make it count.
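In HTML, the title and meta description live in the page’s <head>. A minimal sketch, reusing the dynasty example above (the wording is illustrative):

```html
<head>
  <!-- Shown as the clickable headline in the SERPs -->
  <title>The 13 Dynasties That Ruled China | Your Site</title>
  <!-- Shown as the snippet below the headline; keep it under ~160 characters -->
  <meta name="description"
        content="Learn about the 13 dynasties that ruled China from 2070 B.C. to 1912.">
</head>
```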
If you want all of your web pages and files to get indexed in the search engines, you probably don’t need a robots.txt file. However, if you have content you’d prefer to stay out of the search engines, you can use a robots.txt file to instruct search engine spiders to ignore certain content.
Blocking content comes in handy when you have hundreds or thousands of files that don’t provide value on their own and would clutter up the SERPs if indexed.
If you’re using a content management system like WordPress, you already have a robots.txt file. If you don’t already have a robots.txt file, it’s easy to create one.
Using a robots.txt file to block search engines from crawling and indexing certain files and folders on your server is a request, not a guarantee. For that reason, don’t use your robots.txt file to hide folders containing private data; search engines might not honor your request. Secure sensitive data through encryption and passwords instead.
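A robots.txt file lives at the root of your domain. A minimal sketch that asks all crawlers to skip a hypothetical folder of low-value files while pointing them at your sitemap (the folder name and domain are placeholders):

```
User-agent: *
Disallow: /pdf-archive/

Sitemap: https://yourdomain.com/sitemap.xml
```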
If you’re using WordPress, your robots.txt settings are automatically updated based on certain configurations made inside of your admin panel. For example, you can check a box that will request that search engines not index your website.
This setting is great when your site is in development and you don’t want anyone accessing incomplete content through a search. However, if you don’t uncheck the box when your site goes live, your site won’t get indexed and people won’t be able to find you.
An XML sitemap tells search engine crawlers what pages you think are most important and offers information about each page. For example, when the page was updated last, how frequently the content changes, and if there are other versions of the page in different languages.
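Each of those fields maps to a tag inside a <url> entry in the sitemap file. A sketch with one hypothetical page:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/chinese-dynasties/</loc>
    <lastmod>2021-06-01</lastmod>
    <changefreq>monthly</changefreq>
  </url>
</urlset>
```

Alternate-language versions of a page are declared with additional xhtml:link entries inside the same <url> element.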
After search bots crawl hyperlinks and previously discovered pages, they start crawling XML sitemaps. Having a sitemap increases the chances of getting more of your web pages indexed in the search engines. Many companies have experienced a sharp increase in traffic just by adding a sitemap.
Create an XML sitemap for search engines and submit that sitemap to each search engine through your webmaster tools/search consoles. If you already have a sitemap, use our free sitemap validation tool to make sure it’s optimized for search engines.
A detailed, organized sitemap made for visitors will help them find more of your content. A visitor sitemap can also help the search engines determine what your website is about in a broader context.
While there’s no direct correlation between creating a visitor sitemap and ranking higher in the SERPs, it will contribute to the overall success of your site.
You might be surprised to learn that some sites don’t get indexed because crawlers are blocked. Sometimes, entire domain names have been blacklisted from Google. Are your pages getting indexed? Is your site blocking crawlers? If you’re not getting much traffic from your SEO efforts, your site might be blocked.
Perform a quick site search in Google to see which pages have already been indexed. Type the following queries into the search bar, one at a time (substituting your own domain; these formats are illustrative):

site:yourdomain.com
site:www.yourdomain.com
site:yourdomain.com/some-page

Any pages that don’t show up are not indexed. If no pages show up even after all three searches, you need to do a little more investigating. Semrush offers a good free site audit tool for this purpose.
There are four different versions of your website, all of which are treated as separate websites by search engines:

http://yourdomain.com
http://www.yourdomain.com
https://yourdomain.com
https://www.yourdomain.com
On the back end, you should have one version designated as your primary site and direct all other versions to that primary site.
If Google has indexed multiple versions of your site, it’s going to affect your rankings. For example, if you run a content marketing campaign with links to http://yourdomain.com, only that version of your site will get the “link juice.” If your primary site is actually http://www.yourdomain.com, you’ll have to run a separate campaign to rank pages under that version.
Do a site search in Google for all 4 domain formats listed above. If you get results for more than one domain format, talk to a website developer about designating a primary version and redirecting all others to that primary version of your domain.
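On an Apache server, for example, the redirect is often handled in .htaccess. A sketch that 301-redirects every non-primary variant to https://www.yourdomain.com (adjust the host to whichever version you designate as primary):

```apache
RewriteEngine On
# Redirect any request that is not HTTPS or not on the www host
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.yourdomain.com/$1 [L,R=301]
```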
Are search engine spiders able to crawl your site? Do they stop short of completing their task? You won’t know unless you look into the matter.
You can find crawl errors through Google Search Console. Some of the most common errors occur when spiders hit 404 errors or pages that haven’t been properly canonicalized.
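Canonicalization problems are usually fixed by adding a rel="canonical" tag to the <head> of each duplicate page, pointing at the preferred URL (the URL below is a placeholder):

```html
<link rel="canonical" href="https://yourdomain.com/chinese-dynasties/">
```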
Page depth refers to the number of clicks required to reach a given piece of content when that content isn’t directly linked in the main menu.
The deeper a search engine spider has to go to find your content, the more of its allocated “energy” it uses up. If visitors need to click through 3 or 4 pages just to get to a piece of content, that’s probably too deep for search spiders to reach without exhausting your site’s crawl budget.
This doesn’t mean you can’t create breadcrumb navigation that goes 10 links deep. Just don’t make a long chain of links the only way for visitors, including search spiders, to access your content.
After you’ve gone through your website to troubleshoot technical errors and create the right settings, it’s time to implement some technical SEO.
No matter how much content you already have on your website, you’ll need to spend time optimizing that content using your ‘money’ keywords. These are the keywords with a high search volume and low competition. Usually, they’re long tail keywords.
Create a strategy for optimizing your existing content. Make a list of all URLs you can optimize and check them off your list as you complete each one. Use your money keywords wherever you can naturally fit them into your content. If you can’t do it naturally, create a new paragraph or edit your content to allow for using those keywords.
Sometimes it’s hard to retrofit keywords into existing content. You won’t have that issue when creating new content. You can optimize new content from the start. You can even create articles and informative pages that center on your best keywords.
Search intent matters more than ranking because a user’s intent will determine their action once they arrive on your page. When the content delivered matches a user’s intent, you’re more likely to get a conversion. When the content delivered doesn’t match a user’s intent, that visitor is more likely to bounce.
You can get a web page to rank for a keyword, and generate clicks, but your content might not serve those users. In that case, you’ll get traffic, but no conversions.
For example, say you run a website that talks about atmospheric water generators and you rank high in the SERPs for related keywords. Now, say a user intends to buy an atmospheric water generator and clicks on your site. If you don’t sell AWG units, you’ve lost a conversion.
If something like this happens to you, the question to ask yourself is why would someone believe my site sells AWGs? It could be a misleading web page title or meta description, or it could just be the user’s inference. You can’t always pinpoint every user’s intention, but with product pages it’s much easier.
Ensuring that your content aligns with user intent is imperative to generating conversions from your SEO efforts.
There are two reasons to switch to HTTPS: better rankings and more security. HTTPS adds a layer of security to a website that encrypts information being submitted through forms, including email signup forms.
Since security is a top priority for Google, it added HTTPS as a ranking signal back in 2014. Today, HTTPS remains a confirmed ranking factor and is officially the standard in website security.
Multiple website owners have reported significant increases in rankings and lead generation just by switching to HTTPS.
The method for moving your site over to HTTPS will depend on how your site is set up. First, you’ll need to get an SSL certificate. You might already have one; a few years ago, several popular hosting companies started giving out free SSL certificates to clients.
If you don’t have an SSL certificate or you aren’t sure how to set one up, hire a developer. They’ll get it done quickly and efficiently. Once you have HTTPS set up, though, you’ll still need to do some editing throughout your site.
Comb your site for http links and change them to https. If your site runs on a database-driven CMS like WordPress, you can do this easily with a plugin; a popular choice for this job is Better Search Replace, which lets you search your database for text strings and replace them.
When all of your content is stored in your database, searching the database for instances of http:// will bring up all the instances of links using http. However, it will return results for external links as well. You don’t want to change your external links because those sites might not be using HTTPS.
If your site is huge and you have a lot of external links to sift through, use the search and replace function to look for strings that include your own domain, like this: http://yourdomain.com
The difference is, searching for ‘http://’ will pull up instances of every external link on your site. If you switch those to https://, the links might not work. When you include your site’s URL, your search results will be narrowed down to links to your own domain.
If you’ve used relative links for your internal links, you may not get any results. In that case, you’d have to search for individual page names like “page.html” or “page.php” to find out where they’ve been linked.
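The domain-scoped replacement described above can be sketched in a few lines of Python. This is an illustration of the logic, not the plugin’s implementation; the domain and markup are placeholders:

```python
import re

def upgrade_internal_links(html, domain):
    """Rewrite http:// links to https://, but only for the given domain.

    External links are left alone, since those sites may not support HTTPS.
    """
    pattern = re.compile(r"http://(www\.)?" + re.escape(domain))
    return pattern.sub(lambda m: "https://" + (m.group(1) or "") + domain, html)

# Hypothetical markup mixing internal and external links
page = (
    '<a href="http://yourdomain.com/about/">About</a> '
    '<a href="http://www.yourdomain.com/blog/">Blog</a> '
    '<a href="http://other-site.example/">External</a>'
)

# The two internal links become https; the external link is untouched
print(upgrade_internal_links(page, "yourdomain.com"))
```

Running a dry pass like this over an exported copy of your content is a safe way to preview what a plugin such as Better Search Replace will change.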
Speed is important for SEO. Slow websites get deranked and frustrate users. Optimizing site speed is a critical SEO task, but there’s more than one type of speed you need to be concerned with.
Neil Patel’s team analyzed 143,827 URLs and found that “top-ranked websites have high site speeds in the ‘start rendering’ metric,” and corroborated existing research that “top-ranked websites have faster site speeds for page load time to first byte (TTFB).”
All of the elements on your page need to load quickly for your users, but you also need a fast TTFB. Time To First Byte (TTFB) is a metric that measures how long it takes a user to receive the first byte of information from a website.
Google recommends that a website’s TTFB should be 200ms or less.
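You can measure TTFB yourself with a few lines of Python’s standard library. The sketch below spins up a throwaway local server so it is self-contained; in practice you would point measure_ttfb at your own host:

```python
import http.client
import http.server
import threading
import time

def measure_ttfb(host, port, path="/"):
    """Seconds from sending the request until the first response bytes arrive."""
    conn = http.client.HTTPConnection(host, port, timeout=10)
    start = time.perf_counter()
    conn.request("GET", path)
    response = conn.getresponse()   # blocks until the status line (first bytes) arrives
    ttfb = time.perf_counter() - start
    response.read()                 # drain the body before closing
    conn.close()
    return ttfb

# Throwaway local server so the example runs anywhere;
# replace with your own host and port 80/443 in practice.
server = http.server.HTTPServer(("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

ttfb = measure_ttfb("127.0.0.1", server.server_address[1])
print(f"TTFB: {ttfb * 1000:.1f} ms")   # Google's guideline: 200 ms or less
server.shutdown()
```

On the command line, curl reports the same number via curl -o /dev/null -s -w '%{time_starttransfer}\n' https://yourdomain.com/.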
The best way to optimize site speed is to run a free site speed test to find opportunities for improvement. If you’re not sure how to fix the speed issues that turn up in your test results, you can hire a web developer to fix it for you.
Missing pages, designated in browsers as 404 errors, can kill your rankings. It’s critical to identify and fix all 404 errors so that search engines don’t start to view your website as unreliable.
The easiest way to discover 404 errors is to use Google Search Console to check for crawl errors. You’ll get a report that details requested pages that don’t exist: URLs people clicked on or typed into their browser that no longer exist on the server.
If you’ve ever transferred your website to a new platform or moved or deleted content, you probably have some 404 errors. Moving to a new CMS is one of the most common causes of 404 errors.
Let’s say you built your website using HTML and CSS and all of your pages are formatted as pagename.html. Say you spend a lot of time getting several pages to rank high in the SERPs.
Now say a few years later you hire someone to transfer your website over to WordPress. You can use the same names for your URLs, but on WordPress, URLs don’t end in .html. If you don’t create 301 permanent redirects for every single web page on your entire domain name, all of your links in the SERPs will be 404 errors.
Don’t let this happen to you! Find all 404 errors on your website and then fix the errors with 301 permanent redirects. Once your site is set up with redirects, create a system for documenting every page that gets deleted, renamed, or moved so that you can create a redirect immediately. Ideally, you should avoid renaming, deleting, or moving content to use as few redirects as possible. However, it’s okay when you don’t have any other options.
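On Apache, for instance, those 301 redirects can live in .htaccess. A sketch covering the .html-to-WordPress scenario described above (paths are placeholders):

```apache
# One explicit rule per moved page...
Redirect 301 /about.html https://yourdomain.com/about/

# ...or a pattern that strips .html from every old URL
RewriteEngine On
RewriteRule ^(.+)\.html$ /$1/ [L,R=301]
```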
Does your website generate URLs full of numbers, letters, hashtags, and punctuation? If so, your URLs aren’t search engine friendly. Here’s the difference (the URLs are illustrative):

Search engine unfriendly URL: https://yourdomain.com/index.php?id=4482&cat=7

Search engine friendly URL: https://yourdomain.com/blog/seo-checklist/
Search engine friendly URLs make URLs readable for everyone. URL readability is important to both users and search engines. For users, a URL tells them what to expect when they arrive on the page. For search engines, a URL can provide clues about the page content and context.
Friendly URLs will get you more clicks in the search engines. Normally, your page URL is displayed right underneath the linked page title. If your URL contains random character strings, users will be less likely to click.
Google now often displays a simplified, breadcrumb-style version of the URL instead of the full address. However, other search engines, including Bing, still show the full URL in the SERPs.
Unfriendly URLs can also affect your ability to get clicks from your content marketing efforts. If you published unfriendly URLs as backlinks, you’ll get some link juice, but you may not get much traffic from users who preview the URL before clicking.
Here’s a helpful guide that will teach you how to create search engine friendly URLs.
Structured data, also known as ‘schema markup’, is markup that tells a search engine what the content of a page is about. With this additional information, your page can appear in the SERPs for more relevant searches and becomes eligible for rich snippets, carousels, and knowledge boxes displayed in search results.
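Structured data is most often added as a JSON-LD script in the page’s <head>. A sketch marking up a hypothetical article (names and dates are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "The 13 Dynasties That Ruled China",
  "datePublished": "2021-06-01",
  "author": { "@type": "Person", "name": "Jane Doe" }
}
</script>
```

You can paste markup like this into Google’s Rich Results Test to confirm it parses correctly.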
To learn more about what types of structured data you can use on your pages, head over to schema.org.
Did you optimize your titles and meta descriptions when you published your content? Even if you started off with amazing titles and meta descriptions, it’s useful to revisit each page to see if you can strengthen your copy.
If you’ve found better money-making keywords since you first published your pages, you may want to include some of those keywords.
Alt text (alternative text) tags are attached to images for the purpose of informing users and search engines what the image is about. Alt text is also displayed on a page when your actual image can’t be displayed.
Alt text helps visually impaired users understand your content, so your alt descriptions should be as descriptive as possible without being too long.
If you’re creating your website using HTML and CSS, you’ll need to define your alt tags in the image markup. For example:
<img src="/images/horse.jpg" alt="black horse eating grass" />
In WordPress, make it a habit to set your alt text the moment you upload an image. If you haven’t been adding alt tags, go back through your media library and add alt tags to all of your images. If it seems like too much work, outsource the task to a freelance developer.
Adding internal links helps to pass “link juice” or “link power” throughout your website. It’s not an exact science, since search engines don’t divulge exactly how much power you get from internal links. However, internal links are important.
Go through each of your pages and blog posts to identify places you can put internal links. Here are some basic rules for internal links:
In addition to helping pass on link equity, internal links help search engines crawl and index more of your site.
In the early days, Google rankings were heavily influenced by keywords contained within heading tags. Today, putting keywords in your headings is not a ranking factor. However, headings tell the search engines what your content is about, so it’s still critical to optimize your headings.
Your site should already have heading styles defined with CSS for H1, H2, H3, H4, H5, and H6. Use real heading tags for your content headings rather than just styling text with larger font sizes and bold.
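In practice, that means marking up your document outline with nested heading tags rather than styled text. A sketch using the dynasty example from earlier:

```html
<h1>A History of Chinese Dynasties</h1>
  <h2>The Xia Dynasty</h2>
  <p>...</p>
  <h2>The Shang Dynasty</h2>
  <p>...</p>
```

Keep one h1 per page and don’t skip levels; search engines read the heading hierarchy as the structure of your content.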
Creating and optimizing your Google My Business (GMB) listing is one of the most important technical SEO tasks you could perform. If you run a local business, whether you have an online presence or a brick-and-mortar store, you need a GMB listing.
In Google, GMB listing results are displayed prior to organic results. When your listing comes up in search results, you can catch people’s attention before they choose a competitor’s listing or keep scrolling down the page.
Many businesses have found it’s easier to rank their GMB listing for local search than it is to rank in the organic results for their industry. You might have major national competitors who dominate the keywords in the SERPs, but locally, you could dominate for those same keywords with your GMB listing.
If you have content that doesn’t rank and doesn’t provide value to your visitors, delete that content. Content that doesn’t rank or provide value isn’t worth keeping. You don’t want search engine spiders wasting their crawl budget on web pages that aren’t supporting your business.
Last, but not least, let’s dive into the mobile-friendly technical SEO checklist.
Not that long ago, Google searches from mobile and desktop devices returned different results. Today, Google returns results for every device from the mobile search results index. This is called “mobile-first indexing.”
It’s a pretty simple system. When Google has two versions of a website in the index, it will favor displaying the mobile-friendly result.
Rumor has it, Google might start to de-rank websites that aren’t mobile-friendly, although this has yet to be defined or even confirmed. However, it’s not hard to see where the future of SEO is going. There will come a time when sites will lose power if they aren’t easily accessed on a mobile device.
Use Google Search Console to discover mobile usability issues with your website. GSC will tell you if your text is too small, if clickable elements are too close together, and if there were any problems loading your pages.
Giving search engine spiders full access to your site makes it easier for them to determine whether your site is mobile-friendly.
There are pop-ups… and there are intrusive pop-ups. For mobile users, most pop-ups are intrusive. This creates a dilemma, because despite their reputation for being annoying, pop-ups are extremely effective.
In 2017, Google rolled out an update that makes it harder for sites to rank when pop-ups are intrusive for mobile users. Google provided several examples of pop-ups that are considered intrusive. For the most part, intrusive pop-ups cover the main content or require a user to dismiss the pop-up notice.
Pop-ups for age verification and cookie acceptance are exempt, but all other pop-ups should take up only a minimal amount of space on the screen.
To create a mobile-optimized website, you need to see what your site looks like on a variety of mobile devices.
Today, there are countless devices and operating systems, so you’ll need software to run your tests. However, you probably don’t need to test every single possible OS and device combination. You can use mobiReady to preview your site on four devices: a desktop, a high-tier phone, a mid-tier phone, and a low-tier phone.
When most users accessed the internet from desktop devices, it was standard practice to create a separate website for mobile users. Today, most people use a mobile device to access the internet. While you can still create separate websites that automatically detect the user’s device, that’s unnecessary.
Instead of having to update two versions of your website, you can use responsive design to make your site shrink or expand to fit any device’s resolution.
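As a rough sketch of how that works: a responsive layout pairs fluid containers with CSS media queries, so the same HTML serves every device. The class names and 600px breakpoint below are just illustrative, not a standard.

```html
<!-- One page for every device: the layout adapts instead of redirecting to a separate mobile site -->
<style>
  .cards { display: flex; gap: 16px; }
  .card  { flex: 1; }

  /* On narrow screens, stack the cards vertically instead of side by side */
  @media (max-width: 600px) {
    .cards { flex-direction: column; }
  }
</style>

<div class="cards">
  <div class="card">First article teaser</div>
  <div class="card">Second article teaser</div>
</div>
```

Because the breakpoint lives in CSS, there’s only one site to maintain, and search engines crawl a single URL per page.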
If you mostly use a desktop computer or laptop, you may have noticed that font sizes have gotten larger over time. This is intentional. Font sizes have gotten bigger to ensure readability for mobile users.
If your text is small, crank it up a few points and make it large enough to be easily read from a mobile device.
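A minimal sketch of mobile-friendly typography: set a readable base size and scale everything else relative to it. The 16px base reflects common readability guidance, not a Google requirement.

```html
<style>
  /* 16px is the default in most browsers and a sensible floor for mobile readability */
  html  { font-size: 16px; }
  body  { line-height: 1.5; }

  /* rem units scale from the base, so one change adjusts the whole site */
  h1    { font-size: 2rem; }
  small { font-size: 0.875rem; } /* avoid going much smaller than this on phones */
</style>
```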
There are two reasons to get rid of Flash: it was never supported on most mobile devices (iOS never ran it at all), and Adobe officially retired Flash at the end of 2020, so it no longer receives updates of any kind.
Pages that depend on Flash will suffer under mobile-first indexing, and Google stopped indexing Flash content altogether in 2019. Unpatched Flash is also a serious security risk. Don’t use Flash on your site, and uninstall Adobe Flash Player from all of your devices.
Have you heard of Accelerated Mobile Pages? AMP is an open-source web page format created by Google for building websites that are “compelling, smooth, and load near instantaneously.” AMP pages can load several times faster than even the most optimized standard web pages.
Right now, there’s no indication that Google will rank AMP sites higher. However, as long as speed is a major ranking factor, AMP sites will at least get ranking benefits for being fast.
The other cool thing about AMP is that it can earn you more clicks in the SERPs, where Google displays a lightning bolt icon next to results that use the AMP format.
AMP sounds like the ideal SEO solution in terms of speed, and for many businesses it’s a wonderful way to reach mobile visitors. However, AMP pages are severely limited in functionality. For example, you can’t run arbitrary JavaScript, interactive elements like pop-ups and lightboxes are restricted, and custom CSS must be kept small and inline. This makes sense given that AMP pages are designed for speed.
The other catch is that Google often serves AMP pages from its AMP Cache at a google.com/amp/ URL. Link signals are supposed to consolidate to your canonical page, but when users copy and share the cache URL, those links point at google.com rather than your own domain.
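This is why AMP pages are always paired with canonical markup: the standard page points to its AMP version, and the AMP version points back, so search engines can consolidate signals on your own URL. The URLs below are placeholders.

```html
<!-- On the standard (canonical) page: -->
<link rel="amphtml" href="https://example.com/amp/article.html">

<!-- On the AMP page: -->
<link rel="canonical" href="https://example.com/article.html">
```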
As with every new Google project, AMP is worth testing; there may be advantages that aren’t obvious yet. However, don’t count on it to deliver outsized results or to last forever.
The viewport meta tag tells mobile browsers how to scale your web pages to fit each user’s screen. The W3C recommends setting it explicitly on every page.
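The widely recommended declaration goes in your page’s head and matches the layout width to the device:

```html
<!-- Match the layout width to the device and start at 100% zoom -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```

Avoid adding `user-scalable=no`; it blocks pinch-to-zoom and hurts accessibility.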
Setting the viewport is a small step, but it’s also an easy one to forget. If you’re not sure how to set it, or if somebody else manages your site, check with your developer to make sure the viewport is configured properly.
Last, but not least, optimize your site for mobile speed. You can run a quick speed test with the Test My Site tool on Think With Google, which gives you specific feedback on how fast your site loads on mobile devices.
Other ways to reduce mobile page load time include compressing and lazy-loading images, minifying CSS and JavaScript, enabling browser caching, and serving static assets from a CDN.
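Images are usually the heaviest assets on a mobile page, and native lazy loading is a one-attribute fix supported by modern browsers. The file name below is a placeholder.

```html
<!-- loading="lazy" defers offscreen images; width/height reserve space and prevent layout shift -->
<img src="hero-photo.jpg" alt="Product photo" width="800" height="600" loading="lazy">
```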
While there are seemingly countless technical SEO tasks to perform, SEO boils down to three things: context, speed, and usability. When search engines understand the context of your site, you’re more likely to appear in the SERPs for relevant searches.
When your site loads quickly, you’re less likely to be deranked for sluggish page elements. Finally, when your site is usable, you’ll get more traffic, people will link to your site, and you’ll benefit from organically-acquired backlinks.
If you need to step up your SEO game, contact SEO.co and tell us about your goals. We’ll work with you to create an effective SEO strategy to help you get better rankings and retain your visitors long-term.