  • Search Engine Indexing vs. Search Engine Ranking: What’s the Difference?

    Site Indexed, But Not Ranking? How to Get Indexed Pages to Rank

    Are you trying to rank your website in the search engines, but you have yet to see results? Perhaps you’ve noticed that plenty of your web pages have been indexed by Google, Bing, and Yahoo!, yet you can’t figure out why they aren’t turning up in search results.

    Shouldn’t indexed pages come up in Google search results?

    Yes and no.

    While web pages must be indexed to come up in search results, indexing doesn’t determine a web page’s ranking. Getting your web pages indexed simply means a search engine knows your pages exist.

    If you want your indexed pages to appear in search results, each page needs to be selected by the search engine’s algorithms. That selection depends on ranking, and earning a high ranking requires a search engine optimization strategy.

    If you’re getting indexed, but not ranked, you’re not alone. Whether you’re just starting to rank a new site or you’re trying to improve an existing site, getting indexed is just the first step.

    How do search engine indexing and ranking differ?

    In a nutshell, search engine indexing is like creating a catalog of options. Search engine ranking is the process of prioritizing and categorizing everything in that catalog so the best options show up first in a search.

    A brief overview of how search engines work

    Search Engine Crawling, Indexing and Ranking

    Search engines function on three basic actions: crawling, indexing, and ranking. All three of these actions work together to deliver results to users who search for keywords and phrases.

    1. Crawling. Search engines use robots called “crawlers” or “spiders” to search the internet for new content in the form of web pages or files. Google’s crawler happens to be named Googlebot.

    Search engine crawlers can read code. In fact, you can place directives in your robots.txt file and on your web pages to give instructions to search engine spiders. For example, you can ask search engines not to crawl certain directories or not to index certain pieces of content (see the sample robots.txt after this list).

    2. Indexing. Once a search engine’s crawler discovers a web page, the page is filed away (indexed) so it can be delivered later in searches the engine considers relevant to the page’s content.

    3. Ranking. Complex, proprietary search engine algorithms look at each indexed web page and evaluate its value based on hundreds of ranking factors, both on-page and off-page. Pages considered more valuable rank higher and therefore appear higher in search results.
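
    Here is the sample robots.txt referenced above. The directives themselves (User-agent, Disallow, Sitemap) are standard, but the paths and the sitemap URL are hypothetical placeholders you would swap for your own; treat this as a minimal sketch, not a recommended configuration.

    # Sample robots.txt (hypothetical paths; adjust for your own site)
    User-agent: *
    # Ask all crawlers, including Googlebot, to skip these directories;
    # anything not disallowed here may still be crawled.
    Disallow: /admin/
    Disallow: /login/
    # Point crawlers at your sitemap (hypothetical URL)
    Sitemap: https://www.yoursite.com/sitemap.xml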

    In addition to crawling, indexing, and ranking web pages, search engines also administer penalties to individual pages and entire domains under a select set of circumstances.

    Search engines have terms of use, and if a page or domain violates those terms, it can be (and often is) removed from the search engine index or penalized in ways that suppress it from showing up in search results.

    Why is search engine indexing important?

    Simply put, without being indexed, a web page won’t show up in search results. Indexing is the foundation for search engine optimization.

    Why is search engine ranking important?

    Search engine ranking determines where and how often your web pages show up in search results. The higher your ranking, the better your chance of landing on the first page of results.

    Granted, search personalization ensures that each person will get a different set of results for the same search terms, so your pages may not rank for everyone. However, your pages will generally be delivered when a search engine’s algorithm believes that your content is highly relevant to a user’s search.

    Ranking is important for several reasons:

    • Ranking can get you traffic
    • Traffic can get you sales, leads, and brand visibility
    • Ranking can help you dominate the search engines for certain keywords and phrases, which increases traffic even more

    An overly simplified explanation is that when your site shows up in search results, you’ll get traffic. When your site ranks in the search engine results pages (SERPs) for certain keywords and phrases, you’ll drive targeted traffic to your website from users who click on your links.

    Once those users visit your website they might sign up for your email newsletter, buy a product or service, read your blog, or bookmark your site to return. Having pages that rank higher in the results pages will increase your flow of organic traffic.

    The more search terms you can rank for, the more traffic you can get.

    All in all, your online business relies on search engine rankings for success. You can certainly generate traffic from content marketing, but you won’t get anywhere near as much traffic as you can generate by ranking high in the SERPs. That said, content marketing, when used for link building, is actually part of ranking high in the SERPs. It’s all interconnected.

    Ranking and indexing aren’t guaranteed

    Just because your web pages exist doesn’t mean they’ll be indexed. Likewise, just because your web pages are indexed doesn’t mean they’ll rank. While you can submit your site to get indexed quickly, it takes a strong SEO strategy applied over time to get your pages to rank.

    Rankings come and go. They are always in a state of flux, especially for new websites. That’s one of the reasons it can take so long to rank in Google.

    Some aspects of ranking pages are out of your control

    There are countless factors that affect indexing and ranking, and only a handful are in your control. For example, redirects can prevent Google from indexing your pages. If Googlebot has to traverse multiple redirects, that’s considered a “redirect chain,” and you can’t change the way Googlebot handles redirect chains.

    Redirect chains are often created unintentionally. For example, you might redirect yoursite.com/1.html to yoursite.com/2.html.

    A year later, you might decide to redirect yoursite.com/2.html to yoursite.com/3.html.

    This creates a chain of two redirects where only one is needed; the best solution is to cut out the middle step and redirect yoursite.com/1.html directly to yoursite.com/3.html.
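
    One way to spot chains like this is to follow a URL’s redirects and count the hops. Below is a minimal Python sketch using the requests library; the URL is a hypothetical placeholder, and the hop-count threshold is just an illustration, not an official Google limit.

    import requests

    def redirect_chain(url):
        """Follow redirects for a URL and return every hop along the way."""
        response = requests.get(url, allow_redirects=True, timeout=10)
        # response.history holds one entry per intermediate redirect
        hops = [(r.status_code, r.url) for r in response.history]
        hops.append((response.status_code, response.url))  # final destination
        return hops

    chain = redirect_chain("https://yoursite.com/1.html")  # hypothetical URL
    for status, url in chain:
        print(status, url)
    if len(chain) > 2:
        print("Redirect chain detected: point the first URL straight at the final one.")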

    4 common issues with search engine indexing

    Although search engine indexing seems like a basic task, it’s not always smooth. There are several things that can go wrong, which can prevent your web pages from getting indexed.

    1. Robots.txt file commands can prevent crawling and indexing

    Understanding Robots.txt File Structure

    Robots.txt errors can prevent entire websites, or sections of them, from getting indexed in the search engines.

    When Googlebot finds a new website to crawl, it looks for a robots.txt file first. Generally, Googlebot will abide by the requests in the robots.txt file and crawl the site accordingly. However, when Googlebot can’t access a website’s robots.txt file and is unable to determine whether a robots.txt file exists, it will skip crawling that website entirely. If your site doesn’t get crawled it won’t get indexed.

    Robots.txt can also prevent indexing by explicit command. You can place directives in your robots.txt file to request that search engine crawlers skip your entire site or certain parts of it.

    The solution:

    Perform a site search in each main search engine to see if your website has been indexed. In the search box, type ‘site:yoursite.com’ (without the quotes, replacing ‘yoursite.com’ with your own domain). You should see a list of your web pages that have been indexed in that particular search engine. If you don’t see any results, you know there’s a problem.

    If you’re not coming up in search results, check your robots.txt file to see if it’s correct and recreate the entire file if necessary.
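
    You can also check your robots.txt rules programmatically. Python’s standard library ships a robots.txt parser; the sketch below assumes a hypothetical site URL and simply reports whether Googlebot is allowed to fetch a couple of sample pages.

    from urllib.robotparser import RobotFileParser

    robots = RobotFileParser()
    robots.set_url("https://www.yoursite.com/robots.txt")  # hypothetical URL
    robots.read()  # download and parse the live robots.txt file

    # Report whether Googlebot may crawl the homepage and one sample page
    for page in ["https://www.yoursite.com/", "https://www.yoursite.com/blog/sample-post"]:
        allowed = robots.can_fetch("Googlebot", page)
        print(page, "->", "crawlable" if allowed else "blocked by robots.txt")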

    If you can’t resolve your indexing issue, get a free SEO site audit to diagnose the problem and get your site indexed right away.

    2. Pages can get de-indexed

    Just because a web page gets indexed in a search engine doesn’t mean it’s going to stay indexed forever. Search engines are constantly reviewing web pages to make sure they still meet the terms of use as well as content quality requirements.

    There are many reasons web pages get de-indexed:

    • The page collected too many poor-quality backlinks. Your pages will get removed from the index if they’ve got excessive backlinks from link farms or spammy websites already blacklisted by the search engine. You can actually rank better without backlinks than you can with toxic backlinks.
    • The page doesn’t meet Google’s quality content requirements. Google has a very specific set of guidelines for quality, although the exact formulas remain proprietary.
    • The page doesn’t have much content. If you got a page to rank using black hat SEO tricks, don’t expect it to last long.
    • The page has too many redirects.
    • The page turns up a 404 error. Search engines don’t want to give users results that no longer exist. This is why you should check your site for 404 errors regularly so you can fix them before your content gets de-indexed (see the status-check sketch after this list).
    • The page uses cloaking.
    • Google is required to remove the page by law. Whether Google believes it’s required to remove a page or there is a court order to remove a page, Google doesn’t mess around. Sites promoting illegal content are frequently removed from the index. Other search engines should follow suit, but some don’t enforce their rules as quickly as Google.
    • Google doesn’t think your content changes when users click on various navigation links. Google follows all the links it can when crawling a site. If it reaches a page and doesn’t think your links go anywhere, it will de-index the URLs your links are pointing to. This makes sense, but sometimes Googlebot doesn’t read links correctly.

    For example, Google de-indexed 10.5 million pages from Hubspot.com for this very reason. When Googlebot crawled the www.hubspot.com/agencies URL, it found what it thought were millions of links that didn’t go anywhere. In reality, the links did go to various categories and pages dedicated to people offering their services.

    There really weren’t 10.5 million pages de-indexed; it was more like 262,500. Other errors caused the number to become inflated. Still, 262,500 is a hefty number of pages to get de-indexed. Thankfully, Hubspot was able to resolve the issue.
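
    Here is the status-check sketch referenced in the list above: a quick way to scan a set of your own URLs for 404s (or other error codes) before search engines drop them. The URL list is a hypothetical placeholder; in practice you might pull it from your sitemap.

    import requests

    urls = [
        "https://www.yoursite.com/",          # hypothetical URLs; use your own list
        "https://www.yoursite.com/old-page",
    ]

    for url in urls:
        try:
            # A HEAD request is enough to read the status code without the body;
            # fall back to GET if a server refuses HEAD requests.
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException as error:
            print(url, "-> request failed:", error)
            continue
        if status >= 400:
            print(url, "-> error", status, "(fix or redirect this page)")
        else:
            print(url, "-> OK", status)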

    The solution:

    Only publish high-quality, relevant content. It’s the only way to ensure you don’t get de-indexed for violating Google’s content quality guidelines.

    Unfortunately, you can’t prevent Googlebot from making mistakes while crawling and indexing your site. However, if you stick to producing the kind of content people want to devour, and build a legitimate backlink profile, you should be okay.

    3. Domain names can get blacklisted

    If you’re not ranking in the search engines, your domain name might be blacklisted. Domain names get blacklisted when they publish prohibited content or use shady SEO tactics to game the system.

    This problem can come up when you buy a domain from someone else. Any ‘used’ domain name has the potential to have been abused in the past by spammers or hackers.

    Examples of content that gets domains blacklisted:

    • Mirror websites. A mirror website is a site that hosts identical content to another website on a different domain name or URL. Back in the day, many people mirrored content to make it more accessible. However, it’s unnecessary and search engines view it as duplicate content.
    • Gateway pages. Also called ‘doorway pages,’ these pages don’t offer much content. They’re created to rank in the search engines and navigation is usually hidden from visitors.
    • Pages with invisible links, images, and copy. Invisible content is a fast way to get blacklisted. Excessive and often irrelevant keywords are the most common type of invisible content, though people sometimes publish invisible images and links as well.
    • Content with keyword stuffing. Keyword stuffing is putting too many keywords into your content. Years ago, search engines ranked sites with stuffed keywords. Today, it’s a defunct strategy and heavily penalized.
    • Cloaking. This is a deceptive practice that shows one version of a page to search engines and another version to visitors in order to game the system. This practice is not only ineffective, but it’s a quick way to get blacklisted.
    • Content hosted with an unreliable web host. Cheap web hosts often have excessive downtime and take web pages down for exceeding bandwidth allotments. With hosting at less than $10 per month from sites like HostGator and BlueHost, there’s no reason to use a free host.

    The solution:

    1. Verify that your web hosting IP address hasn’t been blacklisted. You might be using a hosting account that provides customers with a shared IP. If your IP has been blacklisted because of another customer, request a new IP or change hosts.
    2. Check the history of all ‘used’ domain names before you buy them; they might be blacklisted. You don’t want to pay $5,000 for a domain name only to find out it’s been blacklisted forever by Google. If you’re registering a new domain, you don’t need to worry.
    3. Try your best to come up with a new, original domain name to avoid taking on someone else’s baggage.

    4. WordPress or other CMS settings can block crawlers

    WordPress setting that blocks search engines from indexing the site

    When you’re developing a new website, it’s normal to want to block search engines from indexing your pages while they’re being built. You don’t want visitors to land on your site and see a work in progress. Most content management systems have a simple setting that lets you block search engine spiders. It’s meant to be temporary, but it’s often left on even after launch.

    In WordPress, you can discourage search engines from indexing your site with a simple checkbox. It’s wise to enable this during development, but leaving it checked once you go live will keep your site out of the index.

    If you’re using WordPress and your site isn’t getting indexed, it might be because you have this setting checked.

    The solution:

    Check your WordPress settings under Settings > Reading. Locate the “Search Engine Visibility” option that discourages search engines from indexing your site, and if it’s checked, uncheck the box.
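
    To confirm the change from the outside, you can fetch a page and look for the robots meta tag WordPress typically adds when that box is checked (it usually renders a noindex directive). This is a minimal Python sketch, assuming a hypothetical URL and a simple substring check rather than full HTML parsing.

    import requests

    response = requests.get("https://www.yoursite.com/", timeout=10)  # hypothetical URL
    html = response.text.lower()

    # WordPress's "discourage search engines" option typically outputs a robots
    # meta tag containing "noindex"; quote style varies, so check both forms.
    has_robots_meta = "name='robots'" in html or 'name="robots"' in html
    if has_robots_meta and "noindex" in html:
        print("A noindex robots meta tag appears to be present: search engines are being told not to index this page.")
    else:
        print("No obvious noindex directive found in the page HTML.")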

    To avoid this problem with future websites, create a list of tasks to perform right after your site goes live. You’ll probably have a handful of launch tasks already; add “uncheck the box that discourages search engines in WordPress” to the list.

    8 common issues with search engine rankings

    Sometimes search engine ranking doesn’t go the way you plan. Although SEO is a science, we don’t know everything about search engine algorithms. We know enough to get big results, but there are still many unknowns.

    Here’s what we know about the most common problems that prevent pages from ranking.

    1. Your targeted keywords are too competitive

    When you first start working on search engine optimization, you hear people talk about “competing for keywords.” Any website in a given niche is competing with other websites for keywords.

    On the surface, competing for keywords sounds like a battle you can win with the right strategy. It almost sounds like a challenge. For example, if you’re running a local mail center, with the right strategy you can outrank your competitor down the street.

    Outranking your competitors is easy when you’re on the same playing field. However, highly competitive keywords are harder to rank for because the competition consists of giant corporations with multi-million-dollar marketing budgets. If you don’t have that level of budget, you can’t compete.

    For example, say you build the most amazing website about cars. Say the design is perfect and your content is far superior to anything else written about cars. Unless you happen to be a one-person expert marketing team who never needs to pay for marketing services, you’ll probably never rank for keywords like ‘cars,’ ‘trucks,’ ‘vans,’ or any other generic phrase variations. Those keywords are essentially owned by large corporations that can outspend any SEO strategy you employ.

    The solution:

    Narrow down your market and target long-tail keywords. For example, if you’re trying to rank a news blog, you’ve got immense competition from professional networks and well-established news sites. The only way you’re going to rank is by creating a niche for your news blog and then targeting long-tail keywords related to that niche.

    For example, you might want to target conservatives, democrats, college students, people who have no party affiliation, or people who are specifically fed up with mainstream media. Maybe you can target college student democrats who are fed up with mainstream media. The point is to narrow down your audience from ‘everyone’ to a very specific ‘somebody’ and then target the appropriate keywords and phrases.

    This brings up the next point.

    2. You’re not taking advantage of long tail keywords

    Long tail keywords aren’t just for people who can’t compete with large marketing budgets. Everyone should be targeting long tail keywords. They give you the opportunity to reach people searching for specific things and, most importantly, people looking to buy your products.

    Say you run a business selling custom printed t-shirts. You’re never going to rank for the keyword ‘t-shirts,’ and that’s okay. Your money-making keywords are phrases like, “DIY custom screen-printed tees,” “custom screen-printed shirts sustainable ink,” and similar phrases based on what you actually sell.

    The solution:

    The key with long tail keywords is specificity. You want to target keywords that are highly specific, yet still generate enough searches per month to make it worth targeting.

    To discover your best long tail keywords, you’ll need access to keyword research and SEO tools, like those from SEMrush. Or you can hire a professional marketing agency to perform keyword research for you.

    3. Your content is low-quality

    It’s not always obvious when your content is low-quality. You might have content you believe is high-quality, but in reality, the search engines see it as poor.

    High-quality content is marked by:

    LSI keywords used appropriately in your copy. Latent Semantic Indexing (LSI) keywords are those keywords that are related to your main keywords. For example, if your main products are baking accessories, your LSI keywords will be things like eggs, flour, cake mixes, recipes, sugar, and the names of famous bakers and chefs.

    LSI keywords also include synonyms for your main keywords. For example, ‘doctor’ and ‘physician’ are LSI keywords, as are ‘lawyer’ and ‘attorney.’ Synonyms will come up in search results even when the user searches for the other word. For example, when a user searches for ‘doctors near me,’ they will be given results that contain the word ‘physician’ even if the word ‘doctor’ isn’t used on the page.

    Thorough, well-written copy. Search engines have ways to distinguish thorough, well-written copy from words jammed into a web page. Algorithms can detect whether a piece of content flows with proper grammar and isn’t just repeating nonsense.

    There is debate about whether you should publish long or short copy. Both have their place. However, long copy tends to outrank short copy likely because quality long content discusses a topic in-depth and search engines can identify that depth.

    Unique content. Your titles and content should be unique. While you can legally copy other people’s titles (titles cannot be copyrighted unless they contain a trademark), search engines aren’t going to rank your content if they think it’s duplicate content.

    No prohibited elements. Elements like excessive redirects, cloaking, doorway pages, spam, links to spam, keyword stuffing, and the like are all prohibited by the search engines.

    The solution:

    Stick to one main topic per content piece. Avoid discussing multiple unrelated topics in one piece of content just to try to rank. If you manage to trick the search engines into ranking your content for keywords unrelated to your actual website, you’ll only hurt yourself because your bounce rate will skyrocket and your conversion rate will plummet.

    You can (and should) discuss related subjects, but only if they’re relevant to the angle of the piece you’re creating.

    Create unique titles for your articles and web pages. Spend some time crafting unique and compelling titles for your articles and your web pages. Use these 51 headline formulas to get some good ideas rolling. However, avoid using clickbait titles that grab attention and then disappoint the reader.

    Create content for your visitors first. Don’t create content just in an attempt to rank. Always create valuable content for your visitors first and then optimize that content for search engines.

    4. Your content writer plagiarized someone else’s content

    Duplicate content is often plagiarized or “scraped” content. This is a double whammy for ranking: duplicate content won’t rank, and plagiarized content is both a copyright violation and against search engine terms of use.

    If your content writers are plagiarizing other people’s work, those pages from your site won’t rank.

    The solution:

    You can’t control what your content writers do, but you can run their work through Copyscape to make sure the content you’re getting isn’t plagiarized. Even if you work with trusted writers on a regular basis, run all copy through Copyscape, and don’t pay your freelance writers until you know their work is original.

    The best solution is to hire a professional content marketing agency to write all of your content so you don’t have to worry about getting plagiarized content.

    5. Your on-page SEO is lacking

    On-page factors search engines look for include quality content, but also factors like:

    • Linkable content. If your page is password-protected or otherwise not linkable, it won’t rank.
    • The presence of title tags. Title tags are the part of your HTML code that tells people and search engines what your page is about. Titles are displayed in search engine results and at the top of the browser window (a minimal example appears after the URL examples below).
    • Descriptive URL structures. Your URL structure should tell search engines what your content is about. For example, a good URL structure looks like this:

    https://www.yoursite.com/videos/humor/funniest-cats

    With the above URL structure, search engines will know what your content is about.

    A poor URL structure looks like this:

    https://www.yoursite.com/videos/ff1903877

    With the above URL structure, search engines will have no clue what “ff1903877” means and it will not help your site rank.
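
    For reference, here is a minimal example of a title tag inside a page’s HTML head, matching the hypothetical URL above:

    <head>
      <!-- The title appears in search results and in the browser tab -->
      <title>Funniest Cat Videos | YourSite</title>
    </head>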

    The solution:

    Make sure all of your content is linkable and accessible if you want it indexed in the search engines. It’s fine to exclude login pages; you can do that by disallowing them in your robots.txt file so search bots ignore them.

    Optimize all of your title tags to accurately reflect the content of each page. Make your titles as short and descriptive as possible.

    Finally, create a proper URL structure for your files, folders, and pages so that search engines understand the relevance.

    6. Your content isn’t mobile-friendly

    Content needs to be mobile-friendly to rank well. With Google’s “mobile-first indexing,” Google predominantly uses the mobile version of your pages for indexing and ranking, so mobile-optimized content is prioritized.

    Web pages that don’t have mobile-friendly structures and content just won’t rank as well.

    The solution:

    Optimize your website for mobile devices. Make sure your content looks good and is fully functional on all mobile devices and operating systems.

    7. You’re blocking search engine spiders

    If you’re blocking search engine spiders, like Googlebot, your site won’t get indexed, and that means it can’t possibly rank.

    The solution:

    If you’re struggling to rank certain pages, first determine whether or not your site is indexed by performing a site search. If your pages aren’t listed in the results, they’re not indexed.

    Find out why your pages aren’t indexed. First, check your robots.txt file to make sure you’re not blocking crawlers. If you’re not blocking crawlers, submit your site to the major search engines manually and ask Google to crawl your site using the URL Inspection tool in Google Search Console.

    8. Your backlink profile is poor quality or non-existent

    It’s not impossible to rank in some niches without backlinks, but if you’re in a competitive industry you need a strong backlink profile. Backlinks will give you power and authority, but only when you get the right links.

    The solution:

    Start generating high-quality backlinks from as many authority sites as possible. If you’re not sure how to get started, we can help. You can start by checking out our free backlink checker here.

    We can get you indexed and ranked in the search engines

    If you want to get indexed and ranked in the search engines, we can help. Get in touch with our team of SEO experts and we’ll help you gain traction in the major search engines.

    First, we’ll perform a free SEO audit to find out where you are and how your site can improve.

    Next, we’ll create and execute a customized, professional SEO strategy to give your site a boost in the search engines.

    We have worked with Fortune 500 enterprises, SMBs, law firms and attorneys, and everyone in between.

    Contact us today and let’s talk!

     

    Samuel Edwards
    Chief Marketing Officer at SEO Company
    In his 9+ years as a digital marketer, Sam has worked with countless small businesses and enterprise Fortune 500 companies and organizations, including NASDAQ OMX, eBay, Duncan Hines, Drew Barrymore, Washington, DC-based law firm Price Benowitz LLP, and human rights organization Amnesty International. As a technical SEO strategist, Sam leads all paid and organic operations teams for client SEO services, link building services, and white label SEO partnerships. He is a recurring speaker at the Search Marketing Expo conference series and a TEDx speaker. Today he works directly with high-end clients across all verticals to maximize on- and off-site SEO ROI through content marketing and link building. Connect with Sam on LinkedIn.