Digital Marketing 101: How Search Engines Work

Sep 09 2016
When you want information, you're probably like me and millions of others who turn to a favorite search engine to uncover answers. Have you ever asked yourself what happens when you type your query and press the enter key? How does the search engine display thousands of websites related to the words you typed? If you're a content strategist, a digital marketer, or just want to understand the mechanics of web optimization, the best place to start is understanding how search engines work.

Knowing this is critical to your web marketing initiatives and campaigns. You can use search queries (or "keywords") to your advantage to draw more qualified visitors to your site and design campaigns that work with search engines rather than against them.

Receiving information from the web starts with search. There are over 1 billion websites on the internet right now. In fact, at the moment I'm writing this sentence, there are approximately 1,077,746,972 websites in existence, and by the time you read this, there will be many more. So how do search engines keep track of all these sites, including their pages and content?

Searching & Crawling

Search engines perform two major functions: crawling and indexing. Together, these two actions create the foundation of your search experience. They're how search engines gather the information that eventually ends up on the Search Engine Results Page (SERP) you see after pressing the enter key in a search field. Note that only publicly available pages are crawled.

Search Engine Crawlers by Name:

  • Google:  Googlebot
  • Bing:  Bingbot
  • Yahoo:  Slurp
  • AOL:  interesting fact: as of January 2016, AOL's search is powered by Bing. Previously it used Google.
Crawlers find domains and sites, then "crawl" pages by going from link to link (page to page), cataloging or "indexing" all kinds of information. Each search engine's crawler indexes at different times and rates, and they all tend to pay particular attention to new sites and to changes on existing ones.
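The "cataloging" step above can be sketched as an inverted index: a lookup table that maps each word to the pages containing it. The pages and text below are invented for illustration; real indexes store vastly more detail per page.

```python
# Toy inverted index: maps each word to the set of pages containing it.
# Page URLs and text are made up for illustration only.
from collections import defaultdict

pages = {
    "example.com/home": "digital marketing tips for search engines",
    "example.com/blog": "how search engines crawl and index pages",
    "example.com/about": "about our digital marketing agency",
}

index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

def search(word):
    """Return all indexed pages whose text contains the word."""
    return sorted(index.get(word, set()))

print(search("search"))  # both pages mentioning "search"
```

When you press enter, the engine consults a (far more sophisticated) structure like this rather than scanning the live web, which is why results come back in milliseconds.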

How search engines crawl websites

Think of search engines as tourists in a foreign country with nothing but a road map to explore the area. The map is the only reliable source of information about attractions in the country. The Internet functions as the map, and websites are like the attractions. Search engines send crawlers through every website online. Crawlers are small programs that identify and report web page components.  They gather data about the websites and index the information under separate categories and keywords, creating a list of sites that meet the query's criteria. 
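The core task a crawler performs on each page it visits is extracting the links to follow next. Here is a minimal sketch of that step using Python's standard-library HTML parser; the HTML snippet is a made-up stand-in for a fetched page.

```python
# Sketch of the link-extraction step a crawler performs on each page.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag encountered in a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

# A made-up page standing in for HTML a crawler just fetched.
html = """
<html><body>
  <a href="/products">Products</a>
  <a href="/blog">Blog</a>
  <a href="https://other-site.example/partner">Partner</a>
</body></html>
"""

parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # the URLs a crawler would queue up to visit next
```

A real crawler repeats this fetch-parse-queue loop across billions of pages, which is how following links "page to page" adds up to a map of the web.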

How websites are indexed

Search engines use their own proprietary algorithms to index sites, which is why you get different results across different search platforms. While some engines prefer sites with many backlinks, others favor older domains, and some prioritize social linking and related activities. Search engines regularly change their algorithms to keep up with searchers' demands and user feedback, aiming to provide better results than their competitors.

PROTIP:  It's important to allow, and even help, search engines to crawl your website easily. It can mean the difference between higher and lower SERP rankings. Site structure is key, and luckily, search engines like Google and Bing give site owners a few options to help them crawl their site(s).

There are two standard methods to help search engines crawl your website: sitemaps and a file called "robots.txt." A sitemap is a list of a site's available pages that crawlers use to discover and index content. Sitemaps make websites and pages easier to crawl, so they need to be search engine-friendly. The second method is the robots.txt file. This document supports stronger search engine optimization and lets site owners give instructions to search engine crawlers. Owners can direct how crawlers process individual pages, or even tell Google and Bing which pages should not be crawled (and indexed).
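Here is a minimal example of the two files described above, using a hypothetical domain. The robots.txt file sits at the site's root and tells crawlers what to skip and where the sitemap lives:

```
# robots.txt — served from https://www.example.com/robots.txt
User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml
```

And a matching sitemap entry, following the standard sitemaps.org XML format:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/how-search-engines-work</loc>
    <lastmod>2016-09-09</lastmod>
  </url>
</urlset>
```

The `Disallow` line keeps crawlers out of pages you don't want indexed, while the `Sitemap` line hands them the full list of pages you do want found.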

Search Engine Algorithms 

"You want the answer, not trillions of webpages. Algorithms are computer programs that look for clues to give you back exactly what you want." (Google)
Search engines like Google, Bing and Yahoo use different algorithms to help determine which sites are the best matches for a search query. Some measure a site's popularity, the amount of time people spend on a site, or content quality and site authority. Each of these factors gives search engines a full-spectrum view of how valuable your website content is and how relevant it is to a user's search query.

Google has a few well-known algorithms with quirky names like Panda, Hummingbird, Pigeon, Pirate, Payday, Top Heavy and more. Each algorithm serves a particular purpose. Some, like Panda and Hummingbird, focus on content quality, while others, like Payday, concentrate on cleaning up search results for traditionally "spammy" searches (think "payday loan," "credit loan" and other spammed keywords). Top Heavy, introduced in 2012, targets websites with too many ads above the fold (ATF).

Several other factors go into these algorithms, and search engine technicians and engineers are always updating and tweaking them to improve the quality of the results. If you're interested in Google's algorithm change history, Moz does an excellent job tracking Google's algorithm updates by year, including recent ones. Most search engines follow the general search parameters of the most popular engine, Google, but it's important to be aware of each engine's algorithms.
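To make the idea of "combining ranking factors" concrete, here is a toy scoring sketch. Real engines blend hundreds of signals with learned weights; the pages, popularity scores, and 0.7/0.3 weights below are all invented for illustration.

```python
# Toy ranking: blend a keyword-relevance signal with a popularity signal.
# All page data and weights are invented; real engines use hundreds of signals.
pages = [
    {"url": "a.example/loans", "text": "payday loan offers fast loan", "popularity": 0.2},
    {"url": "b.example/guide", "text": "guide to payday loan rules and advice", "popularity": 0.9},
    {"url": "c.example/news",  "text": "local news and weather", "popularity": 0.8},
]

def score(page, query):
    words = query.lower().split()
    text = page["text"].lower().split()
    # Relevance: fraction of query words that appear in the page text.
    relevance = sum(w in text for w in words) / len(words)
    # Blend relevance with popularity (weights here are arbitrary).
    return 0.7 * relevance + 0.3 * page["popularity"]

def rank(pages, query):
    return sorted(pages, key=lambda p: score(p, query), reverse=True)

results = rank(pages, "payday loan")
print([p["url"] for p in results])
```

Note how the high-popularity but irrelevant news page still ranks last: relevance carries more weight, which mirrors why engines try to return pages that match the query, not just famous ones.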

Improving Search Rankings

Search engines value websites that have the features their algorithms prioritize. Sites with strong backlink quality, social interactions and content quality rank well on SERPs. "Quality" can be vague, so here are a few quality attributes content-based algorithms look for:
  • Original, high-quality, useful and compelling content (copy, visuals, media, documents/files, etc.)
  • Contains at least 500 words and intelligently used keywords or phrases
  • Written for human readers first, not for search engines (i.e., not over-optimized or keyword-stuffed)
  • Trendy, newsworthy and highly informative 
  • Human interest stories that inform or teach
  • Press releases that encompass the above 
PROTIP 1:  Over-optimized pages, those designed specifically for rankings and not for readers, actually have problems ranking well, so write the majority of your content for people, not for Google, Bing or Yahoo. Additionally, search engines want to return the most current, up-to-date information to users, so their algorithms look for updated pages with fresh content.

PROTIP 2:  Google's Panda and Penguin updates changed search engine optimization (SEO) by adding social signals to the ranking mix. The more social activity, the better. Supporting information with social links (Facebook, Twitter, LinkedIn, etc.) can help a page rank higher.

Wrap Up

The key to high rankings is to give the search engines exactly what they want. Include compelling content that captures readers' attention so they'll stay on your site. Make sure that the content you present is peppered with social linking so that people can share it. Focus on fresh, newsworthy and trendy information, and you'll be well on your way to higher rankings.