Humans and search engines see the world very differently. This difference in viewpoints frequently limits marketers’ ability to understand why their site isn’t performing in organic search. Sometimes altering your perception of a site so you’re looking at it through search engines’ eyes can be enough to illuminate the issues. We call this “being the bot.”
What do search engines want? Relevant search results are their primary offering. Without relevant results, the search engines lose their human audience and their ability to monetize search.
Search engines determine relevance by crawling the Internet, indexing textual content and links, and using sophisticated algorithms involving hundreds of elements to determine the topic, context, popularity and authority of each individual page.
They do this with the help of textual content and links. If the words and links are displayed on a page in plain text, and if that page has other pages linking to it, then search engines can “see” that page. Think of the Internet in the 20th century. Those long monotone pages of plain HTML text and links were perfect for search engines, but fantastically boring to humans.
Unfortunately, search engines can’t see some of the things we use now to make the Internet exciting. Let’s start with images. Words in an image are not indexable; they do not exist in SEO terms. This is a critical point: Your human brain insists that the words you see in an image exist. However, “being the bot” requires you to retrain your brain to unsee those images and whatever lies inside them.
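You can demonstrate this to yourself with a few lines of code. Here’s a minimal sketch, using Python’s built-in HTML parser, of the text a crawler can actually pull from a page. The page fragment and filenames are made up for illustration: the same headline appears once as plain text and once baked into a banner image.

```python
from html.parser import HTMLParser

# Hypothetical page fragment: one headline in plain text,
# a second headline painted into a banner image file.
PAGE = """
<h1>Spring Sale: 20% Off Hiking Boots</h1>
<img src="/banners/free-shipping-banner.png" alt="">
"""

class TextExtractor(HTMLParser):
    """Collects only the plain text a crawler can index."""
    def __init__(self):
        super().__init__()
        self.text = []

    def handle_data(self, data):
        if data.strip():
            self.text.append(data.strip())

parser = TextExtractor()
parser.feed(PAGE)
print(parser.text)  # → ['Spring Sale: 20% Off Hiking Boots']
```

Whatever words are painted into the banner image simply never show up. That’s the bot’s-eye view: if the text isn’t in the HTML, it doesn’t exist.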
The Web Developer Toolbar plugin makes being the bot a lot easier by disabling images in the browser. While you’re in there, disable JavaScript, CSS and cookies as well. Search engines don’t traditionally use these technologies either, though some more advanced headless browser bots can.
Try to navigate a modern site with images, JS, CSS and cookies disabled. Can you understand what the page is about? Can you move around all areas of the site? If you can’t, it’s likely the search engines’ crawlers can’t either. If a search engine can’t crawl a site, it also can’t index, rank or drive traffic to the pages on that site. Next time you’re analyzing the SEO performance for a site or building a new site, keep the search engines in mind: be the bot.
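The same exercise can be approximated in code. The sketch below, again using Python’s standard-library parser, reduces a page to the two things a traditional crawler cares about: indexable text and followable links. The sample markup is invented for the example; note how the script and the image-based navigation contribute nothing to either list.

```python
from html.parser import HTMLParser

# Hypothetical page: image-based navigation, JS-rendered products,
# one real paragraph and one real link.
SAMPLE = """
<html><body>
<img src="/img/nav-menu.png" usemap="#nav">
<script>renderProducts();</script>
<p>Handmade leather wallets, shipped worldwide.</p>
<a href="/products/wallets">Browse wallets</a>
</body></html>
"""

class BotView(HTMLParser):
    """What a text-and-links crawler sees: no images, scripts or styles."""
    def __init__(self):
        super().__init__()
        self.text, self.links = [], []
        self._skip = False  # inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

bot = BotView()
bot.feed(SAMPLE)
print(bot.text)   # → ['Handmade leather wallets, shipped worldwide.', 'Browse wallets']
print(bot.links)  # → ['/products/wallets']
```

If a page’s real content and navigation only appear via images or JavaScript, both lists come back nearly empty, and so does the crawler.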
Jill Kocher is a seasoned SEO professional and all-around technogeek. By day, she manages Resource Interactive’s SEO practice here in Chicago and serves as contributing editor at Practical eCommerce. By night, Jill landscapes her home in the far northern suburbs of Chicagoland while enjoying a glass of wine and thinking about SEO some more. Family discussion centers primarily around SEO, analytics, social media, mobile apps, android, iOS, how-was-your-day and cats.