How to Optimize Your Site for Search Engines -- Google

As search engines have grown smarter and more sophisticated, it has become even more important to have your site optimized for search. An optimized site is easier to crawl and faster to index.

Quality content always wins, but being last to market means less traffic for your business. Faster indexing is what secures a higher market share before competitors come into play.

Having your site and content well optimized for search engines increases the visibility of your content. Your content gets indexed faster and becomes available to users as soon as it is published. In turn, you get higher click-through rates and better customer conversion.

In the past, webmasters and content publishers worried only about keywords. Unfortunately, search based solely on keywords quickly gave rise to keyword stuffing. For example, content about home appliances could be stuffed with keywords related to baseball searches.

That meant people searching for their favourite baseball team or the latest scores would instead land on content that was completely unrelated.

Search engines learnt from that and quickly moved away from purely keyword-based algorithms. Nevertheless, keywords still matter, but only insofar as they appear naturally within the text on the page.

Create A Sitemap

Having an XML or RSS sitemap for your site will ensure faster indexing. If you have a particularly large site, a sitemap tells search engines exactly where to find your content.
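For illustration, here is a minimal XML sitemap following the sitemaps.org protocol. The URLs and dates are placeholders; only the loc tag is required, and the others are optional hints for crawlers.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Full URL of the page; the only required child tag -->
    <loc>https://www.example.com/</loc>
    <!-- Optional hints for crawlers -->
    <lastmod>2016-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>https://www.example.com/blog/first-post</loc>
    <lastmod>2016-01-10</lastmod>
  </url>
</urlset>
```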

While pagination on your site improves crawling and indexing, it might not be enough for a very large site. Search engines operate on a crawl budget, and there is little time to go hunting for your valuable content if it is buried deep within your site.

Pulling the content up into a sitemap is equivalent to moving it from page 1,000 to page 10. This makes it easier for search engine bots to quickly land on page 10 and have the page appropriately ranked.

Although search engines can handle sitemaps with up to 50,000 links, it is advisable to keep individual sitemaps smaller for quicker crawling and indexing.
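One common way to keep individual sitemaps small is a sitemap index file that points to several smaller sitemaps, one per content type or section. A sketch, again with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each entry points to a separate, smaller sitemap -->
  <sitemap>
    <loc>https://www.example.com/sitemap-posts.xml</loc>
    <lastmod>2016-01-15</lastmod>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-pages.xml</loc>
  </sitemap>
</sitemapindex>
```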

A large sitemap means bots may spend many minutes or even hours crawling all the links within it. If the bots leave before crawling every link, they typically start again from the top when they return, meaning some pages never get crawled or indexed. This is particularly true for a site that loads slowly or has several broken links or redirects within the sitemap.

Keep your sitemaps small, and make sure they contain no broken links.
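A quick way to catch broken links before submitting a sitemap is to request every URL it lists and flag any non-200 responses. A minimal sketch using Python's standard XML parser and the third-party requests library; the sitemap filename is a placeholder:

```python
import xml.etree.ElementTree as ET
import requests

# Namespace used by the sitemaps.org protocol
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def check_sitemap(path):
    """Request each <loc> URL in the sitemap and report broken ones."""
    tree = ET.parse(path)
    for loc in tree.getroot().iter(NS + "loc"):
        url = loc.text.strip()
        try:
            # HEAD keeps the check lightweight; some servers require GET instead
            resp = requests.head(url, allow_redirects=True, timeout=10)
            if resp.status_code != 200:
                print(f"{resp.status_code}  {url}")
        except requests.RequestException as exc:
            print(f"ERROR  {url}  ({exc})")

check_sitemap("sitemap.xml")  # placeholder filename
```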

Having a compressed sitemap also helps. Google, Bing and other search engines easily crawl gzip-compressed sitemaps. Compression reduces the load time and bandwidth used by your sitemap, which makes indexing faster and is especially valuable on shared hosting plans with limited bandwidth.
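Compression is straightforward: a gzip-compressed copy can be served as, say, sitemap.xml.gz and submitted in place of the plain file. A small Python sketch, with placeholder filenames:

```python
import gzip
import shutil

# Write a gzip-compressed copy of the sitemap; the .xml.gz file
# can then be referenced in robots.txt or submitted directly.
with open("sitemap.xml", "rb") as src, gzip.open("sitemap.xml.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)
```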

Ping new content

Again, this is a first-to-market rule. Pinging search engines when new content is published serves as a direct and immediate call to come and find it. For a relatively small site or blog of only 100-500 pages, pinging might be overkill. Large sites, which often have content buried deep within, definitely need pinging to get new content indexed immediately.
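At the time of writing, Google and Bing exposed simple HTTP ping endpoints that take a sitemap URL. A minimal sketch with the requests library; the sitemap URL is a placeholder, and you should verify the endpoints are still supported before relying on them:

```python
import requests
from urllib.parse import quote

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder

# Historical ping endpoints; availability may have changed since this was written
endpoints = [
    "https://www.google.com/ping?sitemap=" + quote(SITEMAP_URL, safe=""),
    "https://www.bing.com/ping?sitemap=" + quote(SITEMAP_URL, safe=""),
]

for url in endpoints:
    resp = requests.get(url, timeout=10)
    print(resp.status_code, url)
```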

Meta Tags

Meta tags like the page title, meta description and meta keywords used to be paramount in search. Search engines have grown a lot smarter and now rely on many ranking factors. Most no longer use meta keywords in rankings because so many sites abused them: quite often you would find meta keywords on a page whose content was completely unrelated to them. In the early days of search this was done mainly to fool search engines and drive traffic to sites with often dubious intentions.

Meta titles and meta descriptions are still relevant, although they now play a much less prominent role. Page titles and descriptions are often displayed in search results, and they tell search engines what the page is about. By comparing the meta title and description with the content, search engines can tell how closely related they are and rank the page accordingly. It is imperative to use distinct meta titles and descriptions on all your pages to have your site fully optimized for search engines; repeated meta tags can get flagged as duplicate content by Google.
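In the page's head section, that looks like the snippet below. The title and description are invented examples (reusing the article's home-appliances scenario), and each page should get its own:

```html
<head>
  <!-- Shown as the clickable headline in search results; keep it unique per page -->
  <title>Energy-Efficient Home Appliances: A Buyer's Guide</title>
  <!-- Often shown as the snippet under the headline; roughly 150-160 characters -->
  <meta name="description" content="How to compare energy ratings, running costs and warranties when buying home appliances.">
  <!-- Largely ignored by major search engines today -->
  <meta name="keywords" content="home appliances, energy efficiency">
</head>
```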

Have your site optimized for mobile

Google now uses mobile signals such as font size, the spacing of touch elements (links and buttons), whether a viewport is configured and how the content fits into it. If your fonts are so small that they make reading on mobile devices hard, Google flags that. Give touch elements enough spacing so that a user does not tap a nearby link or button by mistake. As for the viewport, make sure the width of your content fits the width of the viewable screen so that users do not have to stretch the screen or scroll sideways to view it; content that scrolls only up or down is easier to read.
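The viewport part boils down to one meta tag plus CSS that avoids fixed-width layouts. The size values below are illustrative, not Google's exact thresholds:

```html
<!-- Tell mobile browsers to match the device width instead of assuming a desktop layout -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Keep body text readable on small screens */
  body { font-size: 16px; }

  /* Let images shrink to the viewport instead of forcing sideways scrolling */
  img { max-width: 100%; height: auto; }

  /* Give links and buttons room so neighbouring targets aren't tapped by mistake */
  a, button { padding: 12px; margin: 4px 0; display: inline-block; }
</style>
```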

If you are registered with Google Webmaster Tools, you can see exactly what Google considers poor user interactions or problems and fix them immediately. As of April 2015, Google uses mobile friendliness as a ranking signal, so make sure you have your site optimized for mobile users.