Solving the Google Cache Issue

After the Google Penguin 6 (aka Penguin 3.0) update, some webmasters asked me why Google does not cache their updated pages. Before talking about the Google Cache issue, let’s have an overview of the differences between the Google index and Google Cache.

An “indexed” page is one that has been crawled by a search engine spider and filed away in its index for later use; a “cached” page is a stored snapshot of that page, and it is cached pages that may show up in search results.

Getting a page listed in the SERPs is a multi-step process. First, the crawlers need to access your website and scan the page; whether they crawl, how they crawl, and how deep they crawl all matter (I explain this later in the article). The next step is indexing the information they find on your page. If they decide to index the page, it can show up in search results based on the keywords and phrases on your website. You should add your keywords and key phrases to the page at the right frequency so the spiders are able to crawl it properly. This comes down to applying the right on-page SEO optimization.

Indexing your page does not mean it will show up in search results. When the search engines decide that your site’s information might actually be of use to users, they take a “snapshot” of the page and store it in their index of cached web pages. This is a separate index, or more accurately a subset of the original index. These cached pages qualify for inclusion in the SERPs, and now your potential customers may be able to find you.

Google Cache is a significant tool for exploring exactly how Google’s bots see a page. Google’s crawlers do not see your website the way a human visitor does; they receive signals as to where to find your site’s content in the form of text, video, sound and images.

Many websites rely on applications and scripts that Google’s spiders cannot fully process, and this can cause a conflict between the scripts within your website and Google’s bots.

There are certain types of code that Google spiders are incapable of indexing; the crawler essentially sees a blank space. If you’re counting on such objects on your website to convey keywords and relevant content, you may run into trouble, because the crawlers cannot see them. If you embed Flash, iFrame, or Ajax-generated content on the site, Google’s crawlers have trouble indexing it.

Google’s crawler can typically read Flash files and extract the text and links in them, but it cannot index their structure. Even if your Flash content is in Google’s search index, it might be missing some text, content, or links.
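To illustrate the iFrame case above: content that is only delivered inside a frame may be invisible to the crawler, so it helps to provide plain-HTML fallback text. This markup and the wallet copy are hypothetical examples of mine, not taken from any particular site:

```html
<!-- Content loaded only through the iframe: the crawler may see nothing here -->
<iframe src="/widget.html">
  <!-- Plain-HTML fallback that a crawler (or a browser without iframe
       support) can read and index -->
  <p>Handmade leather wallets with free worldwide shipping.</p>
</iframe>
```

The fallback between the iframe tags costs nothing for normal visitors, but it gives the spider indexable text where it would otherwise see a blank space.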

Google crawls frequently updated websites more often. If you do not update your site, it can take up to one month for Google to crawl it again.
Websites or pages that are not accessible remain in the index for up to a month. This is done to prevent a temporary outage from impacting a website’s position in the search index.

I have received questions from webmasters concerned that Google is not caching their websites as frequently as it used to. Some of them even monitored their competitors’ sites to find out whether those had been cached lately.
If your page has not been banned or suspended, either by a Google manual action or by an algorithmic glitch, then do the following:
For users on the WordPress platform:

There are many ways to get the page cached quickly. I recommend installing the “WP Super Cache” plugin.

Sometimes this plugin can break due to a conflict with scripts within your WordPress template. Try different cache plugins until you find the right one for your site, testing each plugin on the page for two days.

I strongly suggest website owners delete deactivated plugins in their WordPress installation. Outdated or deactivated plugins create security issues and leave your website vulnerable.

HTML websites

If you run an ordinary HTML website, you can submit the updated page for indexing here (make sure the page is properly optimized first):

https://www.google.com/webmasters/tools/submit-url

This tool is more powerful than 30 bookmarking sites or drip feeds: you are knocking directly at Google’s door to get the page crawled and indexed. So no matter what type of site you have, use this method. Then wait between one week and ten days; if you still do not see any change, your website may have one of the following issues:

– Your website or page is penalized
– A conflict between scripts on the site and Google’s (or another search engine’s) crawlers
– A bug in the HTML code that prevents crawlers from crawling
– An .htaccess issue
– A robots.txt issue

You can check the cached version of a site here:

http://www.thepcmanwebsite.com/google_page_cache_checker.shtml

http://webcache.googleusercontent.com/search?q=cache:yourwebsitehere
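The second URL above follows a simple pattern: append your page’s address after “cache:”. As a minimal sketch, here is how you might build that lookup URL programmatically; the helper name is my own, only the base URL comes from this article:

```python
from urllib.parse import quote

def google_cache_url(page_url: str) -> str:
    """Build the Google web-cache lookup URL for a given page.

    Query characters are percent-encoded, but ':' and '/' are kept
    so the cached address stays readable.
    """
    base = "http://webcache.googleusercontent.com/search?q=cache:"
    return base + quote(page_url, safe=":/")

# Example: print the cache-lookup URL for a page
print(google_cache_url("http://example.com/blog/post"))
# → http://webcache.googleusercontent.com/search?q=cache:http://example.com/blog/post
```

Opening the resulting URL in a browser shows the snapshot Google has stored, along with the date it was taken, which tells you how recently the page was cached.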


Spread the word! If you have any questions, let me know here.