Google Starts Indexing Facebook and Twitter in Realtime
Google will soon be indexing Facebook and Twitter comments and serving them up as part of the company’s standard search results. Google’s realtime search robots won’t surface private comments on private pages within Facebook, but from now on, any time you use a Facebook comment form on another site, or comment on a public page within Facebook, those comments will be indexed by Google.
Developers are upset because Google is no longer the passive crawler it once was, and users will likely become upset once they realize that comments about drunken parties, embarrassing moments, or what they thought were private details will start showing up next to their names in Google’s search results.
This is what Google says about it:
“Given the incredibly fast pace at which information moves in today’s world, the most recent information can be from the last week, day or even minute, and depending on the search terms, the algorithm needs to be able to figure out if a result from a week ago about a TV show is recent, or if a result from a week ago about breaking news is too old.
We completed our Caffeine web indexing system last year, which allows us to crawl and index the web for fresh content quickly on an enormous scale. Building upon the momentum from Caffeine, today we’re making a significant improvement to our ranking algorithm that impacts roughly 35 percent of searches and better determines when to give you more up-to-date relevant results for these varying degrees of freshness.”
Google has confirmed the move. The head of Google’s Webspam team, Matt Cutts, confirmed the news in a tweet:
“Googlebot keeps getting smarter. Now has the ability to execute AJAX/JS to index some dynamic comments.”
The news has spread like wildfire across the internet and among internet marketers, and Google has been quoted as saying, “What you’re seeing is a result of Google increasingly being able to crawl JavaScript and AJAX content. We have steadily been increasing our ability to index richer content such as JavaScript/AJAX. If users can see something in their browser, our goal is to be able to index that content irrespective of its language or format.”
Now, commenting through third-party engines will lead to an SEO boost; previously, comments left in engines like Disqus, Facebook or Intense Debate could not be crawled by search bots. For blog and website admins, these commenting platforms were quite convenient for enabling comments and tackling spam, but they were useless for SEO purposes. Now that Google’s bots can read AJAX and JavaScript, however, the comments and the commenters’ names are finding a place in search results.
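To see why this matters, it helps to look at how these commenting engines work under the hood: the page itself ships with an empty placeholder, and the comments are fetched and rendered by JavaScript after the page loads. Here is a minimal sketch of that pattern; the endpoint URL, thread ID, and field names are hypothetical and for illustration only, not the actual API of Disqus or any other engine:

```typescript
// Hypothetical comment shape returned by an imagined commenting service.
interface Comment {
  author: string;
  body: string;
}

async function loadComments(threadId: string): Promise<void> {
  // Comments are fetched over AJAX after page load. A crawler that only
  // reads the raw HTML sees an empty container; a crawler that executes
  // JavaScript (as Googlebot now can) sees the fully rendered comments.
  // The endpoint below is a made-up example, not a real service.
  const response = await fetch(`https://comments.example.com/threads/${threadId}`);
  const comments: Comment[] = await response.json();

  const container = document.getElementById("comments");
  if (!container) return;

  // Render each comment into the placeholder element.
  for (const comment of comments) {
    const item = document.createElement("div");
    item.textContent = `${comment.author}: ${comment.body}`;
    container.appendChild(item);
  }
}

loadComments("article-123").catch(console.error);
```

In other words, a bot that ignores scripts indexes nothing but an empty div, while a bot that runs them indexes every comment and every commenter’s name, which is exactly the change Cutts describes.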
We feel this will generate a new influx of “spammy postings.” Will the result simply be a flood of worthless spam comments?
Jush Justified is an expert Search Engine Marketing instructor and assists Local Search Marketing companies.