Friday, March 25, 2011

Google corrects bug that dropped Berkeley news site from Google News

Google News is once again indexing the local news site Berkeleyside.com after dropping it on Saturday.

Google changed its algorithm on Feb. 24 to lower the search rankings of content farms and other low-quality websites, but Berkeleyside should not have been hurt by the change, since it is run by three professional journalists: Lance Knobel, Frances Dinkelspiel and Tracey Taylor.

Berkeleyside tried to contact Google to find out what had happened and to correct the situation, but was unable to reach anyone.

“There is no real way to communicate with Google so we are trying to spread the word. If this happens to us, it can happen to other hyper local sites,” Dinkelspiel wrote on Wednesday morning in an e-mail to other journalists with the subject line, “How Google can squeeze a hyper local news site.”

Soon several journalists who cover social media and online journalism began tweeting the story, including Felix Salmon of Reuters, David Carr of The New York Times, Scott Rosenberg of Wordbugs, Dan Gillmor and Dave Winer.

The fusillade of tweets worked.

On Wednesday afternoon, a Google engineer who is also a Berkeleyside reader contacted Knobel and told him that a bug, not a deliberate snub of a local news site, had caused the problem.

Knobel lists the lessons he learned from this experience on his davosnewbies.com blog. He points out it was “helpful to be producing a vital news source in the Bay Area, where we’re almost bound to have readers who work at Google (and just about any other tech giant you can think of). If we were somewhere else in the world, perhaps we could have gathered the same forces in support, but I think the direct connection to at least one Googler who looks to us for news helped.”

2 comments:

Robert B. Livingston said...

Google has become too big for its own good or ours.

Trying to adjust my Google News Page recently, I was amused by this discovery:

http://soulpowered.tumblr.com/tagged/Google_News

Anonymous said...

This is garbage.

Journalists should not have the right to circumvent Google policy. If you have a problematic sitemap, are hosting duplicate content without canonical URLs, or even experience server downtime when Google crawls your site, that's your own fault, and it could result in Google dropping you entirely.

Google does provide ways you can resubmit your site to be indexed. They could have resolved the issue like normal webmasters rather than resorting to the Henny Penny arms-a-flapping style of sensational journalism.
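For context on the resubmission route the commenter mentions: at the time, webmasters could prompt Google to re-fetch a site by submitting its sitemap through Google Webmaster Tools or by pinging Google's sitemap endpoint. A minimal sketch of the ping approach in Python, assuming a hypothetical sitemap at https://example.com/sitemap.xml (not Berkeleyside's actual sitemap), might look like this:

# Minimal sketch: ask Google to re-fetch a site's sitemap via the "ping"
# route documented in Google's webmaster help at the time.
# The sitemap URL below is a placeholder, not a real site's sitemap.
import urllib.parse
import urllib.request

SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical sitemap location

# URL-encode the sitemap address and append it to the ping endpoint.
ping_url = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(SITEMAP_URL, safe="")

with urllib.request.urlopen(ping_url) as response:
    # An HTTP 200 means the ping was accepted; it does not guarantee
    # when (or whether) the pages will reappear in the index.
    print(response.getcode(), response.reason)

A successful ping only queues the sitemap for crawling; it does not guarantee when, or whether, pages return to the index, which is part of why Berkeleyside's direct contact with a Google engineer mattered.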