Wednesday, February 01, 2006

Paris is burning Google in papier-mache effigy

A Paris-based group representing 18,000 newspapers called the World Association of Newspapers has told Reuters that it is considering its legal options against Google for what it calls "basic theft" of content on Google News. Reaction in the blogosphere runs the usual gamut from incredulous to dismissive, summed up by Dan Gillmor, who witnessed the WAN's attitude first-hand last year when he co-keynoted with GN creator Krishna Bharat at the WAN annual meeting.

The newspaper people are mistaken. Google does create disintermediation, but it also sends traffic. More fundamentally, it uses the Web as designed.

That may be true, and you'd expect me to be on Google's side on this one since Tinfinger is also in the news aggregation business to some extent. However, the issue is more complex than that.

Take, for example, the way GN used to handle wire feed copy. The same story from Agence France Presse or Reuters or Associated Press or wherever used to appear on Google News many, many times, corresponding to the thousands of media outlets which licensed feeds to shovel into their Web hole. I know this because I used to scroll through pages and pages of results for the same story at different sites, and it really used to piss me off. Finally the Google guys realised that was a bad customer outcome, and now you only see that story once. Ah, but which lucky wire feed purchaser gets to appear in Google's index and collect all the lovely clickthroughs for that one big story? Is it randomised? Is it whoever put the story up first? Oh, but doesn't that just mean whichever site the Googlebots hit most often? Maybe it's a nice little earner for the big G, where sites can pay a little extra something-something for the premium service, hmmm, know what I mean?
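To make the dedup behaviour concrete: collapsing identical wire copy reprinted across thousands of outlets can be sketched in a few lines. This is a toy illustration only, not Google's actual algorithm; the normalisation rule and the "first crawled wins" policy are my assumptions, and the first-crawled rule is exactly what raises the fairness question above.

```python
import re

def story_key(text):
    """Normalise wire copy so reprints at different outlets compare equal.

    Lowercases, strips punctuation, and keeps only the opening words,
    since outlets mostly run wire copy verbatim (assumption)."""
    words = re.findall(r"[a-z']+", text.lower())
    return " ".join(words[:50])

def dedupe(stories):
    """stories: list of (url, text) pairs in crawl order.

    Keeps the first-crawled copy of each distinct story, which is
    precisely the 'whichever site the bots hit first' outcome."""
    seen = {}
    for url, text in stories:
        key = story_key(text)
        if key not in seen:
            seen[key] = url
    return list(seen.values())
```

Run against two outlets carrying the same AFP copy, only the first-crawled URL survives, however many licensees paid for the feed.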

The news aggregation business will throw up a number of these issues which, while they may not seem like a big deal, warrant serious consideration. While the quotes about Google being a thief are obviously stupid and meant only to stoke up the story, there is a definite issue to be resolved about how to deal legally with the new class of news arbitrage services which the aggregators are defining. One quote from the FT article rings especially true: "Ultimately, the aggregators need the content providers." And not vice versa.


Blogger Stephen said...

Is it not as simple as this: if you do not want Google or any other search engine to index your pages, you add a 'robots.txt' file and that is that, it is done.

If they choose to be excluded, they have that option. If they want to be included in a search index, they will on occasion, like any other internet site, be flagged by the search engine as a 'newsworthy' page that has listed itself as open.
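For the record, a minimal robots.txt along the lines Stephen describes would look like this. It lives at the site root (e.g. http://example.com/robots.txt); 'Googlebot' is Google's documented crawler name, and 'User-agent: *' covers every well-behaved crawler:

```
# Block Google's crawler from the whole site
User-agent: Googlebot
Disallow: /

# Or block all well-behaved crawlers
User-agent: *
Disallow: /
```

Compliance is voluntary under the robots exclusion convention, but the major search engines, Google included, do honour it.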

2:00 am, February 02, 2006  
