A.P. Cracks Down on Unpaid Use of Articles on Web

I’ll be curious to see how this shakes out.

Taking a new hard line that news articles should not turn up on search engines and Web sites without permission, The Associated Press said Thursday that it would add software to each article that shows what limits apply to the rights to use it, and that notifies The A.P. about how the article is used.

Tom Curley, The A.P.’s president and chief executive, said the company’s position was that even minimal use of a news article online required a licensing agreement with the news organization that produced it. In an interview, he specifically cited references that include a headline and a link to an article, a standard practice of search engines like Google, Bing and Yahoo, news aggregators and blogs.

[Click to continue reading A.P. Cracks Down on Unpaid Use of Articles on Web – NYTimes.com]

Alternative Google

Websites like Google are going to be in for a bit of a dustup.

Search engines and news aggregators contend that their brief article citations fall under the legal principle of fair use. Executives at some news organizations have said they are reluctant to test the Internet boundaries of fair use, for fear that the courts would rule against them.

News organizations already have the ability to prevent their work from turning up in search engines — but doing so would shrink their Web audience, and with it, their advertising revenues. What The A.P. seeks is not that articles should appear less often in search results, but that such use would become a new source of revenue.

Right, there is a simple file that webmasters can add to their site to tell Google’s automated indexing software to “go away”:

The robot exclusion standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention to prevent cooperating web spiders and other web robots from accessing all or part of a website which is otherwise publicly viewable. Robots are often used by search engines to categorize and archive web sites, or by webmasters to proofread source code. The standard is unrelated to, but can be used in conjunction with, sitemaps, a robot inclusion standard for websites.

[Click to continue reading Robots exclusion standard – Wikipedia, the free encyclopedia]
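
For the curious, blocking Google is really a two-line affair. A robots.txt file at the root of a site containing something like the following (Googlebot being the name Google’s crawler answers to) tells it to skip the whole site:

  User-agent: Googlebot
  Disallow: /

Change the user-agent line to * and every well-behaved crawler stays away. Which is exactly why nobody at the A.P. will actually do it.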

Not a Good Sign

If A.P. did that, they would lose search-engine-generated traffic, but that isn’t really what A.P. wants. A.P. wants traffic, and to be paid for that traffic. I doubt it will happen as seamlessly as they want, but we’ll soon see. Newspaper executives also don’t like blogs much:

Executives at newspapers and other traditional news organizations have long complained about how some sites make money from their work, putting ads on pages with excerpts from articles and links to the sources of the articles.

but I don’t know if that particular genie could ever be crammed back into its bottle; the bottom of the bottle is missing, and digital content flows wherever it can, instantly.

and this is puzzling:

Each article — and, in the future, each picture and video — would go out with what The A.P. called a digital “wrapper,” data invisible to the ordinary consumer that is intended, among other things, to maximize its ranking in Internet searches. The software would also send signals back to The A.P., letting it track use of the article across the Web.

If someone cuts and pastes an A.P. article from some other site, how is this magic technological bullet going to stay attached? Either there is more to the process than the A.P. admits, or else they are really deluded.[1] [2]
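
For what it’s worth, a “wrapper” in this sense is presumably just metadata embedded in the page’s HTML, invisible to the reader but legible to software. The A.P. hasn’t published the details, so take this as a purely hypothetical sketch of what such markup might look like, with made-up class names and a made-up registry URL:

  <div class="news-wrapper" data-rights="licensed-use-only">
    <span class="source-org" style="display: none;">The Associated Press</span>
    <p class="article-body">Article text goes here…</p>
    <!-- hypothetical 1x1 tracking beacon reporting usage back to a registry -->
    <img src="http://registry.example.com/beacon.gif" width="1" height="1" alt="" />
  </div>

Highlight the article text, paste it into an email or another page, and every bit of that markup, beacon included, stays behind.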

Footnotes:
  1. not that it matters, but John Gruber, always an astute observer of these sorts of matters, agrees with me
  2. corrected the URL, oopsie
