Thursday, November 26, 2009

News Corp vs Google is merely a battle...

... the war is between old and new.

To quote Mark Sigal from O'Reilly:
"Analog (old) media is all about managing scarcity by controlling distribution."
"Digital (new) media ... content, in tandem with un-tethered distribution and pretty good search/retrieval functions, operates in complete disregard for the old media-based pricing models that preceded it."

And as we know, when they meet.... the results are very disruptive!

It's therefore an analog vs. digital fight we are in the midst of. Up until now, digital has won out over the protectionist activities of 20th Century analog-based business models.

The hard-fought battlegrounds have been:
  • music
    A clear win to digital, where iTunes and the MP3 format were the tank and gunpowder
  • small ads
    Another clear win to digital, with a soldier called Craig and the mighty eBay the victors
  • radio
    A draw, where radio has been allowed to live (but limp in its damaged vehicle with a license that could expire at any time in the future)

But.... the war is getting closer and closer to the centrepiece of the analog media's territory.... television. So the analogs have drawn their line of battle along the far-reaching fields of newspaper control.

It is here they have decided to stand and fight back, to preserve what they have in one last and (possibly) long battle... or risk losing everything (AKA: mainly the value of their shareholdings in their companies).

Tuesday, November 24, 2009

Is delisting from Google a financial model?

I've recently mentioned the Forrester report that states that 80% of Internet users would not bother to pay for newspaper content online. Yet the belief at News Corp is that there is still a model to be had from sticking up a pay wall and hiding your content from Google (and charging Microsoft for the privilege of displaying your content).

Perhaps this is the "rewriting the economics of newspaper" that James Harding, editor of The Times, talked about last week? Here he talked about charging a fee for a 24-hour view of the newspaper online (figures of around £1 have been mentioned, though obviously an annual subscription would work out at less than this per day).

But is there a possible financial model to be had from de-listing from Google? Well... Bill Tancer at Hitwise has done this work, albeit taking just one News Corp newspaper as an example... The Wall Street Journal.

His findings are that although the site gets over 15% of its traffic from normal search (which quite possibly isn't that valuable to Mr Murdoch, as these are predominantly brand searches, not searches for content), it gets 11% of its traffic from Google News. That must surely account for 11% of all landing-page advertising inventory, plus any successive pages that the visitor then moves on to?
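As a very rough back-of-envelope sketch of what that 11% might mean: only the Google News share below comes from the Hitwise figures quoted above; the visit, page and ad-slot numbers are purely illustrative assumptions.

```python
# Back-of-envelope estimate of ad impressions at risk from delisting.
# Only the 11% Google News share comes from the Hitwise finding above;
# every other number here is an illustrative assumption.

def impressions_at_risk(monthly_visits, google_news_share,
                        pages_per_visit, ads_per_page):
    """Ad impressions per month attributable to Google News referrals."""
    referred_visits = monthly_visits * google_news_share
    return referred_visits * pages_per_visit * ads_per_page

# Hypothetical inputs: 20m visits/month, 11% via Google News,
# 3 pages per visit, 4 ad slots per page.
risk = impressions_at_risk(20_000_000, 0.11, 3, 4)
print(f"{risk:,.0f} ad impressions/month at risk")
```

Swap in real analytics figures and a CPM rate and the revenue at stake falls straight out of the same sum.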

So.... unless Bing can provide this sort of replacement revenue to News Corp.... how are they hoping to make more money (not less) from de-listing from Google and going to a Bing-only search engine model?

Obviously the answer is that they are hoping to make further money from their paywall subscriptions. Although the WSJ may need to seriously consider whether its readers will be happy about paying for their paper as well as having to deal with intrusive/distracting in-page advertising.

And now the UK paywall model for local content is about to be put to the test. Today, news has surfaced that local newspaper publisher Johnston Press will start an experiment to charge for access to its weekly websites from next week, although they have yet to announce that they are de-listing their content from Google.

What would delisting news from Google mean?

With the battle between News Corp and Google (in reality the whole of the open web) now warming up, thanks to Murdoch and Microsoft getting cosy, it is perhaps prudent to take a look at exactly the impact that delisting news sites from Google would have.

This is exactly what a German investigation unearthed when looking at what the effect on Google would be if most of the country's publishers delisted their content from it.
The results found:
five percent of the top 10 (Google search) results came from the news organisations - and this is with publishers co-operating with Google.
It is therefore likely that the effect on Google would be minimal and the content gap would simply be filled by other news sources. This would be especially true if it was only News Corp that made its content unavailable to the whole web, putting it behind a paywall or making it available only via Bing.

Online market shifts from PPC to meta search

An article from Travolution caught my eye recently.

Apparently there is a view that some of the online travel marketing spend is moving away from PPC (pay per click) advertising on the search giants to metasearch travel sites (e.g. Kayak).

According to Wouter Blok from a European hotel site, conversions from such sites have quadrupled.

However, is this conversion rate standard for the rest of the industry or has Mr Blok just not used PPC correctly?

Monday, November 23, 2009

News Corp and the battle to charge for content

It's been several months now since Rupert Murdoch told a press conference that he planned on charging for his papers by next June. However, he has recently informed those that will listen that this deadline is no longer likely.

However, this battle to make content on newspaper sites such as The Times and The Sun chargeable looks like it's dragging in other media players.... well, it has to, or else the existence of free news elsewhere will mean that most people simply won't visit news sites if it costs them.

First it was the Telegraph group that Murdoch indicated he was in discussions with, when he told a Telegraph journalist (who surprisingly didn't report this slip) what he was up to.
Note: As Roy Greenslade points out, I'm sure it's more than a little anti-competitive to have discussions with your opposition, as well as somewhat foolhardy to admit to it for all regulatory bodies to hear.

Now it looks like Rupert is talking with Microsoft in the hope that Steve Ballmer will pay him money for his content if he removes it from Google's listings.
As so many web commentators have pointed out... it is extremely easy to de-list all News Corp content from Google, by sticking a small file containing a couple of lines of code on each website. But surprisingly, despite calling Google names (he's obviously run out of sticks & stones this year, perhaps after losing so much on MySpace), this instant change hasn't been made. Perhaps News Corp, even temporarily, needs Google!
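For the curious, the small file in question is robots.txt, placed at the root of each site. A minimal version that would block Google's crawler entirely looks like this:

```
# robots.txt - blocks Google's crawler from the whole site
User-agent: Googlebot
Disallow: /
```

Any site owner can do this today, at zero cost, which is rather the commentators' point.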

Yes, soon Bing could contain News Corp's lovely content. This could increase its share of the search engine market, giving it potentially more tabloid news to display in local searches. I can't wait!

Friday, November 20, 2009

Dow Jones new tool to engage bloggers and journos

For too long PR and other media relations people have needlessly spammed journalists and (more recently) bloggers with their irrelevant releases.

Emails titled "For Immediate Release" constantly fall into our inboxes in the hope that we will write about the latest product or service. But most are sent with little actual targeting of their subject or audience and are quickly dismissed and deleted.

But with the release of the Dow Jones Media Relations Manager, this could all change..... hopefully.

This tool apparently understands what journalists and bloggers are covering and enables them to be contacted with relevant and personalised messages.

One does wonder whether this is an automated or human process, how often the index is updated, and how bloggers & journalists can be added to or removed from this list. Otherwise it may end up as yet another spamming tool whose output automatically lands in the deleted folder, saving a lot of us the effort of putting it there ourselves!

in reference to: Dow Jones Media Relations Manager

Thursday, November 19, 2009

a billion dollars, one nickel at a time

Now we know newspapers are in trouble. Ad revenues are declining, readership is dwindling and they are gradually being left behind as a trusted source of unbiased reporting (not to mention the debts they have from bad investment decisions made less than a decade back, when they didn't read the writing on their 'Facebook' walls*). So forget them making serious money quickly again; in fact... forget them making any serious money over a longer time period either!

* Yes, I know Facebook wasn't round a decade ago, but you understand what I mean!

And who do they blame for their troubles? Well.... Google of course (my particular favourite is Robert Thomson, editor of The Wall Street Journal, calling search sites such as Google “tapeworms.” - I jest-ye-not!).

It's all their fault for, errr....:
- allowing people to search openly across a number of sites & archives
- aggregating their headlines alongside other news sources
- making content more transparent (AKA removing the smoke & mirrors)
- providing links back to their sites
Yes, the site that takes tiny amounts of money from advertisers per use is the biggest threat to newspaper revenue and existence.

However, can the newspapers learn from Google? Can they actually create and maintain a local or hyper-local news search service that would rival and better the search giants?
Surely they have the content, and could (or perhaps could once have) allowed people to search openly across their own archives, and could also have charged micro-payments for this. Or, to quote author and blogger John Battelle, newspapers once had the chance to earn "a billion dollars, one nickel at a time".

Tuesday, November 17, 2009

Page response times

Our work at Ideal Interface leads us to speak with many different clients who want a leading ecommerce website. So whilst specifying this solution we try to define the functional requirements (such as what interactivity the page needs to provide) along with the non-functional requirements... such as what the page download / response times should be.

Now... in my experience it takes some explaining to define what you actually mean and how you're going to measure it before you are likely to get agreement from the client.

Firstly, it's important to understand that all things on the Internet are definitely not equal. Connection speed (bandwidth), latency, the browser you are using and the speed of your device all contribute to the differences between one user's experience and another's.

Q: So, what is an acceptable page download time?

A: This depends on who you ask

For a long time I have used the words of Jakob Nielsen, the web's foremost usability expert, who tackled this subject several years ago. He puts the limits of user attention at between 1 and 10 seconds; after that, user flow is broken and users tend to leave the site.

More recently, two seconds has been given by Akamai as the new average online shopper's expectation for a web page to load (I guess you would expect this from a company that provides fast Internet delivery services!).

However, there is no doubt in my mind that users' expectations of page download times are steadily rising. So just because bandwidth speeds are increasing, that is no reason for site owners to increase page sizes accordingly (or to provide complex or badly-written client-side code that takes ages to render in the browser).
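To make those thresholds concrete, here is a small sketch. The category names are my own; the cut-offs are the 1 and 10 second Nielsen limits and Akamai's 2 second figure quoted above.

```python
# Classify a measured page load time against the response-time limits
# discussed above: Nielsen's 1s (seamless) and 10s (attention lost)
# thresholds, plus Akamai's 2s online-shopper expectation.

def classify_load_time(seconds):
    if seconds <= 1.0:
        return "seamless"            # user perceives no real delay
    if seconds <= 2.0:
        return "meets expectation"   # within the 2s shopper expectation
    if seconds <= 10.0:
        return "flow interrupted"    # noticeable delay, flow broken
    return "user likely lost"        # beyond the attention limit

for t in (0.4, 1.6, 5.0, 12.0):
    print(t, "->", classify_load_time(t))
```

A non-functional requirement then becomes testable: e.g. "95% of measured page loads must classify as 'meets expectation' or better".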

Monday, November 16, 2009

UK eCommerce continues to get more usable

In the run-up to Christmas, online usability agency Webcredible have released their latest report evaluating the websites of 20 of the UK's leading high street retailers. The online usability research shows that once again WHSmith have retained the top spot, but this year they have been joined there by Marks & Spencer.

The report makes fascinating reading, not only for some great insight (e.g. how Waterstones provides contextual search within a specific product category - which I find particularly useful) but also because it still misses out some of the major online high street players (e.g. River Island).

Sunday, November 8, 2009

Online retail will be 20% of all UK sales

Despite the economy shrinking by 0.4% in the last quarter and by 5.9% overall in the last year, and with the UK now officially entering its longest period of recession in 54 years, there's some good news....

According to a recent report from Kelkoo, £8.9bn will be spent online this Christmas - that's 20 pence for every £1 spent!

In a report by the Centre for Retail Research on behalf of Kelkoo, online shopping is anticipated to grow by 24% on last year!
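A quick sanity check on the arithmetic behind these figures, using only the numbers quoted above:

```python
# Quick arithmetic on the figures quoted above: £8.9bn online spend at
# 20p per £1 of total spend, growing 24% on the previous year.
online = 8.9          # £bn online this Christmas
share = 0.20          # 20 pence in every pound
growth = 0.24         # 24% year-on-year growth

total_spend = online / share          # implied total Christmas spend
last_year = online / (1 + growth)     # implied online spend last Christmas

print(f"Implied total spend: £{total_spend:.1f}bn")
print(f"Implied online spend last Christmas: £{last_year:.2f}bn")
```

So the Kelkoo figures imply roughly £44.5bn of total Christmas spend, and a little over £7bn spent online last year.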

More here:

Thursday, November 5, 2009

Google enters the commerce search arena

Yesterday Google announced their own Commerce Search application, which competes with the likes of Omniture Merchandising (previously Mercado Search).

This is an enterprise product search service that is hosted by them.

It is similar to Mercado, in that it has:
- Faceted navigation
- Business rules for prioritisation
- A spellchecker, synonyms & recommendations
- An XML API for exporting results back into your own website

It even has a built-in shopping cart if required, and pricing is based on the number of products/SKUs and the number of searches.
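To illustrate what faceted navigation involves, here is a toy sketch in Python - emphatically not Google's or Mercado's actual API, just the underlying idea of filtering by selected facet values and recomputing the remaining counts:

```python
# Toy faceted navigation: filter a product list by selected facet
# values, then recount the values left for the other facets.
from collections import Counter

products = [
    {"name": "Boots",    "colour": "black", "size": "8"},
    {"name": "Trainers", "colour": "white", "size": "8"},
    {"name": "Sandals",  "colour": "black", "size": "6"},
]

def apply_facets(items, selections):
    """Keep only items matching every selected facet value."""
    return [p for p in items
            if all(p.get(f) == v for f, v in selections.items())]

def facet_counts(items, facet):
    """Count how many remaining items carry each value of a facet."""
    return Counter(p[facet] for p in items)

matching = apply_facets(products, {"colour": "black"})
print([p["name"] for p in matching])   # the black products
print(facet_counts(matching, "size"))  # sizes still available
```

A real service layers business rules, spell-checking and synonyms on top of exactly this kind of filter-and-count loop.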

Wednesday, November 4, 2009

Google interview questions

  • How many golf balls can fit in a school bus?
  • You have to get from point A to point B. You don’t know if you can get there. What would you do?
  • How much should you charge to wash all the windows in Seattle?
What do these questions have in common? Well, they are apparently part of the series of questions you get asked when you interview for a job at Google.
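For fun, here is a Fermi-style sketch of the first question. Every figure is an assumption, and the point of such questions is the reasoning rather than the answer:

```python
# Fermi estimate: golf balls in a school bus. All inputs are rough
# assumptions; only the order of magnitude is meaningful.
import math

bus_volume = 2.5 * 2.0 * 10.0     # m^3: ~2.5m wide, 2m high, 10m long
ball_diameter = 0.043             # m: a golf ball is ~4.3cm across
ball_volume = (4 / 3) * math.pi * (ball_diameter / 2) ** 3
packing = 0.64                    # random close packing of spheres

balls = bus_volume * packing / ball_volume
print(f"Roughly {balls:,.0f} golf balls")
```

That lands somewhere in the high hundreds of thousands - and in an interview, stating each assumption out loud matters far more than the final number.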

There are various versions of these questions around the web, such as:

Tuesday, November 3, 2009

£10,000 a year plus £155 per B2B user

That's the fee that the Newspaper Licensing Agency want to start charging NewsNow... (for now).
"I don't think it is being unduly greedy to suggest that some of this comes back to us on behalf of the organisations creating the content"

states Andrew Hughes, the NLA's commercial director (note the use of "unduly" there).

But what reciprocal value does the NLA place on the links to its member sites? If only this value could be quantified...

in reference to: "NLA's commercial director"

Monday, November 2, 2009

Newspaper Club - limited print idea

Newspaper Club is a service due to launch in early 2010 with the aim of helping people make their own newspapers.

In a digital age, where some newspapers are closing and most are losing money, it's possibly surprising to see a print-based proposition being launched. However, what makes this service different is the product... a limited-run (five to five thousand copies) tabloid-style paper in a generic twelve-page format... but printed on existing commercial presses.

Could this be the final manifestation of the hyperlocal newspaper?

Well, possibly. Already the Cabinet Office has tasked them with producing a publication for just one East London postcode.

in reference to: "Newspaper Club" - Helping people to make their own newspapers