During the recent Atmosphere 2010 conference, Eric Schmidt, CEO of Google, stated:
‘Between the birth of the world and 2003, there were five exabytes of information created. We [now] create five exabytes every two days. See why it’s so painful to operate in information markets?’
In March, in its special report ‘Data, data everywhere’ (subscription required), The Economist investigated the extent and importance of understanding the “unimaginably vast amount of digital information” now flowing through systems around the globe:
‘...This shift from information scarcity to surfeit has broad effects. “What we are seeing is the ability to have economies form around the data—and that to me is the big change at a societal and even macroeconomic level,” says Craig Mundie, head of research and strategy at Microsoft.
Data is becoming the new raw material of business: an economic input almost on a par with capital and labour. “Every day I wake up and ask, ‘how can I flow data better, manage data better, analyse data better?’” says Rollin Ford, the CIO of Wal-Mart...’
When I tell people that we started creating OBASHI during 2001, and that it has taken almost 10 years to finalise the methodology and the software, they usually say something like, ‘You are kidding! But why on earth did you stick with it for so long?’
Well, to be honest, it was clear to Fergus and me back then that the practices of Oil & Gas, where digital sensors were attached to every asset and digital flows (representing product flows) were clearly understood and constantly monitored, would one day become the norm in business generally.
To paraphrase The Economist: ‘the capabilities of digital devices were set to soar and prices to plummet; sensors and gadgets would digitise lots of information that was previously unavailable; and many more people would have access to far more powerful tools.’
At the time many businesses in all sectors were becoming crucially reliant on flows of data to perform. A growing number were becoming ‘data refineries’.
But there were problems. New technologies were being piled on top of legacy systems, often without being properly engineered. Staff turnover meant that knowledge of poorly documented systems was lost. Businesses worked in silos, so nobody could ‘join the dots’ and see clearly how everything was put together to support the flows of data. And that meant that, unlike Oil & Gas, businesses could not put an accurate financial value on each flow of data.
Fergus and I realised that as complexity increased, CxOs would want greater business clarity. But we could see that communication between IT and other parts of the business would remain difficult without a simple ‘big picture’. Companies, and groups of companies, would be unable to see precisely how they worked individually and how they interacted together.
If you don’t know precisely how your business works, how do you know what data is flowing where? Which data flows have stopped? Which have been compromised?
And if these questions cannot be answered accurately, it presents major problems for society as a whole, particularly should a life-critical or economically critical organisation suffer some sort of disaster involving data flow.
At a time when an increasing proportion of money existed digitally (today over 90% of money exists as data), it seemed to us that business clarity would be extremely important in the years ahead. So that’s why we set about creating the tools that would enable people to ‘join the dots’, connect silos, see the big picture, and create business clarity, and to accurately value data flow and the contribution IT makes to the business.
If anything it seems even more relevant today.