Naturally, this title caught my attention.
Intrigued, I read on.
The article argued that with the advent of Big Data, we no longer need to waste effort putting together possibly complex and time-consuming data models. In this transformed data landscape, everything can be developed rapidly to meet a specific set of requirements. As soon as the next set of requirements arises, any previous development that does not fit is jettisoned and replaced.
Part of me wondered at this thinking: can the world’s organisations now really operate in this way because the data landscape has been so utterly transformed by Big Data?
Datageddon?
I remembered back to similar articles I had read over the years. Each repetition of this death knell has accompanied the advent of a major new paradigm on the data landscape. Examples include the relentless advance of Anything as a Service, the Internet, Agile development, the Cloud and, of course, the Data Lake.
But still the data model has survived.
My mind started to sift the evidence that I have personally witnessed. Big Data certainly has had a significant impact on organisations that require data to support their operations, and therefore also on their data models. But I can see no indication that even this seismic shift in data collection and analysis has caused these core organisational definitions to die out. In fact, for many organisations, a key outcome of adopting Big Data has been the exact opposite: it has led to a realisation of their importance.
The explosion of technical innovations that has transformed data usage by organisations has fundamentally altered the way that data models are required to support them in this data-rich environment. Whereas in previous times they may have been viewed as an ‘unwelcome’ but necessary part of development, they are now being recognised for what they truly are: a definition of an organisation’s operational lifeblood. There is now a realisation that data models allow an organisation to ‘know thyself’.
Data Models’ Core Role
Arguably, data models are now more important than ever. It is only with a full and agreed understanding of the ‘What?’, ‘When?’ and ‘How?’ of an organisation’s data structures and flows that we can contemplate plugging COTS products together, implementing in the Cloud, ingesting data into our Data Lakes, or reporting across the Enterprise system landscape.
In the last decade, data models have made the transition from being ad hoc and limited in scope to becoming a central pillar of the Enterprise Data Architecture landscape. A beneficial by-product of the adoption of Big Data is thus the realisation that its true benefit cannot be delivered without being able to correlate the meaning derived from Big Data analysis with the organisation’s Master Data Domains.
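To make that correlation point concrete, here is a minimal, purely illustrative sketch. Every name and record in it is hypothetical; it simply shows how raw, high-volume event data only acquires organisational meaning once it is joined to a governed Master Data Domain:

```python
# A purely illustrative sketch: raw "Big Data" events carry only opaque
# identifiers; they become meaningful once correlated with a Master Data
# Domain. All entity names and records here are hypothetical.
from dataclasses import dataclass

@dataclass
class Customer:
    """An entity from a hypothetical Customer Master Data Domain."""
    customer_id: str
    name: str
    segment: str

# Master Data: the agreed, governed definition of 'Customer'
customer_master = {
    "C-001": Customer("C-001", "Acme Ltd", "Enterprise"),
    "C-002": Customer("C-002", "Bob's Bikes", "SMB"),
}

# Raw events ingested into a Data Lake: high volume, low context
events = [
    {"customer_id": "C-001", "action": "viewed_pricing"},
    {"customer_id": "C-002", "action": "raised_support_ticket"},
    {"customer_id": "C-999", "action": "viewed_pricing"},  # not in Master Data
]

# Analysis yields organisational meaning only where each event can be
# correlated with the Master Data Domain.
for event in events:
    customer = customer_master.get(event["customer_id"])
    if customer:
        print(f"{customer.name} ({customer.segment}): {event['action']}")
    else:
        print(f"Unmatched event {event}: no meaning without Master Data")
```

The unmatched event is the point: without trusted Master Data to join against, Big Data analysis produces activity counts, not insight.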
This has driven organisations to focus on bringing their Master Data under control, and with it the processes that manage it. So, I took what I could from the article, finished my sandwich, and resolved that it was now time to start writing the book that I had always wanted to. Its purpose would be to:
- Explain the power of data models
- Describe the easy steps required to define and quality-assure them, and
- Define the processes that harness their power to deliver maximum benefit
I finished the book a few years ago. It is entitled “The Data Model Toolkit – Simple Skills To Model The Real World” (ISBN-13: 978-1782224730) and is available at all the usual retailers, including Amazon UK and Amazon US.
It is the companion in a growing series including:
- “Enterprise Data Architecture – How to navigate its landscape” available from Amazon UK and Amazon US.
- “datagility – powering success for tomorrow’s organisations” available from Amazon UK and Amazon US.
I originally published this post on LinkedIn on March 5th, 2017, but it could have been written in 2007, and I feel it is just as relevant today.