Maybe it’s a coincidence, but over the last few weeks I have been having more conversations, or reactions to escalations, than usual: a project in the red, an individual’s performance not being up to the mark, further failure meaning losing the opportunity and the client, questions on who owns the program, solution and delivery, finding a rock-star to manage one or two such programs, and so on.
Zooming out, I found varying levels of leadership quality, and both successes and failures.
I plan to share my views on these. Look for them in this space.
The previous post, Big Data beyond the hype, introduced the three V’s – Volume, Velocity and Variety. In this post, let us look at why volume matters.
One size does not fit all
If you are a data architect or a data modeler, when you design a system or model a business process, the first order of business is to understand the data flow. Such an exercise naturally turns toward understanding data volume. The second aspect is that, where such models have already been built, data architects and modelers periodically check the ‘current applicability’ of their models, since data tends to grow and change constantly.
Is this important, and if so, why? Yes, and here is why.
A data model is not set in stone
The assumption that once a business process has been modeled, the model will stand the test of time as the business develops is wrong. Even when nothing else about the model changes – that is, business expansion and process changes have not influenced it – data volume alone can dictate revisiting the model, and in some cases will dictate changes as well. A fully functional data model can bring business processes and systems to a grinding halt if it does not address growth in data volume and misses out on volumetrics.
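To make the volumetrics point concrete, here is a back-of-the-envelope projection a modeler might run when revisiting a model. This is a hypothetical sketch – the table, row size and growth rate are invented numbers, not from any real system:

```python
# Hypothetical volumetrics check: project how large a table grows over
# several years, assuming daily row counts compound yearly.

def yearly_volume_gb(rows_per_day: int, bytes_per_row: int,
                     growth_rate: float, years: int) -> list:
    """Return projected table size in GB at the end of each year."""
    sizes = []
    total_rows = 0
    daily = rows_per_day
    for _ in range(years):
        total_rows += daily * 365          # rows added this year
        sizes.append(total_rows * bytes_per_row / 1e9)
        daily = int(daily * (1 + growth_rate))  # volume compounds yearly
    return sizes

# An illustrative orders table: 100k rows/day, 500 bytes/row, 40% yearly growth.
projection = yearly_volume_gb(100_000, 500, 0.40, years=5)
```

A model that looked perfectly adequate in year one may need partitioning, archiving or restructuring by year three of such a projection – which is exactly why the ‘current applicability’ check matters.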
Current state of data
Companies big and small rely more and more on collecting data about more aspects of their business, more about their customers, and more about their buying patterns. The e-commerce and mobile revolutions have made it possible for businesses to get a microscopic view of a transaction along with the profile of the buying or interested customer. Mixed with the influence of social networks, it is a data deluge.
More than ever, businesses are collecting information of all kinds – tags and identifiers, signals and readings from machine parts, location coordinates from mobile devices, transactional data, customer demographics, selling medium, buying patterns, geographical trends, effective promotions, cross-sell influences and so on. That brings in a lot of data. And that puts a lot of stress on poorly designed systems and data models.
Advancements in Data Modeling
A data model reflects the business and its processes, and is constructed to efficiently manage the data that is collected. Efficiency is measured by whether the business can get actionable intelligence out of it. Data architects are exploring, innovating and introducing new ways to model information systems. As the business grows, the data model evolves so that it can address and manage change while still serving the organization’s business analysts and data researchers. No longer is a modeler confined to a single model or structure – ER or otherwise. Advancements in DBMSs (Database Management Systems) also make it possible to harness the power of the underlying hardware – client-server clusters built out of commodity hardware, or appliances built for a specific purpose.
Solutions after modeling
When data volume grows into terabytes and petabytes, modeling demands newer approaches and solutions. While legacy models solved, and still solve, real problems, those problem statements were different. Expectations from such systems were different and mostly limited in size, so expansion using such solutions is limited too. Newer expectations are a different problem to solve, and hence they demand newer solutions.
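One common technique in these newer solutions for coping with volume is horizontal partitioning, or sharding – splitting one logical table across many nodes by key, so each node holds only a fraction of the data. A minimal sketch of hash-based sharding (the shard count and keys here are illustrative, not from any particular product):

```python
import hashlib

NUM_SHARDS = 4  # illustrative; real systems size this from volume projections

def shard_for(key: str) -> int:
    """Map a record key to a shard using a stable hash, so the same key
    always lands on the same node no matter which client computes it."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_SHARDS

# Spread records across shards; each shard stores only part of the volume.
shards = {i: [] for i in range(NUM_SHARDS)}
for customer_id in ("cust-1001", "cust-1002", "cust-1003", "cust-1004"):
    shards[shard_for(customer_id)].append(customer_id)
```

The point is not the hash function but the shift in thinking: instead of scaling one monolithic model upward, the model itself is designed to spread data outward – the essence of how Big Data systems absorb volume.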
And this is the case for Big Data solutions dealing with volume (one of the three V’s). We will look at velocity and variety in detail, and then revisit volume to explore how Big Data solutions deal with them all.