Ever since the first book was published, I have been an ardent follower of the Dune series of books written by Frank Herbert. If ever there was a tale worthy of the talents of famed New Zealand director, Peter Jackson, then this is it – once he wraps up the latest Hobbit trilogy, perhaps he will turn his hand to Dune! Travelling across the galaxies? No problem, let’s just fold space! Lost a valued colleague? Not a problem either, just order up a clone! Central to the story line is spice – a very special spice that’s required to sustain the Mentats, “human computers”, as well as the Spacing Guild “navigators”.
For those not familiar with the story line of Dune, spice is the key that unlocks the universe as well as holds it captive; it’s at the very center of intergalactic intrigue and is only found on one sandy planet, Dune. Turn a page in any of the books and there’s barely a paragraph without some reference to the planet and the treasures lying hidden beneath its sandy dunes. So it is today, too, as we look at the work being done by the world’s data centers – highly valued business gems lying buried under data dunes.
If we were in the midst of a general election involving all of IT, it would be easy to sum up the issues of the day under one general declaration – it’s the data, stupid! So much of what’s driving technology is in response to the unrelenting upward escalation in the volume of data being generated. No longer can we afford to simply throw data away. It’s scary to think of just how much useful business information fell through the cracks in former times, just as it’s equally scary to think of what we may learn if we pull into our daily operational lives even more of the data currently accumulating outside our sphere of operations.
“A number of years ago while running extremely high volumes of airline shopping queries through NonStop being monitored by Prognosis from IR; we were literally dumping valuable log data in the bit bucket. Locked within the log data was what the traveling public was asking for, what they were shown, and what they bought with regards to air travel,” observed former Sabre IT executive (and former ITUG Chairman), Scott Healy. “This stream of data, well actually more like a raging torrent of data, could have been analyzed and used to create actionable, real time business intelligence for airlines.”
Data is attracting the spotlight and dominating the stage at almost every technology event being promoted these days, and with good cause. What we see today is a greater awareness that the answers lie in what we already process: connect the dots, dig deep into historical data and into what others are saying, and a more complete picture can be painted. As Healy was only too keen to add, “What would have been interesting would have been to show the airline customers insights gleaned from a day of the log data, or perhaps a few days. Find out what they were interested in (vis-à-vis willing to pay for) and what data should be summarized and stored for future trending analysis.”
The fall-out from recent tragic international events involving aircraft exposed just how much machine-to-machine communication was already happening. The network of sensors communicating operational information around the clock surprised many unfamiliar with the subject, and yet we have only just scratched the surface. Modern automobiles are producing enormous amounts of data, as are the buildings and factories around us. So is infrastructure, from power grids, to the distribution and subsequent treatment of water, to the stocking of retailers’ shelves: it all involves data being generated and passed over networks that, for the most part, are public.
While storage manufacturers are happy to see this unrelenting upward escalation in the volume of data being generated, there really are limits to just how much data any one enterprise ends up retaining. Policies are already in place that limit the amount of data retained to just a day, or a week, or even a couple of months. However, when it comes to national security, medical research and even public records, arguments can be made to keep every bit of data that passes through the processing environment. After all, forensic mining of data for greater insight into trends has become an industry in its own right.
For the NonStop community this is proving more than a fad. It’s vitally important to all parties that better insight is realized, whether it’s insight about the business, about the market, or even about IT operations itself. Acting insightfully is not just an innovative move but a mandatory one for most enterprises, and it’s only with access to data – all the data – that this degree of insight can be achieved. In my most recent discussions with vendors this isn’t being overlooked; it has become the subject of initiatives aimed at better addressing this requirement.
Yes, it’s the data, stupid! So no, it’s not acceptable to dump valuable log data in the bit bucket. However, it is equally unacceptable to blindly store everything without due consideration of its value. This is something I touched on in my previous post, where I noted how, for many enterprises relying on NonStop systems, the trend is towards “capture, analyze, store”, with mechanisms in place that allow data already analyzed and stored to be reused by processes even as additional data is being captured.
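To make “capture, analyze, store” a little more concrete, here is a minimal Python sketch of the pattern as I understand it; the event fields, the per-route summary and the in-memory store are all hypothetical stand-ins for whatever an enterprise would actually capture and persist:

    import json
    from collections import Counter

    # In-memory stand-in for the "store" step; in practice the summaries
    # would live in NonStop SQL or another persistent store.
    summary_store = Counter()

    def capture(raw_line):
        # Capture: parse one raw log record as it arrives.
        return json.loads(raw_line)

    def analyze(event):
        # Analyze: keep a running count of shopping demand per route.
        route = (event["origin"], event["destination"])
        summary_store[route] += 1

    def enrich(event):
        # Reuse: fold previously stored analysis back into newly
        # captured data, at the time that data is being captured.
        route = (event["origin"], event["destination"])
        event["demand_so_far"] = summary_store[route]
        return event

    for line in ['{"origin": "DEN", "destination": "SFO"}',
                 '{"origin": "DEN", "destination": "SFO"}']:
        evt = capture(line)
        analyze(evt)
        print(enrich(evt))

The point of the sketch is simply the loop: nothing pertinent is thrown away, the analysis runs as the data streams past, and the stored summaries are immediately available to enrich the very next record captured.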
I hadn’t yet joined Tandem Computers when the news began to percolate through the industry that Tandem was working on a permanently available implementation of SQL. Overseen by the late, great Dr. Jim Gray, NS SQL challenged many traditional beliefs about SQL implementations, not least with its mixed-workload capabilities that allow NS SQL to keep on processing even as DBAs run the all-important statistics (needed to keep SQL “in tune”). As I reported a few years back with respect to one large NonStop user: “maintenance? Truly, we run reorgs, statistics, splits, column adds, etc. all without taking anything down. It’s the NonStop fundamentals!”
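For readers who prefer to see it rather than read about it, here is a minimal sketch of that kind of online maintenance, assuming an ODBC connection into NonStop SQL/MX (which ODBC/MX provides); the DSN, table and column names are hypothetical placeholders, and the point is simply that statements like these run while transactions continue:

    import pyodbc

    # Connect over ODBC; the DSN below is a hypothetical placeholder.
    conn = pyodbc.connect("DSN=NONSTOP_SQLMX")
    cursor = conn.cursor()

    # Refresh optimizer statistics online; readers and writers keep running.
    cursor.execute("UPDATE STATISTICS FOR TABLE payments ON EVERY COLUMN")

    # Evolve the schema without an outage: add a column in place.
    cursor.execute("ALTER TABLE payments ADD COLUMN risk_score NUMERIC(5,2)")

    conn.commit()
    conn.close()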
NS SQL did exploit the fundamentals, but all those years ago Gray was aware that IT dynamics were evolving, that there would be even more data to process, and that transactions themselves would become even more data intensive. Indeed, after Gray’s passing, Microsoft announced that it would create the Jim Gray eScience Award for the researcher “who has made an outstanding contribution to the field of data-intensive computing”.
Furthermore, according to Microsoft, “In a lecture he delivered just 17 days before he went missing, Jim outlined the increasingly important challenge and opportunity afforded by the availability of previously unimaginable volumes of data and continuous research dedicated to creating new understanding of the world around us.” In other words, even the acknowledged father of NS SQL recognized that ahead of us would be an unrelenting upward escalation in the volume of data and, with these unimaginable volumes of data, the opportunity to provide unimaginable insight into all that transpires in an enterprise.
“Augmenting a transaction and giving it access to additional information for greater insight,” WebAction EVP, Sami Akbay, advised in last month’s post (I need data I can digest, in small bites, please!) before adding, “seems to be a reasonable request and one we are taking seriously.” I am referencing WebAction more often these days as I pursue the story of data, and it’s no coincidence that its executive team is made up of former GoldenGate executives. Nor is it a coincidence that they see greater integration of data from disparate sources as the logical next step following data replication and data integration. Like Gray, WebAction appreciates that, in fully understanding the business world around us, the importance of data cannot be overstated.
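As a rough illustration of the idea Akbay describes, consider this small Python sketch that augments a transaction with context drawn from previously analyzed data; every name in it is a hypothetical stand-in, not WebAction’s actual API:

    from dataclasses import dataclass

    @dataclass
    class Transaction:
        card_id: str
        amount: float

    # Reference data produced by earlier analysis, e.g. a 30-day average
    # spend per card; here it is just a hard-coded dictionary.
    avg_spend = {"card-123": 42.50}

    def augment(txn: Transaction) -> dict:
        # Give the transaction access to additional information at the
        # moment a decision about it has to be made.
        baseline = avg_spend.get(txn.card_id, 0.0)
        return {
            "card_id": txn.card_id,
            "amount": txn.amount,
            "vs_30day_avg": txn.amount - baseline,
        }

    print(augment(Transaction("card-123", 99.00)))

The design choice worth noting is that the enrichment happens inside the transaction flow, not in an after-the-fact batch report, which is what turns stored history into actionable, real-time insight.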
Mobility, IoT and M2M exchanges, Big Data and Clouds are all involved in data – generating data, processing data and storing data – and it should come as no surprise to anyone in the NonStop community to read of the innovative solutions under development that continue to include NonStop systems. IR, comForte, WebAction and OmniPayments are just a few of the vendors that I know first-hand are involved in some aspect of data-intensive computing. We can take our eyes off numerous initiatives and not be hurt too badly, but if we become blasé about the importance of data, we run the risk of being blind-sided by our competitors.
Yes, it’s most definitely all about the data, stupid – so pay attention, as there’s plenty of data suggesting this is a development not to be ignored! I remain puzzled, if not shocked, to find so many members of the NonStop community simply ducking the issue. Raise your hands, crash meetings and do whatever it takes to be part of your company’s plans – data from anywhere and everywhere is critical to the future of providing meaningful results from transaction processing!
The picture (above) wasn’t taken at some desolate spot on earth but rather nearby, tucked under the shadow of the Rocky Mountains about 200 miles from Boulder and a must-see for even the most casual tourist. Who knows what treasure lies beneath those sandy dunes? Likewise, there are gems aplenty buried within the monstrous data dunes rising from the floors of today’s data centers!
Comments
It also occurs to me that OLTP systems have the same opportunity to "enrich" buying experiences in a similar way by adding real-time analytics to the transaction flow and posting relevant "opportunities" as part of the order completion message.
Do you have any recent examples, Dean, of NonStop users adding such analytics? I am more than interested in this matter :-)