Thursday, August 28, 2014

A time to put the hammer down! NonStop accelerating adoption of Clouds and Big Data …

Do we really need to see the big guy wielding a mighty hammer, or are areas of focus for many in IT already being covered by smaller blows leveraging what we already have?

Boulder has been hit with some amazing thunderstorms of late, with another one passing overhead early this afternoon. Rather fortuitous in some respects, as I took time to have a look – boiling clouds, with lightning flashes lighting the sky in an amazing display of nature’s power. Lightning, disrupting the normality of an afternoon; clouds, the source of tremendous energy; no matter where you stood, the display was on a grand scale – big, by anyone’s measure.

At the macro level, prairie thunderstorms are majestic to watch, whether you are out in the open or tucked up alongside a mountain. However, stepping away from nature, there are still amazing sights to take in that are man-made. To me, the unrelenting drive to miniaturization blows my mind – there’s just no real way to equate what is inside a modern chip to anything we can see in the real world. When quantum physicists talk of scale, where 2 to the power of 512 is referenced in the same breath as exceeding the number of atoms in the known universe, then yes, we have come a long way from a system my father showed me in the mid-1960s that had 8K bits of memory, yet could still store a program.

All of this is leading to the topic of the week, and the two subjects are increasingly becoming related – Clouds and Big Data. A short time ago I started a couple of LinkedIn groups aimed at the NonStop community: Clouds, powered by NonStop, which is only a couple of weeks old and already has 271 members, and Big Data, integrated with NonStop, which is newer still and has fewer than 40 members. Including NonStop in the names of both groups was intentional, as I wanted to make sure any new member of the NonStop community searching for groups focused on NonStop would see that there is interest within the community in both Clouds and Big Data. If this is all new to you and you haven’t yet joined, then perhaps you should take a few minutes and do so.

While the community may be concerned about the growing number of LinkedIn groups out there, it’s highly intentional on my part to project an image of NonStop that may surprise folks. Of late I have become selective about the groups I do end up joining, and the metric most important to me is the vitality of a group – just how much is going on. If you have seen some of the discussions, and the passions they arouse, in groups like Tandem User Group and Real Time View, you will know what I mean. Clouds and Big Data are very important for all stakeholders in the NonStop community. Many businesses have embarked on modernization initiatives, so I have to ask you – do your modernization initiatives actively involve Clouds and Big Data?

As a place to start, did you know that vendors well known to the NonStop community have begun taking steps along the path to clouds? “The model today is that cloud computing is inevitable, even among the financial institutions I count as customers,” said OmniPayments, Inc. CEO, Yash Kapadia. “It’s inevitable because it represents a more sensible approach to providing the best value for the dollars spent. Too often we size systems and include software with expectations of the volume and type of transactions a customer will face, but this can lead to erring on the conservative side, with customers ending up paying more. And this is not what we want to do at OmniPayments. Elasticity of provisioning, as well as providing capacity on demand, is just the latest way to express a need for flexibility, and with cloud computing – whether the cloud is on site and private, or is on our premises and managed – it is simply a way to better leverage commodity hardware and open systems.”

The productizing as maRunga of the demo Infrasoft has showcased at the last couple of HP Discover events is just one more proof point. For those embarked on modernization projects where the volume of searches is high, as their clients spend more time looking up items, maRunga is an easy first step to take on the path to clouds. And solutions for NonStop that include support of clouds are already being deployed. “OmniPayments has already been sold as a cloud offering in the Americas and is in an early stage of customer acceptance. In this first instance, there’s both a private cloud on site as well as a managed cloud on our premises for back-up purposes. And yes, NonStop systems are an important part of both cloud computing configurations,” added Yash. “Are we taking liberties with the concept of cloud computing? I don’t believe we are, as the cloud we externalize to the customer is a collection of processors, the specifics of which aren’t visible to the customer.”

The arrival of big data comes at a crucial time for the industry. In an always-connected, online world it’s imperative that we have the right information at our fingertips to sway our clients to more favorably consider what we offer. So often we emphasize two of the Vs associated with big data – volume and velocity – but timing is of the utmost importance when integrating the world of big data with transaction processing, NonStop’s forte. “From our perspective, we should be talking about the ‘time value of information’,” said T.C. Janes, Managing Consultant, Enterprise Solutions & Architecture at Hewlett-Packard. “Time is the enemy of data.”

“Big Data is driving much of the agenda of IT of late – business is fully aware that they need better insight into their business and that only comes when meaningful information can be extracted from the reams of data accumulating all around them,” said WebAction, Inc. Cofounder Sami Akbay. “However, the need isn’t something that can be addressed without understanding that relevant and applicable data, meeting business criteria, needs to be captured and repackaged for consumption by business logic central to the running of the business. When NonStop is taken into consideration this is particularly important as it’s often NonStop applications interacting directly with customers – the very community about which better insight is critical.”

I am particularly interested in the work WebAction has done to date; in delivering the WebAction product, those in the NonStop community embarking on a modernization revamp have a very viable solution on hand, and one that absolutely understands the importance of time. In the August 20, 2014, post Value Networking? NonStop Community Certainly Does! to the WebAction blog, I wrote of how WebAction, by the very nature of the control it ultimately gives to system architects, can generate web actions as verbose or as targeted as we need – it’s really the ultimate control gate when it comes to turning on the flow of big data to time-sensitive, mission-critical applications. A really big hammer? Yes! A very small hammer? Yes, again!

“Certain types of data have enormous value at the moment of creation but may have less value an hour from now and perhaps no value tomorrow. Stock traders have understood this from time immemorial – if you can identify and act on a transaction opportunity faster than anyone else, you will benefit the most,” added HP’s Janes. “If you can synchronize data arrival velocity with business process velocity, an organization can sell more product, deliver better customer service and capitalize on new business opportunities sooner than its competitors. Conversely, an organization does not necessarily want to act on every potential ‘black swan’ event if its evaluation of aggregate data yields a contradictory model.”
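Janes’ “time is the enemy of data” can be made concrete with a simple decay model – a sketch only, with a half-life I have chosen purely for illustration; it is not anything HP or Janes has published:

```python
import math

def time_value(base_value, age_seconds, half_life_seconds):
    """Exponentially decay an event's business value as it ages."""
    return base_value * math.exp(-math.log(2) * age_seconds / half_life_seconds)

# A trading signal worth 100 units at creation, with an assumed 60-second half-life:
assert abs(time_value(100, 0, 60) - 100.0) < 1e-9   # full value at creation
assert abs(time_value(100, 60, 60) - 50.0) < 1e-9   # half the value a minute later
assert time_value(100, 24 * 3600, 60) < 1e-6        # effectively worthless tomorrow
```

The point of the sketch is the shape of the curve, not the numbers: synchronize processing velocity with data arrival velocity and you act while the value is still near its peak.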

There is another side to clouds and big data, and that’s the likely intersection of technologies. When it comes to big data and cloud computing then according to HP Master Technologist, Justin Simonds, “I think there will be a collision, at least for a while, as organizations attempt to derive meaning out of every piece of data they can get their hands on.  Cloud is the only ‘price reasonable’ way to churn through all that stuff.” Furthermore, added Simonds, “I believe there will be a falling away as attempt after attempt yields little business insight.  I see the industry adopting specific products for specific requirements. NonStop, as mentioned, is still extremely well designed for the operational analytic, real time velocity events. Vertica still holds the top magic quadrant spot for deep / big data analytics and for all its bad press Autonomy has some very good IP in the audio/video arena.  I believe multiple solutions are the only way to effectively proceed.”

Simonds’ observations are shared by comForte’s CTO, Thomas Burg. In an interview for an upcoming post to the blog, comForte Lounge, Burg observed that when it comes to “clouds and big data, there’s almost nothing that is really new here either. Cloud computing is little more than old wine in new casks, where the only issue is one of business best practices. Can we run this application in a cloud? Can we store this data in a cloud? Security? Bring it on! We have the technology and know-how, and we are happy to talk to prospective cloud adopters. Big data is a similar challenge, but somehow many NonStop users see only a big hammer when only a small hammer is needed.” On the other hand, “We continue to include NonStop in our plans for WebAction as there’s no question serious online processing is occurring on that platform,” added Akbay. “To ignore NonStop would imply we aren’t taking the integration of Big Data with real-time processing seriously, and that simply isn’t the case.”

It was Thor who “put the hammer down” after Captain America unwisely requested it of him in the film, The Avengers. Watching the lightning storms, it wasn’t hard to imagine legendary beings as the source of such a spectacle of nature. Fortunately, when it comes to IT and NonStop, it’s a little less intense but every bit as important, particularly as so many within the NonStop community are actively engaged in modernization initiatives. In time, of course, we will find broad exploitation of both technologies across the NonStop community (just as we did with Client / Server computing, as well as with the SOA and Web services that followed), with much of what is covered here generating little more than a yawn. And if you want to stay informed about what’s important to the community, then yes, check out the new groups on LinkedIn and become a member – I look forward to all the comments you care to provide!

Wednesday, August 13, 2014

Sun, surf and sand – oh, the good-old-days!

With the sun beating down on my face and with a gentle ocean breeze stirring my remaining hair, I kept returning to thoughts of an independent NonStop company – free to pursue whatever it liked. But with the news that just broke … I’m not so sure this would be in the best interests of the NonStop community!

There’s something about the good-old-days that strikes a chord with many of us. For me, nostalgia is an emotion hiding just beneath the surface and perhaps it’s just a reflection of the many experiences I have had through the years. It takes very little to trigger a rush of memories bursting up from the depths, but a glimpse of the sun, a rolling surf and a sunny day usually does the trick! In the posts to this blog I have shared many of my memories and nearly always, glimpses of these memories were a result of something I had just read.

After more than two decades of involvement with the NonStop community, I have to admit that many of these memories are intimately tied to events that occurred while spending time with the community. The highlights, of course, were the years spent on the board of ITUG, and I am very aware that those good-old-days are long gone, with little prospect of ever being repeated. Like almost everything else that gets tarnished by the label of legacy, I have come to realize that in today’s world, where everyone is connected, relying on an annual big-tent gathering of the faithful for insight into product directions and technology adoption doesn’t need to exist – not in the same format it did for so many years.

Like many of the stakeholders participating in today’s NonStop community, we have watched the NonStop R&D group being trimmed. There’s been good reason for much of what’s been cut, yet the figures can still be a little disconcerting. However, NonStop R&D today is clearly not comparable with what we remember from when it was Tandem Computers – the way HP is organized, tasks have been scattered across many groups. While it is good to know that the finances of the good ship NonStop have been righted, and that the contribution NonStop makes to HP’s bottom line is not something to quibble over, like everyone else I sure would like to see more funds allocated to NonStop.

Ah, the memories! They keep coming back, even as I think back to Friday beer busts and afternoons in the Tandem pool. The printing of tee shirts for each new project, the Tandem television network with its First Friday videos – now available on YouTube – the sense of shared missions, and the recognition that quarter after quarter, major enterprises were buying new systems. However, perhaps the best news of all escapes many of us. In a world of off-the-shelf commodity components, NonStop remains relevant.

When it comes to providing the highest levels of uptime, it’s still the halo product in HP’s portfolio – yes, as HP CEO, Meg Whitman, so succinctly summed up in a video at last year’s NonStop bootcamp, “Today, enterprises operate in a world where the demand for continuous application availability is growing exponentially. The need to choose the right computer for the right workload at the right economics has never been so important … we are on the path to redefine mission critical computing.” And, every bit as importantly for those attending, “Our NonStop customers truly make it matter!”

I have referenced this quote by Whitman several times this year – in posts here as well as in other blogs – and it remains as fresh in my mind as when it was first made. Choosing the right computer for the right workload at the right economics frames the discussion for NonStop now, and in the years to come. Step outside that framework, failing to meet any of the criteria referenced, and the future for NonStop wouldn’t be as solid as it looks right now. Yes, we continue to kick around the impact The Machine will have as the decade comes to a close, but it’s hard to overlook that, even with a brand new OS, consideration will more than likely be given to attributes uniquely NonStop and to the enterprises that depend on NonStop.

Against this backdrop of NonStop and the memories I have of the good-old-days, it was with some disquiet that I read the news item in the August 1, 2014, edition of the Wall Street Journal. Under the heading of Deal With H-P Paves New Future for Old Software, the WSJ reported that HP had “agreed to let a small Massachusetts company – VMS Software – take over further development of OpenVMS, an operating system that originated at Digital Equipment Corp. in 1977. DEC no longer exists, but its technology has lived for years under HP’s ownership and still has passionate users.”

Furthermore, following a couple of announcements HP made last year that future ports of OpenVMS (to faster Intel chips) would not be happening, the WSJ said the “H-P decision stunned organizations that use the software to run sensitive applications, in places like stock exchanges, manufacturing lines and chemical plants.” It then quoted VMS Software’s CEO, Duane Harris, as saying, “Everybody was in a panic,” and that users felt they “suddenly had no future.”

For quite some time, several stakeholders with lengthy ties to NonStop have thought it might be a good thing to approach HP to see if there would be interest in splitting off NonStop and giving it to a company solely focused on its future. On paper, such an idea had merit for any entity with pockets deep enough to fund the needed R&D “in perpetuity”. However, nothing developed, and as I look at this story in the WSJ I am so glad nothing did eventuate. If there is a stronger message to send any community than “yes, you are on your own”, I don’t know of one, and while we all harbor doubts about HP’s stewardship of NonStop, the good news is that NonStop remains an integral part of HP!

My loyalties have wavered through the years. Readers may recall my post of August 30, 2011, Stories we could tell … I was standing in the offices of John Robinson, CEO of SDI (NET/MASTER), when I received offers from both DEC and Tandem (electing, of course, to join Tandem Computers), and today it seems as though working on the fringes of HP was pre-determined! However, any thought that the future of DEC would finish up in the hands of a small software group in Massachusetts was unimaginable, and yet here we are today, watching the winding down of a once-mighty player on the computer stage.

There are a couple of sayings that come to mind at this point. Memories being what they are, I am not all that sure where I first heard them but family does come to mind. “Never set your goals too low in case you achieve them” is what I immediately thought of as I continued reading the WSJ story. For the NonStop community thinking big is still the objective and any thoughts I may have once had about the benefits of separating NonStop from HP are long gone. The simple truth is that to maintain a global reach and to ensure best usage possible of commodity items, it takes a company the size of HP to bring the resultant products to market in a cost-effective manner.

“The right computer for the right workload at the right economics,” seems such a simple observation and yet, with HP giving away OpenVMS as it has done (and as good a technology as OpenVMS had been), it apparently no longer was the right computer for the markets it served. IBM faces much the same dilemma with its midrange computers, the strangely morphed Power Systems (including what was formerly known as the System i; before that, the eServer iSeries; and, going even further back, the AS/400, which has family ties to the System/38 that appeared around the same time as the first Tandem computer), and speculation is rife that it, too, will end up in hands other than IBM’s. NonStop continues to retain just enough “special sauce” to differentiate it at times when even the most adventurous of us have thought NonStop surely couldn’t continue – but it does!

The good-old-days are gone, and gone with them are the difficult times of programming complexities and limited connectivity. What I recall as being part of the good-old-days had little to do with technology and much more to do with the social activities (loved the 1980s!), and as much as I muse about what no longer exists, there’s no escaping the leaps that have been made in productivity. The choice of platforms remains rich and the opportunities to innovate almost limitless, and so it’s good to have NonStop in the picture, an integral part of HP and continuing to contribute. Even as I wish the folks at VMS Software well, I no longer harbor wishes for NonStop to follow suit, and I look forward to better-new-days ahead!

Monday, August 4, 2014

It’s the data, stupid!

I remain puzzled if not shocked to find so many members of the NonStop community simply ducking the issue. Raise your hands, crash meetings and do whatever it takes to be part of your company’s plans – Data from anywhere / everywhere is critical for the future of providing meaningful results to transaction processing!

Ever since the first book was published, I have been an ardent follower of the series of books, Dune, written by Frank Herbert. If ever there was a tale worthy of the talents of famed New Zealand director, Peter Jackson, then this is it – once he wraps up the latest Hobbit trilogy, perhaps he will turn his hand to Dune! Travelling across the galaxies? No problem, let’s just fold space! Lost a valued colleague? Not a problem either, just order up a clone! Central to the story line is spice – a very special spice that’s required to sustain the Mentats, “human computers”, as well as the Spacing Guild “navigators”.

For those not familiar with the story line of Dune, spice is the key that unlocks the universe as well as holds it captive; it’s at the very center of intergalactic intrigue and is only found on one sandy planet, Dune. Turn a page in any of the books, and there’s barely a paragraph without some reference to the planet and the treasures lying hidden beneath its sandy dunes. So it is too today, as we look at the work being done by the world’s data centers – highly valued business gems lying buried under data dunes.

If we were in the midst of a general election involving all in IT, then it would be easy to sum up the issues of the day under one general declaration – it’s the data, stupid! So much of what’s driving technology is in response to the unrelenting upward escalation in the volume of data being generated. No longer can we afford to simply throw data away. It’s scary to think of just how much useful business information fell through the cracks in former times, just as it’s equally scary to think of what we may learn if we pull into our daily operational lives even more of the data currently accumulating outside our sphere of operations.

“A number of years ago, while running extremely high volumes of airline shopping queries through NonStop, monitored by Prognosis from IR, we were literally dumping valuable log data in the bit bucket. Locked within the log data was what the traveling public was asking for, what they were shown, and what they bought with regards to air travel,” observed former Sabre IT executive (and former ITUG Chairman), Scott Healy. “This stream of data – well, actually, more like a raging torrent of data – could have been analyzed and used to create actionable, real-time business intelligence for airlines.”

Data is attracting the spotlight and dominating the stage at almost every technology event being promoted these days, and for good cause. What we see today is a greater awareness that answers lie in what we already process – connect the dots, dig deep into historical data and what others are saying, and a more complete picture can be painted. As Healy was only too keen to add, “What would have been interesting would have been to show the airline customers insights gleaned from a day of the log data, or perhaps a few days. Find out what they were interested in (vis-à-vis willing to pay for) and what data should be summarized and stored for future trending analysis.”

The fallout from recent tragic international events involving aircraft exposed just how much machine-to-machine communication was already happening. The network of sensors communicating operational information around the clock surprised many unfamiliar with the subject, and yet we have only just scratched the surface. Modern automobiles are producing enormous amounts of data, as are the buildings and factories around us. Infrastructure, too – from power grids, to the distribution and subsequent treatment of water, to the stocking of retailers’ shelves – it all involves data being generated and passed over networks that, for the most part, are public.

While storage manufacturers are happy to see this unrelenting upward escalation in the volume of data being generated, there really are limits to just how much data any one enterprise ends up retaining. Policies are already in place that limit the data retained to just a day, a week, or even a couple of months. However, when it comes to national security, medical research and even public records, arguments can be made to keep every bit of data that passes through the processing environment. After all, forensic mining of data for greater insight into trends has become an industry in its own right.
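In practice, such retention policies boil down to a per-class rule set. A minimal sketch – the data classes and windows here are invented for illustration, not taken from any particular enterprise’s policy:

```python
from datetime import datetime, timedelta

# Illustrative retention windows per data class; real policies vary widely.
RETENTION = {
    "session_log": timedelta(days=1),    # keep just a day
    "transaction": timedelta(days=90),   # a couple of months
    "audit":       None,                 # keep indefinitely (e.g. public records)
}

def should_retain(record, now):
    """Decide whether a record is still within its class's retention window."""
    window = RETENTION.get(record["class"], timedelta(days=7))  # default: a week
    return window is None or now - record["created"] <= window

now = datetime(2014, 8, 28, 12, 0)
old_session = {"class": "session_log", "created": now - timedelta(days=2)}
audit_row = {"class": "audit", "created": now - timedelta(days=3650)}
assert not should_retain(old_session, now)  # aged out after a day
assert should_retain(audit_row, now)        # audit data kept forever
```

The interesting policy decisions live in the table, not the code – which is exactly why the keep-everything arguments for security and research end up as `None` entries.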

For the NonStop community this is proving to be more than a fad. It’s vitally important to all parties that better insight is realized – whether it’s insight about the business, about the market, or even about IT operations itself. Acting insightfully is not just an innovative move but a mandatory one for most enterprises, and it’s only with access to data – all the data – that this degree of insight can be achieved. In my most recent discussions with vendors, this isn’t being overlooked; it has become the subject of initiatives aimed at better addressing this requirement.

Yes, it’s the data, stupid! So no, it’s not acceptable to dump “log data in the bit bucket”. However, it is equally unacceptable to blindly store everything without due consideration given to its value. This is something I touched on in my previous post, when I noted how, for many enterprises where NonStop systems are relied upon, the trend is towards “capture, analyze, store”, with mechanisms in place that allow pertinent data, once analyzed and stored, to be reused by processes at the time additional data is being captured.
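The “capture, analyze, store” loop, with stored results fed back into the handling of each new event, can be sketched in a few lines. This is my own toy illustration of the pattern – the customer-spend scenario and the 3x threshold are assumptions, not anything from a vendor product:

```python
from collections import defaultdict

# Running aggregates (the "store") that are fed back into the
# handling of each newly captured event.
totals = defaultdict(float)
counts = defaultdict(int)

def handle(event):
    """Capture one event, analyze it against stored history, update the store."""
    cust, amount = event["customer"], event["amount"]
    # Analyze: compare against this customer's prior behavior, if any.
    avg = totals[cust] / counts[cust] if counts[cust] else None
    flagged = avg is not None and amount > 3 * avg  # e.g. flag unusual spend
    # Store: fold the new event into the aggregates for future captures.
    totals[cust] += amount
    counts[cust] += 1
    return flagged

stream = [
    {"customer": "A", "amount": 10.0},
    {"customer": "A", "amount": 12.0},
    {"customer": "A", "amount": 95.0},  # well above A's running average
]
results = [handle(e) for e in stream]  # [False, False, True]
```

Trivial as it is, the sketch shows why timing matters: the analysis happens in the same pass as the capture, so the insight arrives while the transaction can still be acted on.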

I hadn’t yet joined Tandem Computers when the news began to percolate through the industry that Tandem was working on a permanently available implementation of SQL. Overseen by the late, great Dr. Jim Gray, NS SQL challenged many traditional beliefs about SQL implementations – its mixed-workload capabilities allow NS SQL to keep on processing even as DBAs run all-important statistics (needed to keep SQL “in tune”) or, as I reported a few years back with respect to one large NonStop user, “maintenance? Truly, we run reorgs, statistics, splits, column adds, etc. all without taking anything down. It’s the NonStop fundamentals!”

NS SQL did exploit the fundamentals but all those years ago, Gray was aware that IT dynamics were evolving and that there would be even more data to process, and that transactions themselves would be even more data intensive. Indeed, after the passing of Gray, Microsoft announced that they would create the eScience Award for the researcher “who has made an outstanding contribution to the field of data-intensive computing”.

Furthermore, according to Microsoft, “In a lecture he delivered just 17 days before he went missing, Jim outlined the increasingly important challenge and opportunity afforded by the availability of previously unimaginable volumes of data and continuous research dedicated to creating new understanding of the world around us.” In other words, even the acknowledged father of NS SQL recognized that ahead of us would be an unrelenting upward escalation in the volume of data and, with these unimaginable volumes of data, the opportunity to provide unimaginable insight into all that transpires in an enterprise.

“Augmenting a transaction and giving it access to additional information for greater insight,” WebAction EVP Sami Akbay advised in last month’s final post, I need data I can digest, in small bites, please!, before adding, “seems to be a reasonable request and one we are taking seriously.” I am referencing WebAction more often these days as I pursue the story of data, and it’s no coincidence that its executive team is made up of former GoldenGate executives. Nor is it a coincidence that they see greater integration of data from disparate sources as a logical next step following data replication and data integration. Like Gray, WebAction appreciates that to fully understand the business world around them, the importance of data cannot be overstated.

Mobility, IoT and M2M exchanges, Big Data and Clouds are all involved in data – generating data, processing data and storing data – and it should come as no surprise to anyone in the NonStop community to read of the innovative solutions under development that continue to include NonStop systems. IR, comForte, WebAction and OmniPayments are just a few of the vendors I know first-hand are involved in some aspect of data-intensive computing. We can take our eyes off numerous initiatives and not be hurt too badly, but if we become blasé about the importance of data, we run the risk of being blind-sided by our competitors.

Yes, it’s most definitely all about the data, stupid – so pay attention, as there’s plenty of data suggesting this is a development not to be ignored! The picture used (above) wasn’t taken at some desolate spot on the earth but rather nearby; tucked under the shadow of the Rocky Mountains, about 200 miles from Boulder, and a must-see for even the most casual tourist. Who knows what treasure lies beneath those sandy dunes? And likewise, there are gems aplenty buried within the monstrous data dunes rising from the floors of today’s data centers!