Monday, October 20, 2014

The value of partnerships – NonStop VAR program?

As the price of gas at the pump continues to drop in the US, it’s hard to miss the flow-on effect to the community. Could we be getting value for our money at last? When the topic is increasing the sales reach of NonStop, value is important too. Could we see the return of VARs?

Have you all seen what’s been happening with the price of gas at the pumps here in America? Is the same thing happening everywhere? In the US we have seen the price of gas drop to $2.89 per US gallon (3.8 liters), as depicted above where I just filled up the Jeep. Remarkable! Considering our family has nothing but vehicles with large displacement engines, this is proving to be a blessing for us and it feels like getting an unexpected tax break. With the US now awash in a sea of oil, thanks to the wonders of fracking, and the market laws of supply and demand kicking in, the price at the pump is likely to go down even further – the days of $2.00 per gallon gas may just be coming back. For now, $2.89 for gas? Remarkable!

Whenever I read about discussions on value and on value-add, I can’t help but wonder whatever happened to the Value Added Resellers (VARs) that so many of us depended upon in the past. My fondest memories of the early days of Tandem are of the wonderful Alliance program Tandem had assembled, and among the list of Alliance partners there were many VARs. The specialization VARs brought with them – whether it was a specific market, such as travel, or a region, such as Kenya or Colombia – made the process of selling systems much easier. Perhaps the days of VARs are over given how commoditization has brought with it a considerable easing of prices. And yet, this may not be completely true when it comes to NonStop, or so it would appear.

Listening to presentations given these past few weeks at both MATUG and CTUG, it’s very clear that with the arrival of a family of NonStop systems utilizing the Intel x86 architecture, the plan will be to pursue new markets with new partners rather than simply flipping the base one more time. It is completely understandable that HP doesn’t want to convert a $1 million business into a $100 thousand business – the most basic of business schools frown on such approaches. That isn’t to say that there won’t be some existing customers who provide fertile ground for these new x86 systems, but that will likely be because the new systems open new doors for new applications, complementing the present Itanium systems, and I expect that to be the exception. No, HP wants to add to the NonStop business, not just pursue yet another one-for-one migration.

What was also obvious from the presentations to the user groups is that HP wouldn’t be all that keen to add a couple of hundred new salesmen to its sales teams to address these new markets. Solutions sell systems today and that’s very much a given, even for the new HP we face, Hewlett-Packard Enterprise. So why not engage the solutions companies more closely and reopen the doors for VAR sales of their solutions with NonStop systems? While nothing that could be viewed as a commitment was given by any of the HP personnel involved in the events, to anyone who thought through the mechanics of adding a complementary marketing program it seemed a pretty obvious conclusion. VARs selling NonStop? Indeed, every bit as remarkable as $2.89 gas, I suspect!

In talking with NonStop partners, the reaction was positive overall. According to the CEO of OmniPayments, Inc., Yash Kapadia, “Yes we can be VARs and we have discussed being a VAR with HP many times.” But Yash was also quick to note that there would need to be a lot more discussion this time around as, in those earlier discussions, according to Yash, “At one time they wanted me to take responsibility to generate more sales than all of EMEA!!!!” Why doesn’t this surprise me? The NonStop sales team that we have working globally is a lot different from what we have had in the past and there are some level heads watching over NonStop today. If a discussion develops around the subject of VARs the way I would like to see it happen, then I think it will be a case of introductory baby-steps in the early stages.

In talking to middleware vendor IR, there was support for VARs as well, although not quite as exuberant as that of the solutions vendors, and it put the topic in a different light. In an email exchange with Americas Sales Director, Jay Horton, he began, as I expected, with how he “Wouldn’t see our large users moving away from a direct relationship with us, (certainly not in the near term anyway) but with the presence of VARs, it would open the door for IR to scale and pick up business we are not able to chase currently.” Horton then added that, “We don’t have resources with knowledge of NonStop to chase a lot of these new markets; however, an HP NonStop VAR channel would allow us to scale. It would require us to put a more dedicated NonStop channel support model in place.”

Andre Cuenin, President Americas & Europe at IR Inc., did remind me that a NonStop distribution / VAR model is in place in geographies apart from the US. For example, in EMEA and AP-J many markets are being served through local NSK distributors / VARs. “In these countries, these partners typically sell Prognosis. In the US market, most of our NonStop customers are traditionally direct customers. However, with our payments partnership with ACI, we do see a shift to more channel revenue in the US as well.”

Could a middleware vendor like IR become an HP NonStop VAR? I asked this more out of curiosity than anything else, as I anticipate that this would be of interest to solutions vendors, and Yash had already expressed that, should a VAR program be launched globally, he would be more than interested. From an IR perspective, they do not see themselves becoming a NonStop reseller; however, Cuenin mentioned that “working with HP and its respective channels to align its go-to-market model is the strategy”. Horton shared that “we have explored bundling a Prognosis Lite version with platform vendors and NonStop on x86 seems a good target for this approach.”

For some time now I have thought that comprehensive monitoring solutions were butting right up against the line that separates middleware from solutions and, with just a little more capability, monitoring solutions would be able to stand on their own merits. Should IR pursue this aggressively, and at reasonable value-add pricing, then I cannot see why they wouldn’t reconsider becoming a VAR for NonStop – possibly in conjunction with offerings from other middleware vendors. If you have ever purchased a laptop from Best Buy you will have noticed that it comes with a lot of pre-loaded software offered on a trial basis – could this become the norm for NonStop sales?

When I put this prospect to a NonStop manager just recently, the response was expectedly cautious. “Just last week, we had a customer ask about becoming a VAR, selling both hardware and their application.” However, he also acknowledged that he wasn’t sure “if one request is showing a trend, but I think the x86 does open the eyes of existing and new VARs.” And that’s about as far as the discussion went on the topic of a VAR initiative today – but I’m on the fence here and think this could go either way. Complementing classic sales with a thriving VAR program? Again, that would be remarkable.

The key word in VAR is value – a VAR program would have to provide value for the target marketplace. Customers would need to realize savings they otherwise wouldn’t expect to see – or leverage the partnership in a more competitive manner than their rivals. Companies today see value as having many shades, and what one customer views as dross another may prize highly, so it would be up to the VAR to clearly demonstrate the value proposition. While I have only talked to a very small selection of NonStop vendors, I know there are more that may be toying with the idea, so it will be interesting to watch whether a much broader program, one that includes the US, eventuates and whether the participants truly bring with them the value we all would expect.

$2.89 gas! Less expensive NonStop systems? New partners and new markets! A thriving VAR program … there’s still time for something to develop but, from where I sit and following the conversations I have had, I am hoping that this time it will be a positive outcome. Selling more NonStop systems, addressing more markets, to even more companies is what we all would like to see and, while a thriving VAR program is not an automatic answer to some of the most pressing questions about how to get more NonStop systems sold, it has merit and I truly hope for the best this time around.

Monday, October 6, 2014

Finding our way …

While driving into unfamiliar surroundings, a GPS is handy – perhaps two of them, to make navigation easier. For HP, entering new waters, splitting in two should allow the company to better navigate two completely different marketplaces!

This weekend it was a case of a quick drive along the Hudson River, the “river that defined America”.  As it is still fall and the leaves continue to change color, it was one of the drives we had always wanted to do and the color show didn’t disappoint. Final destination Sunday night was Montréal where we will be spending a little time on business before enjoying more of what Montréal has to offer. As you can see above, the old Montréal Port and grain processing plant look great in the autumn sunshine!

While the route up the Hudson was straightforward, finding our hotel in the center of town proved challenging. As we were later to find out, there were several streets with the same name, and our first attempt to locate the hotel via our vehicle’s GPS led us to an industrial site some 15 miles from the city center. Being typical NonStop aficionados, we pulled out our backup tablet and used its GPS and, once we shut down the stream of instructions that continued to emanate from the first GPS, we threaded our way through city construction zones and found our way to the hotel.

This week, it will be CTUG that is very much on my mind. It was only a week ago that we attended MATUG, held in a hotel just outside Philadelphia, a circumstance I commented on briefly in the last post. NonStop has always been defined by its community and so there was a strong showing by HP and the vendor community with a sprinkling of customers thrown in for good measure. NonStop has always enjoyed energetic supporters, of course, but looking ahead to CTUG, I am hoping for a little larger turn-out of users.

The good news for those tracking HP’s development of NonStop support for the Intel x86 architecture is that it all looks to be going well, with HP looking for even greater community participation in the Beta program. While I can only speak on behalf of a handful of vendors who have tested their products on x86, there have only been a few minor hiccups to report, all of which were quickly addressed by NonStop development. If I were to speculate, I suspect there will be a couple of users nearing production rollout of their applications on x86 in the new year – very impressive indeed!

However, even as I was thinking about this post over the weekend, there was very little news coming from the financial and business press to highlight what has so quickly transpired over the past 48 hours or so. Yes, we are going to have two HPs now, not one. The much heralded split between consumer and enterprise customers is going to happen and we will be seeing the formation of two independently operating companies. Certainly it is going to liven up the agenda for HP Discover (Europe) that will take place in Barcelona, December 2-4, and I sure wish that I could find a way to participate.

I have kicked off a discussion on the LinkedIn group, Fools for NonStop, and if you are a member of this group you will see that comments have already started to appear. It is true that splitting the company was rumored as being on the agenda of the former CEO, Leo Apotheker, but as soon as Meg Whitman took over, she went to great pains to downplay such a move by HP. However, as this quarter’s results were published and as Whitman responded to the financial community, no matter how good the latest news appeared, HP stock took another hit.

According to a post to the site Seeking Alpha, one analyst wrote of how, “The company is going to be splitting into two divisions, putting more action into practice as it continues its restructuring and comeback under CEO Whitman.” Furthermore, said the analyst, “After the street didn't seem to like the strides the company had made last quarter, it eventually came to its senses. HPQ has dipped over the last month, however, shedding nearly 10% of its value.” It’s a tough crowd for sure but, in the end, few other options remained for Whitman and HP after focusing on profitability and operational costs. The company is “huge”, as I noted in the LinkedIn discussion, and generally, Wall Street prefers tightly focused companies to the sprawling portfolio conglomerates among which HP was now counted.

The specifics? As the news officially broke this morning, it was not surprising to read in the official HP Press Release that “HP has announced its plans to separate into two new publicly traded companies: one comprising of HP's enterprise technology infrastructure, software and services businesses, to be called Hewlett-Packard Enterprise, and HP Inc., which will comprise of personal systems and printer ops; Meg Whitman will become President and CEO of Hewlett-Packard Enterprise; Dion Weisler will become President and CEO of HP Inc.; Immediately following the transaction, which is expected to be completed by the end of fiscal 2015, HP shareholders will own shares of both Hewlett-Packard Enterprise and HP Inc.”

I have always taken the claimed synergy between a consumer PC company and a business, enterprise-system company with a liberal number of grains of salt, but I was prepared to let that slip on by for the sake of better storytelling. There were so many ways to write about the value a company as big as HP provided, but it was getting harder and harder to keep a straight face. Yes, the addition of support for x86 brought some synergy, but anyone who has looked at the Intel roadmap for Xeon knows all too well that there’s a swag of different Xeon chips covered by the roadmap and, to the naked eye, it was easy to get lost among the numerous pages that comprise it – different charts for consumers, small business and the enterprise.

Appointing Weisler as the new President and CEO of HP Inc. looks promising, as I’m always pleased to read of any Australian doing well in corporate America, but he will certainly face a challenging job – Lenovo overtook HP just recently and, if Australians are one thing, it’s that they are furiously competitive in all that they do – apart from our Kiwi brethren across the Tasman, I know of no other nation as competitively minded as Australia, including America. But there’s a whole lot more here than just freeing up the PC business to compete, as there is a real need to sew together a competitive mobile offering and, even though I thought the Palm deal was a solid beginning, now I am not so sure what HP Inc. will be doing to develop a presence in this part of the market.

Across the aisle, Hewlett-Packard Enterprise lacks a couple of items – a unifying database technology (IBM, Microsoft, and Oracle all have strong database plays), a unique OS (apart from NonStop, of course), and a virtual platform. And this raises the question: should EMC spin off VMware, should HP buy VMware? With all that I am watching in the telco space I think a very good case could be made for such an eventuality, particularly as talks to acquire all of EMC apparently broke down a short time ago. But again, even if such a deal transpired, what of NonStop?

For the last couple of years – in my Wishes for NonStop posts to this blog, as well as within my more recent user event presentations – I have openly discussed the morphing of NonStop into something quite different from what we see today. While I have stressed that, should you check all the right boxes, you will still be able to order a NonStop system, more than likely the future will include NonStop as a set of features, services and libraries atop a next generation Linux, along the lines of what HP CTO, Martin Fink, has been referencing of late as part of his vision for The Machine. I have no real evidence in support of this other than my own observations, so the next couple of years may prove very interesting for NonStop development.

HP does need a course correction. It does need to find its way again and perhaps a complete turnaround is required. In the HP Press Release, Whitman did say, “The decision to separate into two market-leading companies underscores our commitment to the turnaround plan. It will provide each new company with the independence, focus, financial resources, and flexibility they need to adapt quickly to market and customer dynamics, while generating long-term value for shareholders. In short, by transitioning now from one HP to two new companies, created out of our successful turnaround efforts, we will be in an even better position to compete in the market, support our customers and partners, and deliver maximum value to our shareholders.”

CTUG now beckons and I am sure that this latest news from HP will generate numerous discussions. All we need to do is figure out how to depart Montréal for the drive south, along the St Lawrence Seaway. I’m pretty sure I will be able to find my way to Mississauga with the help of my GPS aids, but nothing is ever certain. Like HP, hoping for renewed market interest with a turnaround that involves doing the splits, I too am hoping I can make it without any further course corrections or U-turns. Looking forward to seeing many of you shortly, at CTUG, and I am certain that with this latest announcement from HP, there will be many discussions to follow!

Monday, September 22, 2014

Those Special Occasions!

I’m posting this on the road as I drive to MATUG in Philadelphia, PA. However, before we packed the bags and began the drive, we had an opportunity to celebrate with friends and when it comes to NonStop, I was reminded of earlier posts and yes, I do get it!

We had good friends, Brian and Jan, over for the weekend; a mix of business and good meals with fine wine. In this case the friend, Brian, is someone with whom I can share my business plans as well as my interest in cars – yes, when asked where he was by a former neighbor, Brian simply responded “I’m with the Buckles, driving Corvettes and drinking good red wine”. As it turned out, he was here for his birthday as well, and a special occasion like this called for something equally special from the wine cellar.

As I pulled the cork from a 1993 Silver Oak – a great California Cabernet Sauvignon – I have to admit, I was apprehensive. I was really careful removing the cork as it had dried, and once I had it out of the bottle I decanted the wine, just to be sure. Left to breathe for an hour, perhaps longer, it proved to be everything we were hoping it would be – not a single rough edge on this drop of red as we tentatively took a first sip. It had been suggested that the wine could be kept 15+ years, but I was really surprised as we had laid it down for 20+ years and it tasted a lot better than when we first opened a bottle, back in the mid-1990s.

I couldn’t help noticing that in a post of nearly two years ago, back on December 26, 2012, I had featured another good bottle of wine that I had opened. The occasion that time had been the Christmas holidays, and again it had been a good bottle of Cabernet Sauvignon – a 1994 South Australian bottle from Wynns.

The post, Yes, I get it!, in addition to the wine review, featured comments from a race car driver who had observed how the team had tried to “Control costs, restrict options, standardize on certain parts, increase production, amortize investment over a wider base, increase value for money” and at the time, I suggested we could say much the same thing about NonStop.

Continued investment in NonStop is a very delicate balancing act – there’s a need for new features even as there’s a need to provide an integrated hardware and software product for a fair and reasonable price. The viability of NonStop is still very much tied to it being cost effective when viewed against the competition, even if we all agree that NonStop has very little by way of competition. It was analysts at IDC who came up with an “availability spectrum”, categorizing all systems as either Availability Level 1, 2, 3, or 4, with AL1 referring to systems not shipped as highly available and AL4 referring to fault-tolerant servers.

IDC describes AL4 as being “the highest-availability level, connoting that the end-user experiences no perceived interruption based on the use of fault tolerant servers. In this level, the combination of multiple hardware and software components allows a near-instantaneous failover to alternate hardware / software resources so that business processing continues as before without interruption.” Without delving into the finer points of takeover versus failover, I think readers will appreciate the simplicity of this definition, especially as it comes at the “problem” from an end-user experience perspective.
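
Since takeover is what sets AL4 apart, a minimal sketch may help readers picture the idea. This is plain Python and a toy only – it bears no resemblance to the actual NonStop process-pair or TS/MP interfaces – but it shows why a primary that checkpoints its state to a backup after every unit of work lets that backup carry on, with no work lost, the moment the primary disappears:

class BackupProcess:
    def __init__(self):
        self.checkpointed_state = {}            # last state received from the primary

    def receive_checkpoint(self, state):
        self.checkpointed_state = dict(state)   # keep our own copy of the primary's state

    def take_over(self):
        # Promote the backup: carry on from the last checkpoint, with a fresh backup of its own.
        survivor = PrimaryProcess(backup=BackupProcess())
        survivor.state = dict(self.checkpointed_state)
        return survivor


class PrimaryProcess:
    def __init__(self, backup):
        self.state = {}                         # running totals per account, say
        self.backup = backup

    def process(self, account, amount):
        self.state[account] = self.state.get(account, 0) + amount
        self.backup.receive_checkpoint(self.state)   # checkpoint before replying to the client
        return self.state[account]


if __name__ == "__main__":
    backup = BackupProcess()
    primary = PrimaryProcess(backup)
    primary.process("acct-1", 100)
    primary.process("acct-1", 50)

    # Simulate the primary failing mid-stream: the backup takes over from the last checkpoint.
    survivor = backup.take_over()
    print(survivor.process("acct-1", 25))       # prints 175 – no completed work was lost

The real thing, of course, runs the pair on separate processors, checkpoints across the interconnect and hides all of this beneath the operating system and Pathway – which is exactly why the end user perceives no interruption.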

Control costs! Standardize on certain parts! Increase value for money! It’s hard to argue against the motivation or drive behind these goals. For readers who have regularly attended user events anywhere in the world where NonStop product management has been present, these goals should be well-known by now. I can’t recall a presentation by a manager or executive who hasn’t pointed to a roadmap slide and hammered home the continuing pursuit by all within the HP NonStop team to increase the value for money.

The argument in favor of going with MIPS was all based around the unsustainable business model of continuing with custom chips. The subsequent move to Itanium followed a similar line of reasoning, this time the thought being that Intel had far deeper R&D pockets than MIPS (then a part of SGI between 1992 and 1998). However, even as Itanium continues to underpin modern NonStop servers, increasing value for money over time meant that an even more popular chip would be required – something that was not just a standard but deployed broadly enough so as to “amortize investment over a wider base”.

The move to support the Intel x86 architecture certainly is a positive move in this direction and, with a portfolio of products all utilizing x86, economies of scale will certainly be present. But there’s more to this story and, as I wrote two years ago (long before NonStop announced plans to support x86), I do get it!

It’s not that NonStop development is hedging on either performance goals or pricing, as I am sure we will hear a lot more about all of this at the NonStop Bootcamp in November, but I have to believe the x86 will result in performance improvement for some solutions, if not all, even as I have to believe the price will be more attractive. When you look at Intel’s roadmap for the x86 you quickly realize that the x86 is a substantial family of chips – Xeon comes in many flavors, if you like, with some Xeon chips focused on client-side processing whereas others are focused on the server side.

Good performance, value pricing and, of course, a whole new fabric interconnect that holds the promise of even faster processing speeds, combined with the possibility of building hybrid computers made up of different systems (from an OS perspective) – and again, all based on standards. Certain parts just have to be standardized these days – the chips, the memory, the interconnect fabric – and NonStop is certainly delivering on this promise. As middleware and solutions vendors come to terms with the new technology, the prospect for even greater optimization is apparent and, even though there may be changes coming, I get that, too!

When x86 rolls out, there’s no doubting the special occasion it will represent. In some quarters, it will warrant a bottle or two of champagne being opened. And from my perspective, a well-deserved celebration, for sure! However, when it comes to controlling costs, this is where the NonStop users will share an equal burden. There’s much that they can do to control costs and much of that has to do with the cost of human resources – in achieving AL4, NonStop development has provided an integrated hardware and software “stack” and the higher up the stack NonStop users elect to go, the more they can control the costs.

This too includes those vendors who are now in the midst of porting new applications to NonStop. If existing NonStop users or new users to NonStop elect to skip the middleware available today, they make leveraging the benefits of an AL4 system difficult. It may be viewed as clever, or even ingenious, to dump a new product on NonStop with minimal touch points to the stack NonStop development provides, but I am sure five or ten years out, with the development lead no longer present, it will look little different to instances of the same solution on other systems. If you are going to run it on NonStop then take advantage of NonStop top-to-bottom, otherwise little by way of controlling costs will materialize. System savings will pale in comparison to the human costs involved.

Increase value for money! Adding support for the Intel x86 is going to be a big help in this department. Standardize on certain parts! Embracing an interconnect technology, as NonStop development proposed via InfiniBand (IB), with manufacturing producing a universal blade as a result, is also going to be much appreciated. Control costs! Leveraging NonStop development expertise, fully utilizing the integrated stack and letting the OS ride the changes is the cue for NonStop users to do the smart thing – it’s almost as if we are learning to love Pathway all over again!

The bottom line here, of course, is that after four decades (or thereabouts) we are still talking about NonStop and still thinking up new ways to exploit the best implementation of AL4 available today. Sure, IBM mainframes now have what IBM is calling Geographically Dispersed Parallel Sysplex (GDPS), an extension of Parallel Sysplex intended to support mainframes located, potentially, in different cities. But at what cost? And in the end, with what added value?

Complexity abounds with this model of IBM’s and yet it’s so simple for NonStop users. Can we execute a takeover where systems are separated? NonStop users familiar with the recent additions to Pathway (TS/MP) 2.4 and later understand how easy it is to deploy NonStop in this manner, with far less complexity and at a much lower price point – TCO remains important, as does the number of nines, and even though IDC gives the IBM System z mainframe running in Parallel Sysplex mode an equivalent AL4 status to NonStop, it’s at such a high price point that no, I just don’t get it!

I still have some good wine left in the cellar and I am sure that my friends, Brian and Jan, will find a way back for more. There will be more pictures taken of empty bottles, too, to be featured in stories to come – I am certain of that. However, when the new NonStop systems begin to ship I too will make sure there’s a little champagne on the side that I will be only too happy to open, to toast the NonStop development team for a job that is sure to be well done. NonStop on x86? NonStop on IB? NonStop in hybrids and even Converged Systems? Well, what else can I say but that I get it, too!

Tuesday, September 16, 2014

For NonStop users, moving data isn’t distracting!

Moving files composed of punched cards or, indeed, magnetic tape was so 20th century and yet, when it comes to even the most advanced transaction processing systems, files still need to be moved and often this is the specialty of a select few boutique businesses.


Out on America’s interstate highways you see transportation evolving on an almost continuous basis. With the regular trips we make to the west coast, whether it is to HP in Palo Alto or my other clients in the greater Los Angeles area, we have seen extended sleeper compartments behind the cabs of big rigs simply getting bigger and we have seen much greater use of aerodynamics on nearly every truck we pass. However, on these trips we have also seen the venerable dump, or tip, truck lose ground to massive side dump trucks – often hauled in tandem behind a single tractor or prime mover. With highway maintenance resulting in huge tracts of roadway being torn up, it’s no surprise to see these monsters at work. 

Being forced to stop and stand idle on the side of the highway as they go about their work is nothing more than an untimely distraction and one we dread facing each trip out west. The key advantages of the side dump, however, are that it allows rapid unloading and that it is almost immune to being upset (tipping over) while dumping, unlike the traditional end dump trucks. More obvious, even to the untrained eye, is that they can simply transport a bigger load than the old-style, end dump vehicles. Furthermore, when they do transfer their contents, it happens a lot faster as the sides, naturally enough, are a lot wider than the ends. I know, I have had plenty of time to watch them.

Whenever we talk about such topics – big loads, greater weight, faster transfer – it’s as if the conversation has taken off in a different direction. For data center managers everywhere, it’s all about the data: moving data, storing data and, let’s not forget, running analytics against the data. No longer is it a case of simply picking up a tray of punched cards and upending them into the card reader, as was once the case (and a task that was taken away from me at one point in my career); across every channel connected to a computer, voluminous amounts of data flow.

As I watch the beginning of the Internet of Things (IoT) era, where almost every mechanical device known to man will be connected to the internet (along with every conceivable contraption being turned into an effective measuring instrument), the prospect of even more data needing to be moved is inescapable. Unfortunately, among the NonStop community, such movement of data has been associated with batch processing, something NonStop applications treat with disdain. And yet, much of the data NonStop transactions use, either as raw material or as finished product, is of value to other parts of the business. And it has to be moved.

As someone who is passionate about cars, a special case of IoT that caught my eye was the recent news that American legislators were catching on to the potential of Vehicle to Vehicle (V2V) communication. Whether it’s for vehicle manufacturers – Volvo promising that by 2020 it will provide crash-proof cars (a step up from an earlier initiative to provide injury-proof cars by the same year) – for insurance companies or for law enforcement, cars that communicate with each other represent a whole new specter of monitoring even as they open yet another chapter on data movement.

Streams of data will be shared among those with vested interests, leading to a whole raft of new applications. In a recent post to realtime.ir.com, For NonStop there’s no downside to monitoring unidentified moving objects, I connected the dots between V2V, the movement of data and the analytics that will be produced, and it wasn’t hard to see the impact that would be made. The real holy grail of business insight, I suggested in that post to the IR blog, has always been determining behavior so as to better focus critical business resources on the closing of sales opportunities. In other words, interrupt the flow of data and view incomplete scenarios and the insight derived will be less than meaningful, even if the applications are brand new!

When you think about it, nowhere would this behavior determination add more value than when applied to driving a car. Furthermore, if we don’t elect to ban driverless cars outright then V2V is inevitable – younger generations of drivers have become too distracted these days. How serendipitous that, after all these years, data center consoles and dashboards may shortly be a collage of real dashboards, and the real time monitoring familiar to every data center operator will reflect more closely a world that gave rise to much of the jargon that’s used within the data center – system crashes, scratch-files, and (data) collisions included!

IoT, V2V and even M2M, which has been with us way longer than many of us care to acknowledge, are responsible not only for new opportunities for vendors like IR with Prognosis, but also for the greater movement of data we see today. But moving data has been going on for years, NonStop systems included, despite any apparent disdain for the process. All too often when we discuss solutions running on NonStop and quickly delve into the middleware deployed, our attention gravitates to the transaction processing components. However, for these solutions to participate in the world at large, assumptions are made and empty “boxes” included in flowcharts (to be filled in later) that convey a rather false sense of “she’ll be right, mate!” Files will be moved somehow and we will get there, on the day. Yet keeping that data moving is every bit as important as any consumption of data from any client device.

Contrary to what we may have read in a recent post by Mark Hurd, this is not simply a case of making sure you get everything from just one vendor, Oracle preferably. If you missed my opinion on this subject, check out my most recent post to the blog at WebAction, Ain’t no bugs on me … and yes, I have been waiting a long time to get this jingle into a business blog post. The NonStop team has recognized that they alone will not be able to provide everything the user community may require, and this has led to there being a very strong vendor community well-versed in what users require – and when it comes to moving files, this is especially the case.

I covered this recently in a private communication to a major client where I referenced perhaps one of the least talked about products on NonStop – DataExpress. DataExpress has been in the business of moving files for several decades and it has done an effective job for some of the biggest Financial Institutions (FIs) on the planet. For a number of them, simply having deployed DataExpress is a market differentiator all by itself. As Michelle Marost, President of DataExpress, sees things, “Our clients know that moving data securely and efficiently is critical to their business, and have trusted DataExpress to manage the process for them.”

In case you think the realm of data movement is something akin to upending boxes of punched cards into a reader, think again. Big Data – well, there’s a lot of data that has to be moved to maximize the effectiveness of Big Data frameworks and the analytics they feed. Clouds – well, more than anything else, there’s lots of data that needs to be securely moved in and out of cloud resources. According to DataExpress’s Marost, “Anyone can move bits and bytes between business units, customers and machines, but have you asked yourself if your business, your relationships and your reputation could survive intact should the integrity of that data be compromised?”
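
To make Marost’s point a little more concrete, here is a minimal sketch – plain Python, and in no way a description of how DataExpress itself works – of the kind of end-to-end integrity check any managed file transfer has to perform: take a digest of the file before it leaves, take it again on arrival, and refuse to accept the transfer if the two differ. The file names used in the demo are hypothetical.

import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(64 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def transfer_with_integrity_check(source: Path, destination: Path) -> str:
    """Copy a file and verify the copy matches the original, byte for byte."""
    expected = sha256_of(source)           # digest taken before the file moves
    shutil.copy2(source, destination)      # stand-in for the actual transfer step
    actual = sha256_of(destination)        # digest taken again after arrival
    if actual != expected:
        raise RuntimeError(f"Integrity check failed for {destination}")
    return actual

if __name__ == "__main__":
    src = Path("settlement_batch.dat")
    src.write_bytes(b"example payload")    # create a small demo file
    print(transfer_with_integrity_check(src, Path("settlement_batch.received.dat")))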

Whether it’s open cut mines, excavating a site for a new high-rise building, or simply freeway construction, moving the dirt is a costly exercise, so the quicker it can be moved the better. Not only is it costly in dollar terms, but also in time, where smaller loads push out completion dates. Much the same can be said about data, of course. However, there’s a lot more to the story – moving data is not just an adjunct to the main process; for some applications, it is the main process. Miss getting all the data to a government agency on time, and penalties will follow.

Finally, Marost reflected, “We see more interest in secure file transfer, not less, and we have a growing pipeline of features that will prove even more valuable for our HP customers down the road!” Clouds? Big Data? Social Media? Email? Yes, this growing pipeline of features from DataExpress embraces them all and in the end, will likely mitigate the disdain many of us may feel about the task of moving data. However, in the highly charged world of always-on, it’s imperative for all parties to have access to timely and complete data and this will always involve moving data. Keeping those side dump trucks moving may not be such a distraction after all!    

Monday, September 8, 2014

Yet another anniversary … yet another post! And NonStop holds firm …

I am still at it and the posts keep on coming. No surprises here but the readership continues to grow and fair enough, the message on NonStop is attracting an even bigger audience. Yes, it’s all happening and shortly, user events will dominate NonStop community agendas worldwide!

I had to remind myself not to forget writing this post; after all, remembering anniversaries remains an important consideration in all we do. Shortly, I will be returning to the race track for one more time before summer ends and there’s no escaping that, with the coming of autumn, winter is only a few weeks away. Already fall colors are in evidence everywhere; the neighborhood pools are being closed, gardens are undergoing their annual clean-up and trim, and inside garages battery tenders are being readied for use. 

This year I will skip a lengthy introduction and just come out with it – seven years of posting to this NonStop community blog are now behind me as of last month and now it’s time to look ahead to a new year of posts. Based on feedback from you, I now post three, occasionally four, posts a month, with each one more or less a feature article. I learnt very early on that posting almost daily wasn’t something many of you had time to read, even if the posts were only 800 words or so and there was nothing technical involving the merits of some obscure programming trick or a feature of a programming language.

If you have missed earlier anniversary posts, I have now set up a label – Anniversary Post – and by following this link you will find all previous posts. Furthermore, and just as a reminder, there’s a label set up that takes you to all previous posts on my wishes for NonStop – posts I write every three years – and if you are interested in knowing how these wishes have evolved over time and have missed a couple of them, take a look at this label – Wishes. However, what I welcome most are the comments that are posted, along with the discussions in many LinkedIn groups; I continue to encourage readers to look at the many comments posted to the LinkedIn group of the same name, Real Time View.

Of course, there are folks within HP too who continue to encourage an ongoing presence in social media – it’s an inexpensive way to communicate passion for the product and to engage more directly with the community. Independent blogs with an arm’s-length relationship to a primary vendor are among the best read blogs and in many ways have relegated old-style newsletters from even the most respected thought-leaders to historical footnotes. The immediacy of posts is appreciated by all members of a community and this is understood by many within HP. “I see tremendous value from independent bloggers providing commentary on HP and NonStop,” said Gary Allen, Senior Manager, HPS Marketing Programs. “Social networking is of huge value and doing so independently of HP, especially valuable as readers of your blog always anticipate a perspective that reflects your history and experience.”

Building a community around NonStop requires many things to happen and, in the past, this mostly involved user gatherings. ITUG was once all that the community talked about – indeed, when I first joined Tandem Computers it was the very existence of ITUG that helped me decide to join. Working at the time on the east coast, colleagues returning from an ITUG event in New Orleans couldn’t stay quiet about all that had happened there. However, social media has pretty much filled the role once played by the big-tent, user-run events.

It’s not as if we no longer like to network, but the reality is that few of us have budgets that cover the cost of an annual pilgrimage to San Jose. In all likelihood, few companies running NonStop today even have enough staff on hand to allow a few to disappear for a week. In talking to HP, at one point the conversation turned to the matter of there no longer being a “bench” of technical staff trained and experienced in NonStop to throw at new projects – the tiers made up of senior managers, technicians and junior staff have evaporated, leaving data centers staffed by just a few system administrators casting an occasional glance at a console display. Yet surely we can afford the time to get together for the shorter, regional gatherings, and surely we can strive to find a way to get to the San Jose Bootcamp! I hope to see many of you at those events through the year!

Recently, in a discussion on LinkedIn, someone asked the NonStop community whether “Tandem is any longer a renowned server” and “who is going to appreciate it”? Furthermore, from the same individual, “for Java developers working in Tandem it’s a hard task as they can’t bring any new things to it!” Now, I am not clear as to the maturity of this individual or just how experienced he is with working on the latest NonStop systems, but I am sure there will be others within the NonStop community who will step in here and provide additional insight – bottom line, if it’s the latest iteration of the NonStop stack, it’s not that hard to port Java applications these days. “Java rocks!” is still the catch-cry of one well-respected NonStop architect in the user community.

However, the question aside, the more important consideration here is that the issue was even raised in the first place. Social media may not be everyone’s cup of tea but, for those within the NonStop community scattered as they are to the four corners of the planet, social media is doing a fine job of providing us all with a sense of community. If you aren’t all that certain, just take a short time out and check how many active discussion groups and chat rooms there are – from LinkedIn to Yahoo and Google – all devoted to helping out NonStop developers whenever they experience difficulties.

How many within the NonStop community would have thought there would be separate LinkedIn discussion groups for them involving topics like Clouds and Big Data – if you missed it, this was the central theme of last week’s post, A time to put the hammer down! NonStop accelerating adoption of Clouds and Big Data … If, as yet, you haven’t read that post, it’s well worth the time spent. Again, just the mere presence of such groups sends out an important message – there’s plenty worth discussing on topics like these and there’s more than just an individual or two looking at ways to leverage such key transforming technologies. For me, the presence of as many LinkedIn groups as there are that include NonStop in their group name is more than encouraging – and the NonStop community has to be pleased with the evolution of some of these groups.

This is the start of my eighth year of blogging and, as I look back, there have been a couple of common themes. No matter how you look at them, these themes center on why we aren’t doing more to promote NonStop, why there aren’t more solutions on NonStop and, in a related fashion, why the rest of the industry isn’t as proud of NonStop as we all are – the IDC and Gartner analysts, the InfoWorld and CIO publications. After four decades, why aren’t informed CIOs more appreciative of the fault tolerant technology inherent in the integrated hardware / software stack that is today the modern NonStop system?

In part, it still is up to all of us to become part of the process – yes we would like to see more promotional material from HP but in the end, we have a thriving community involving many stakeholders and we have a voice. We should be using it far more aggressively – flooding every chatroom and community group we come across. Others are doing this and very successfully – so, what about us? Why should we be quiet?  NonStop is indeed a renowned server and NonStop has a global community that knows this! NonStop has a history and that’s important but having a history doesn’t imply a legacy solution – rather, it demonstrates flexibility and adaptability in a way few other systems can claim to have achieved.

Winter may be coming shortly – the signs of fall are more prevalent than just a few days ago. The temperatures along the Front Range dropped 20 degrees today and will hold at that level for the next week or so. But winter is a time of regeneration, a time to regroup if you like. For the NonStop community it’s also a period of user gatherings across the globe and yes, I will be attending several of them – from Philadelphia to Toronto to San Jose. In coming together as a group it’s time, too, to encourage and nurture – and to hear more good news about NonStop that will be the fodder for yet another year of blogging!

Thursday, August 28, 2014

A time to put the hammer down! NonStop accelerating adoption of Clouds and Big Data …

Do we really need to see the big guy wielding a mighty hammer, or are areas of focus for many in IT already being covered by smaller blows leveraging what we already have?


Boulder has been hit with some amazing thunderstorms of late, with another one passing overhead early this afternoon. Rather fortuitous in some respects as I took time to have a look – boiling clouds with amazing displays of nature’s powers as lightning flashes lit the sky. It was a big display of nature’s awesomeness. Lightning, disrupting the normality of an afternoon; clouds, the source of tremendous energy; no matter where you stood, the display was on a grand scale – big, by anyone’s measure.

At the macro level, prairie thunderstorms are majestic to watch whether you are out in the open or tucked up alongside a mountain. However, stepping away from nature, there are still amazing sights to take in that are man-made. The unrelenting drive to miniaturization blows my mind – there’s just no real way to equate what is inside a modern chip to anything we can see in the real world. When quantum physicists talk of scale, where 2 to the power of 512 is referenced in the same breath as being more than the number of atoms in the known universe, then yes, we have come a long way from the system my father showed me in the mid-1960s that had 8K bits of memory, yet still could store a program.
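
For what it’s worth, the arithmetic behind that comparison holds up: 2^512 = 10^(512 × log10 2) ≈ 1.3 × 10^154, a figure that comfortably dwarfs the commonly cited estimate of roughly 10^80 atoms in the observable universe.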

All of this is leading to the topic of the week and the two subjects are increasingly becoming related – Clouds and Big Data. A short time ago I started a couple of LinkedIn groups aimed at the NonStop community: Clouds, powered by NonStop, which now has 271 members and is only a couple of weeks old, and Big Data, integrated with NonStop, which has fewer than 40 members but is newer still. Including NonStop in the names of both groups was intentional, as I wanted to make sure any new member of the NonStop community who searched for groups focused on NonStop would see that there is interest within the community in both Clouds and Big Data. If this is all new to you and, as yet, you haven’t joined, then perhaps you should take a few minutes and do so.

While the community is concerned about the growing number of LinkedIn groups out there, it’s highly intentional on my part to project an image of NonStop that may surprise folks – of late I have become selective about the ones I do end up joining, and the metric most important to me is the vitality of the group – just how much is going on. If you have seen some of the discussions and the passions they arouse on groups like Tandem User Group and Real Time View, you will know what I mean. Clouds and Big Data are very important for all stakeholders in the NonStop community. Many businesses have embarked on modernization initiatives, so I have to ask you – do your modernization initiatives actively involve Clouds and Big Data?

As a place to start, did you know that vendors well known to the NonStop community have begun taking steps along the path to clouds? “The model today is that cloud computing is inevitable, even among the financial institutions I count as customers,” said OmniPayments, Inc. CEO, Yash Kapadia. “It’s inevitable because it represents a more sensible approach to providing the best value for the dollars spent. Too often we size systems and include software with expectations of the volume and type of transactions a customer will face but this can lead to erring on the conservative side with customers ending up paying more. And this is not what we want to do at OmniPayments. Elasticity of provisioning as well as providing capacity on demand is just the latest way to express a need for flexibility and with cloud computing, whether the cloud is on site and private, or is on our premises and managed, it is simply a way to better leverage commodity hardware and open systems.”

The productizing by Infrasoft, as maRunga, of the demo that has been showcased at the last couple of HP Discover events is just one more proof point. For those embarked on modernization projects where the volume of searches is high, as their clients spend more time looking up items, maRunga is an easy first step to take on the path to clouds. And solutions for NonStop that include support of clouds are already being deployed. “OmniPayments has already been sold as a cloud offering in the Americas and is in an early stage of customer acceptance. In this first instance, there’s both a private cloud on site as well as a managed cloud on our premises for back-up purposes. And yes, NonStop systems are an important part of both cloud computing configurations,” added Yash. “Are we taking liberties with the concept of cloud computing? I don’t believe we are, as the cloud we externalize to the customer is a collection of processors the specifics of which aren’t visible to the customer.”

The arrival of big data comes at a crucial time for the industry. In an always-connected, online world it’s imperative that we have the right information at our fingertips to sway our clients to more favorably consider what we offer. So often we emphasize two of the Vs associated with big data – volume and velocity – but timing is of the utmost importance when integrating the world of big data with transaction processing, NonStop’s forte. “From our perspective, we should be talking about ‘time value of information’,” said Managing Consultant, Enterprise Solutions & Architecture at Hewlett-Packard, T.C. Janes. “Time is the enemy of data.”

“Big Data is driving much of the agenda of IT of late – business is fully aware that they need better insight into their business and that only comes when meaningful information can be extracted from the reams of data accumulating all around them,” said WebAction, Inc. Cofounder Sami Akbay. “However, the need isn’t something that can be addressed without understanding that relevant and applicable data, meeting business criteria, needs to be captured and repackaged for consumption by business logic central to the running of the business. When NonStop is taken into consideration this is particularly important as it’s often NonStop applications interacting directly with customers – the very community about which better insight is critical.”

I am particularly interested in the work that WebAction has done to date; in delivering the WebAction product, those in the NonStop community embarking on a modernization revamp have a very viable solution on hand, and one that absolutely understands the importance of time. In the August 20, 2014, post Value Networking? Nonstop Community Certainly Does! to the WebAction blog, I wrote of how WebAction, by the very nature of the control it ultimately gives to system architects, can generate web actions as verbose or as targeted as we need – it’s really the ultimate control gate when it comes to turning on the flow of big data to time-sensitive mission critical applications. A really big hammer? Yes! A very small hammer? Yes, again!

“Certain types of data have enormous value at its moment of creation but may have less value an hour from now and perhaps no value tomorrow. Stock traders have understood this from time immemorial – if you can identify and act on a transaction opportunity faster than anyone else, you will benefit the most,” added HP’s Janes. “If you can synchronize data arrival velocity with business process velocity, an organization can sell more product, deliver better customer service and capitalize on new business opportunities sooner than their competitors.  Inversely, an organization does not necessarily want to act on every potential black swan “event” if their evaluation of aggregate data yields a contradictory model.”

There is another side to clouds and big data, and that’s the likely intersection of technologies. When it comes to big data and cloud computing then according to HP Master Technologist, Justin Simonds, “I think there will be a collision, at least for a while, as organizations attempt to derive meaning out of every piece of data they can get their hands on.  Cloud is the only ‘price reasonable’ way to churn through all that stuff.” Furthermore, added Simonds, “I believe there will be a falling away as attempt after attempt yields little business insight.  I see the industry adopting specific products for specific requirements. NonStop, as mentioned, is still extremely well designed for the operational analytic, real time velocity events. Vertica still holds the top magic quadrant spot for deep / big data analytics and for all its bad press Autonomy has some very good IP in the audio/video arena.  I believe multiple solutions are the only way to effectively proceed.”

Simonds’ observations are shared by comForte’s CTO, Thomas Burg. In an interview for an upcoming post to the blog, comForte Lounge, Burg observed that when it comes to “clouds and big data, there’s almost nothing that is really new here either. Cloud computing is little more than old wine in new caskets where the only issue is one of business best-practices. Can we run this application in a cloud? Can we store this data in a cloud? Security? Bring it on! We have the technology and know-how, we are happy to talk to prospective cloud adopters. Big data is a similar challenge but somehow, many NonStop users see only a big hammer when only a small hammer is needed.” On the other hand, “We continue to include NonStop in our plans for WebAction as there’s no question, serious online processing is occurring on that platform,” added Akbay. “To ignore NonStop would imply we aren’t taking the integration of Big Data with real time processing seriously and that simply isn’t the case.”

It was Thor who “put the hammer down,” as Captain America unfortunately requested of him, in the film The Avengers. Watching the lightning storms it wasn’t hard to imagine legendary beings as the source of such a spectacle of nature. Fortunately, when it comes to IT and NonStop, it’s a little less intense but every bit as important, particularly as so many within the NonStop community are actively engaged in modernization initiatives. In time, of course, we will find broad exploitation of both technologies across the NonStop community (just as we did with Client / Server computing as well as with SOA and Web services that followed), with much of what is covered here generating little more than a yawn. And if you want to stay informed about what’s important to the community then yes, check out the new groups on LinkedIn and become a member – I look forward to all the comments you care to provide!

Wednesday, August 13, 2014

Sun, surf and sand – oh, the good-old-days!

With the sun beating down on my face and with a gentle ocean breeze stirring my remaining hair, I kept returning to thoughts of an independent NonStop company – free to pursue whatever it liked. But with the news that just broke … I’m not so sure this would be in the best interests of the NonStop community!

There’s something about the good-old-days that strikes a chord with many of us. For me, nostalgia is an emotion hiding just beneath the surface and perhaps it’s just a reflection of the many experiences I have had through the years. It takes very little to trigger a rush of memories bursting up from the depths, but a glimpse of the sun, a rolling surf and a sunny day usually does the trick! In the posts to this blog I have shared many of my memories and nearly always, glimpses of these memories were a result of something I had just read.

After two-plus decades of involvement with the NonStop community I have to admit that many of these memories are intimately tied to events that occurred while spending time with the community. The highlights, of course, were the years spent on the board of ITUG and I am very aware that those good-old-days are long gone, with little prospect of ever being repeated. Like almost everything else that gets tarnished with the label of legacy, I have come to realize that in today’s world, where everyone is connected, the annual big-tent gathering of the faithful that we once relied upon for insight into product directions and technology adoption no longer needs to exist, at least not in the same format it did for so many years.

Like many of the stakeholders participating in today’s NonStop community, we have watched the NonStop R&D group being trimmed. There’s been good reason for much of what’s been cut and the figures can be a little disconcerting. However, looking at NonStop R&D today it is clearly not comparable with what we remember back when it was Tandem Computers – the way HP is organized, tasks have been scattered throughout many groups. While it is good to know that the finances of the good ship, NonStop, have been righted and the contribution NonStop makes to HP’s bottom line is not something to quibble over, like everyone else I sure would like to see more funds allocated to NonStop.

Ah, the memories! They keep coming back even as I think back to Friday beer busts and afternoons in the Tandem pool; the printing of tee shirts for each new project; the Tandem television network with First Friday videos – now available on YouTube; the sense of shared missions; and the recognition that quarter after quarter, major enterprises were buying new systems. However, perhaps the best news of all escapes many of us. In a world of off-the-shelf commodity components, NonStop remains relevant.

When it comes to providing the highest levels of uptime, it’s still the halo product in HP’s portfolio – yes, as HP CEO, Meg Whitman, so succinctly summed up in a video at last year’s NonStop bootcamp, “Today, enterprises operate in a world where the demand for continuous application availability is growing exponentially. The need to choose the right computer for the right workload at the right economics has never been so important … we are on the path to redefine mission critical computing.” And, every bit as importantly for those attending, “Our NonStop customers truly make it matter!”

I have referenced this quote by Whitman several times this year – in posts here as well as in other blogs – and it remains as fresh in my mind as when it was first made. Choosing the right computer for the right workload at the right economics frames the discussion for NonStop now, and in the years to come. Step outside that framework, fail to meet any of the criteria referenced, and the future for NonStop wouldn’t look as solid as it does right now. Yes, we continue to kick around the impact The Machine will have as the decade comes to a close, but it’s hard to overlook that even with a brand new OS, consideration will more than likely be given to the attributes uniquely NonStop and to the enterprises that depend on NonStop.

Against this backdrop of NonStop and the memories I have of the good-old-days, it was with some disquiet that I read the news item in the August 1, 2014, edition of the Wall Street Journal. Under the heading of Deal With H-P Paves New Future for Old Software, the WSJ reported that HP had “agreed to let a small Massachusetts company – VMS Software – take over further development of OpenVMS, an operating system that originated at Digital Equipment Corp. in 1977. DEC no longer exists, but its technology has lived for years under HP’s ownership and still has passionate users.”

Furthermore, following a couple of announcements HP made last year to do with future ports of OpenVMS (to faster Intel chips) not happening, the WSJ  said the “H-P decision stunned organizations that use the software to run sensitive applications, in places like stock exchanges, manufacturing lines and chemical plants.” It then quoted VMS Software’s CEO, Duane Harris, as saying, “Everybody was in a panic,” and that users felt they “suddenly had no future.”

For quite some time there have been several stakeholders with lengthy ties to NonStop thinking that it might be a good thing to approach HP to see if there would be interest in splitting off NonStop and giving it to a company solely focused on its future. On paper, such an idea had merit for any entity with pockets deep enough to fund the needed R&D “in perpetuity”. However, nothing developed and, as I look at this story in the WSJ, I am so glad nothing did eventuate. If there is a stronger message to send any community than yes, you are on your own, I don’t know of one. While we may all harbor doubts about HP’s stewardship of NonStop, the good news is that NonStop remains an integral part of HP!

My loyalties have wavered through the years. Readers may recall my post of August 30, 2011, Stories we could tell … in which I recounted standing in the offices of John Robinson, CEO of SDI (NET/MASTER), when I received offers from both DEC and Tandem (electing, of course, to join Tandem Computers) – today it seems as though working on the fringes of HP was pre-determined! However, any thought that the future of DEC would finish up in the hands of a small software group in Massachusetts was unimaginable and yet, here we are today, watching the winding down of a once mighty player on the computer stage.

There are a couple of sayings that come to mind at this point. Memories being what they are, I am not all that sure where I first heard them, but family does come to mind. “Never set your goals too low in case you achieve them” is what I immediately thought of as I continued reading the WSJ story. For the NonStop community, thinking big is still the objective and any thoughts I may have once had about the benefits of separating NonStop from HP are long gone. The simple truth is that to maintain a global reach and to ensure the best possible use of commodity components, it takes a company the size of HP to bring the resultant products to market in a cost-effective manner.

“The right computer for the right workload at the right economics,” seems such a simple observation and yet, with HP giving away OpenVMS as it has done (and as good a technology as OpenVMS had been), it apparently no longer was the right computer for the markets it served. IBM faces much the same dilemma with its midrange computers, the strangely morphed Power Systems (including what formerly was known as the System i and before that, the eServer iSeries and, going even further back, the AS/400 that has family ties back to the System/38 that appeared around the same time as the first Tandem computer) and speculation is rife that it too will end up in the hands of others apart from IBM. NonStop continues to retain just enough “special sauce” to differentiate it at times when even the most adventurous of us have thought NonStop surely couldn’t continue – but it does!

The good-old-days are gone, and gone with them are the difficult times of programming complexities and limited connectivity. What I recall as being part of the good-old-days had little to do with technology and more to do with the social activities (loved the 1980s!), and as much as I muse about what no longer exists, there’s no escaping the leaps that have been made in productivity. The choice of platforms remains rich and the opportunities to innovate almost limitless, and so it matters greatly that NonStop remains in the picture, an integral part of HP and continuing to contribute. Even as I wish the folks at VMS Software well, I no longer harbor wishes for NonStop to follow suit and look forward to better-new-days ahead!

Monday, August 4, 2014

It’s the data, stupid!

I remain puzzled if not shocked to find so many members of the NonStop community simply ducking the issue. Raise your hands, crash meetings and do whatever it takes to be part of your company’s plans – Data from anywhere / everywhere is critical for the future of providing meaningful results to transaction processing!

Ever since the first book was published, I have been an ardent follower of the series of books, Dune, written by Frank Herbert. If ever there was a tale worthy of the talents of famed New Zealand director, Peter Jackson, then this is it – once he wraps up the latest Hobbit trilogy, perhaps he will turn his hand to Dune! Travelling across the galaxies? No problem, let’s just fold space! Lost a valued colleague? Not a problem either, just order up a clone! Central to the story line is spice – a very special spice that’s required to sustain the Mentat, “human computers”, as well as the Spacing Guild “navigators”.

For those not familiar with the story line of Dune, spice is the key that unlocks the universe as well as holds it captive; it’s at the very center of intergalactic intrigue and is only found on one sandy planet, Dune. Turn a page in any of the books, and there’s barely a paragraph without some reference to the planet and the treasures lying hidden beneath its sandy dunes. So it is too today, as we look at the work being done by the world’s data centers – highly valued business gems lying buried under data dunes.

If we were in the midst of a general election involving all in IT, then it would be easy to sum up the issues of the day under one general declaration – it’s the data, stupid! So much of what’s driving technology is in response to the unrelenting upward escalation in the volumes of data being generated. No longer can we afford to simply throw data away. It’s scary to think of just how much useful business information fell through the cracks in former times, just as it’s scary to think of what we might learn if we pulled into our daily operational lives even more of the data currently accumulating outside our sphere of operations.

“A number of years ago while running extremely high volumes of airline shopping queries through NonStop being monitored by Prognosis from IR; we were literally dumping valuable log data in the bit bucket. Locked within the log data was what the traveling public was asking for, what they were shown, and what they bought with regards to air travel,” observed former Sabre IT executive (and former ITUG Chairman), Scott Healy. “This stream of data, well actually more like a raging torrent of data, could have been analyzed and used to create actionable, real time business intelligence for airlines.” 

Data is attracting the spotlight and dominating the stage of almost every technology event being promoted these days, and for good cause. What we see today is a greater awareness that answers lie in what we already process – connect the dots, dig deep into historical data and into what others are saying, and a more complete picture can be painted. As Healy was only too keen to add, “What would have been interesting would have been to show the airline customers insights gleaned from a day of the log data, or perhaps a few days. Find out what they were interested in (vis-à-vis willing to pay for) and what data should be summarized and stored for future trending analysis.”
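To make Healy’s suggestion a little more concrete, here is a minimal sketch of the kind of daily roll-up he describes – summarizing a day of shopping-query log data into look-to-book figures per city pair. The log layout and field names (origin, destination, booked) are purely illustrative assumptions on my part, not anything Sabre or IR actually ran; the point is simply how little code stands between a torrent of log data and a first pass at actionable insight.

import csv
from collections import defaultdict

# A hypothetical roll-up of one day of airline shopping-query log records into
# look-to-book figures per origin-destination pair. The CSV layout and field
# names ("origin", "destination", "booked") are illustrative assumptions only.
def summarize_shopping_log(path):
    shops = defaultdict(int)   # shopping queries seen per city pair
    books = defaultdict(int)   # bookings seen per city pair
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            market = (row["origin"], row["destination"])
            shops[market] += 1
            if row.get("booked") == "Y":
                books[market] += 1
    # Look-to-book: how many searches it took to produce one sale (None if no sale)
    return {m: (shops[m], books[m], shops[m] / books[m] if books[m] else None)
            for m in shops}

# Example usage: list the ten most-shopped markets for the day
if __name__ == "__main__":
    summary = summarize_shopping_log("shopping_log_2014-08-01.csv")
    for market, (shopped, booked, ratio) in sorted(
            summary.items(), key=lambda kv: kv[1][0], reverse=True)[:10]:
        print(market, shopped, booked, ratio)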

A fall-out from recent tragic international events involving aircraft exposed just how much machine-to-machine communication was already happening. The network of sensors communicating operational information around the clock surprised many unfamiliar with the subject and yet, we have only just scratched the surface. Modern automobiles are producing enormous amounts of data, as are the buildings and factories around us. Infrastructure, too, from power grids to the distribution and subsequent treatment of water to the stocking of retailers’ shelves – it all involves data being generated and passed over networks that, for the most part, are public.

While storage manufacturers are happy to see this unrelenting upward escalation in the volume of data being generated, there really are limits to just how much data any one enterprise ends up retaining. Policies are already in place that limit the amount of data retained to just a day, or a week, or even a couple of months. However, when it comes to national security, medical research and even public records, arguments can be made to keep every bit of data that passes through the processing environment. After all, forensic mining of data for greater insight into trends has become an industry in its own right.

For the NonStop community this is proving to be more than a fad. It’s vitally important to all parties that better insight is realized, whether it’s insight about the business, about the market, or even about IT operations itself. Acting insightfully is not just an innovative move but a mandatory one for most enterprises, and it’s only with access to data – all the data – that this degree of insight can be achieved. In my most recent discussions with vendors this isn’t being overlooked and has become the subject of initiatives aimed at better addressing the requirement.

Yes, it’s the data, stupid! So no, it’s not acceptable to dump “valuable log data in the bit bucket”. However, it is equally unacceptable to blindly store everything without due consideration given to its value. This is something I touched on in my previous post when I noted how, for many enterprises where NonStop systems are relied upon, the trend is towards “capture, analyze, store”, with mechanisms in place that allow data already analyzed, and even stored, to be reused by processes at the very time additional data is being captured.
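For those wanting to picture the difference, here is a minimal sketch of that “capture, analyze, store” pattern – as opposed to capture, store and analyze later – under simplifying assumptions entirely my own. The class, field names and the crude average-based check are hypothetical stand-ins for whatever analytics an enterprise actually runs; the point is only the ordering: each event is scored against what has already been seen before it is written away.

from collections import defaultdict

# A hypothetical "capture, analyze, store" pipeline: every incoming event is
# analyzed against summaries built from data already captured, and only then
# written to the store. Field names and the average-based check are stand-ins
# for whatever analytics an enterprise actually runs.
class CaptureAnalyzeStore:
    def __init__(self):
        self.totals = defaultdict(float)   # running spend per account
        self.counts = defaultdict(int)     # running event count per account
        self.archive = []                  # stand-in for the persistent store

    def analyze(self, event):
        # Reuse what has already been analyzed and stored: compare this event
        # against the running average for the same account.
        account, amount = event["account"], event["amount"]
        count = self.counts[account]
        average = self.totals[account] / count if count else amount
        event["flagged"] = count >= 10 and amount > 3 * average

    def capture(self, event):
        self.analyze(event)                       # analyze while capturing
        self.totals[event["account"]] += event["amount"]
        self.counts[event["account"]] += 1
        self.archive.append(event)                # store last, insight attached
        return event

pipeline = CaptureAnalyzeStore()
print(pipeline.capture({"account": "12345", "amount": 49.95}))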

I hadn’t yet joined Tandem Computers when the news began to percolate through the industry that Tandem was working on a permanently available implementation of SQL. Overseen by the late, great, Dr. Jim Gray, NS SQL challenged many traditional beliefs about SQL implementations – its mixed workload capabilities allow NS SQL to keep on processing even as DBAs run all-important statistics (needed to keep SQL “in tune”) or perform routine maintenance. As I reported a few years back with respect to one large NonStop user, “maintenance? Truly, we run reorgs, statistics, splits, column adds, etc. all without taking anything down. It’s the NonStop fundamentals!”

NS SQL did exploit the fundamentals but all those years ago, Gray was aware that IT dynamics were evolving and that there would be even more data to process, and that transactions themselves would be even more data intensive. Indeed, after the passing of Gray, Microsoft announced that they would create the eScience Award for the researcher “who has made an outstanding contribution to the field of data-intensive computing”.

Furthermore, according to Microsoft, “In a lecture he delivered just 17 days before he went missing, Jim outlined the increasingly important challenge and opportunity afforded by the availability of previously unimaginable volumes of data and continuous research dedicated to creating new understanding of the world around us.” In other words, even the acknowledged father of NS SQL recognized that ahead of us would be an unrelenting upward escalation in the volume of data and, with these unimaginable volumes of data, the opportunity to provide unimaginable insight into all that transpires in an enterprise.

“Augmenting a transaction and giving it access to additional information for greater insight,” WebAction EVP, Sami Akbay, advised in last month’s post, I need data I can digest, in small bites, please!, before adding, “seems to be a reasonable request and one we are taking seriously.” I am referencing WebAction more often these days as I pursue the story of data, and it’s no coincidence that the executive team is made up of former GoldenGate executives. Nor is it a coincidence that they see the need for greater integration of data from disparate sources as a logical next step following data replication and data integration. Like Gray, WebAction appreciates that, in fully understanding the business world around us, the importance of data cannot be overstated.
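What might such augmentation look like in practice? Here is a minimal sketch, under assumptions entirely my own: a transaction is enriched, inline, with context pulled from an analytics store built elsewhere, and that context then feeds a simple review decision. The store, keys, fields and thresholds are hypothetical and are not WebAction’s API – they merely illustrate the enrichment step Akbay is describing.

# A hypothetical enrichment step: before a transaction is processed, it is
# augmented with context pulled from an analytics store built elsewhere, and
# that context feeds a simple inline review decision. The store, keys, fields
# and thresholds are assumptions for illustration, not any vendor's API.
analytics_store = {
    "12345": {"avg_ticket": 42.10, "home_country": "US", "recent_declines": 0},
}

def augment(txn):
    context = analytics_store.get(txn["account"], {})
    txn["context"] = context
    # One example of the extra insight being used at transaction time:
    txn["review"] = (
        context.get("recent_declines", 0) > 2
        or txn["amount"] > 5 * context.get("avg_ticket", txn["amount"])
    )
    return txn

print(augment({"account": "12345", "amount": 39.00, "country": "US"}))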

Mobility, IoT and M2M exchanges, Big Data and Clouds are all involved in data – generating data, processing data and storing data – and it should come as no surprise to anyone in the NonStop community to read of the innovative solutions under development that continue to include NonStop systems. IR, comForte, WebAction and OmniPayments are just a few of the vendors I know first-hand are involved in some aspect of data-intensive computing. We can take our eyes off numerous initiatives and not be hurt too badly but if we become blasé about the importance of data, we run the risk of being blind-sided by our competitors.

Yes, it’s most definitely all about the data, stupid – so pay attention, as there’s plenty of data suggesting this is a development not to be ignored! The picture used (above) wasn't taken at some desolate spot on the earth but rather, nearby; tucked under the shadow of the Rocky Mountains about 200 miles from Boulder and a must-see for even the most casual tourist. And who knows what treasure lies beneath those sandy dunes. And likewise, there are gems aplenty buried within those monstrous data dunes rising from the floors of today’s data centers! 

Tuesday, July 22, 2014

I need data I can digest, in small bites, please!

When it comes to the NonStop community, discussing the potential benefits of Big Data often results in conversations about taking baby steps – and less big may well be the place to start!

How many times have we heard the expression “biting off more than we can chew”? Whether it applies to household chores, car maintenance, or simply arranging our next vacation, there’s always something that comes up to delay the expected benefits and rewards. With summer being the time for BBQs, this expression often comes to mind.

There’s no denying that IT is littered with failed projects and the more I talk with IT professionals, the more I sense that we continue to bite off more than we can chew. Even among the NonStop community, it’s not uncommon to hear of projects, from simple modernization efforts to major system upgrades, coming up short and often simply abandoned.

This week, a couple of emails arrived that had me thinking about past failures, and they were all about Big Data – multiple invitations to read papers, participate in surveys and join webcasts. While it’s true that Big Data is among the most talked-about topics across IT, it’s also a little overwhelming, and the opinions from experts do little to lessen our anxieties. For a while it seemed that the NonStop community could pull its head in on Big Data, but now companies want to elicit marketing “truths” as they develop in real time, and that puts NonStop in the cross-hairs of every business manager targeting exploitation of Big Data in real time.

First up, I received an invite from HP to join HP CEO, Meg Whitman, and her team working on Big Data “as they discuss the pressure to extract valuable insights from your data even as the volume and variety of data being collected significantly increase.” There are those two critical V’s again that, along with velocity (and occasionally veracity), help define Big Data. However, this wasn’t the item that caught my attention. It was the reference to there being “the pressure to extract valuable insights” within IT even as it continues to store all the data it captures – so much data, in fact, that apart from those selling storage, nobody seems all that sure whether there’s anything good going on at all!

Secondly, there came an invitation from IBM to “tell us what it takes to create a data-driven competitive advantage”. When you dig a little deeper into the invite, IBM lets on that it is “seeking a cross-industry, global pool of respondents who have business or technical responsibilities for analytics activities.” What followed was a survey IBM created, from which it hopes to get a better sense of what business really does need.

The third email was a link to an HP post, No limits: How Big Data changes competition – Data drives the bottom line, and technology is no longer limiting your competitors. “Data-focused businesses have rejected the concept of compromising business objectives for the sake of technology,” said HP, after noting that “For organizations that weren’t ‘born’ in the era of no-limits technology, a transformation is required.” Furthermore, HP adds, “It’s a novel idea for most organizations, but it’s in the DNA of young, agile companies.” And, “To compete, the rest of the market will need to act urgently to change their data ideologies and reject limitations as they store and explore data, and serve analytics insights to the business.”

The one thing I believe we can all agree on is that everyone has an opinion about Big Data, but here’s the problem as I see it: is this simply a case of there being more Big Data than we can chew and, if so, is what we would really like actually less Big Data? Very few enterprises today feel comfortable committing to three- or five-year projects and are more interested in immediate gratification – so, can we reduce Big Data into bite-size chunks and, indeed, can we tighten the focus of data-driven apps and apply them to lesser, more mundane tasks?

But again, here’s the rub and I have written about it in other posts – Big Data is not an off-the-shelf, one-size-fits-all solution. In fact, it’s not really a solution but rather a tool that, in the hands of knowledgeable business scientists, can perform amazing feats and deliver incredible business insight. For most enterprises I know, such business scientists simply don’t exist. Where they do exist, they are part of a vendor and focused on making sure product A outperforms product B. And, as an aside, I am pretty sure that, over the long haul, capturing then storing then analyzing data is not a sustainable model – capture, analyze, store seems a little more workable for the enterprise.

Among the NonStop vendor community there are already inklings of what lies ahead – solutions and middleware vendors are beginning to add capabilities that tap Big Data. IR with Prognosis has already confirmed that it’s at vendors like IR “where you will find the data architects, statisticians and data scientists that can decipher what’s taking place from the myriad of events coming from multiple sources.” In the May 6, 2014, post to realtime.ir, For business managers today, situational awareness is critical …, IR’s Jay Horton added how, “The Business Insight provided by IR, is as big as the customers want it to be, and is limited only by their perspective on what is important to measure.” An alert produced before a crisis starts is always preferable to an alert that simply states the obvious – you have been robbed!
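To illustrate the difference between the two kinds of alert, here is a minimal sketch – my own, and not IR’s Prognosis logic – that fits a simple trend to recent samples and raises an alert when a limit looks set to be breached, rather than waiting until it has been. The metric, limit and horizon are assumptions chosen purely for the example.

# A hypothetical "alert before the crisis" check: fit a crude trend to recent
# samples of a metric and raise the alert when the limit looks set to be
# breached, rather than after it has been. Metric, limit and horizon are
# assumptions chosen purely for the example.
def projected_breach(samples, limit, horizon=5):
    """samples: recent readings, oldest first; horizon: intervals to look ahead."""
    if len(samples) < 2:
        return False
    slope = (samples[-1] - samples[0]) / (len(samples) - 1)   # crude linear trend
    return samples[-1] + slope * horizon >= limit

queue_depth = [120, 180, 260, 330, 410]   # climbing, but still under the limit
if projected_breach(queue_depth, limit=600):
    print("ALERT: queue depth projected to breach 600 within 5 intervals")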

As for the solution vendors, it’s hard to miss the messages coming from OmniPayments CEO, Yash Kapadia. According to Yash, in the March 12, 2013, post to ATMmarketplace, You don’t have to make it to 6th grade to know EMV support is the smart thing to do!, when the discussion with any prospect turns to fraud, then “my experience tells me that work being done in adjacent technologies such as analytics and the accumulation of data in Big Data frameworks is going to be leveraged by payments platforms in new and innovative ways. Legitimate customer experience cannot be compromised, of course, but the need to detect potential fraud as it is being perpetrated has become of paramount importance to all in the financial industry.”

Expressing sentiments similar to those of IR, Yash also notes that across his constituency there’s “little expectation that customers will have the ‘Big Data’-skilled scientists at hand and are looking to OmniPayments to capitalize on Big Data (where it’s in place and accessible) and furthermore, while I am not expecting to see Big Data implementations on NonStop systems, I am expecting to be a consumer of Big Data analytics generated in real time.”

“No transformation required” is well understood at WebAction. “To be completely honest, when it comes to enterprises relying on NonStop systems for mission critical transaction processing,” said WebAction Executive VP, Sami Akbay, “we know of no instances where embracing the value of Big Data has seen such an enterprise replace their NonStop system. Quite the opposite, in fact, augmenting a transaction and giving it access to additional information for greater insight seems to be a reasonable request and one we are taking seriously.”

While WebAction isn’t changing the underlying technology of its core product, the messages in support of WebAction have become more tightly focused of late – visitors to the WebAction web site can now view a number of examples that are of interest to any enterprise running NonStop systems. “Whether it’s just a case of optimizing data center management along the lines IR has recognized or enhancing security event processing where OmniPayments sees concerns, we are seeing enterprises building, modifying and then deploying real-time apps in days and not months or years.”

Should you be interested in following the discussion about Big Data and its intersection with NonStop and real time transaction processing, you may want to join the new LinkedIn group, Big Data, integrated with NonStop. While there is no Connect SIG supporting Big Data at this time, this may do as a substitute.

Biting off more than you can chew has plagued IT for as long as IT has existed. The five-year mega-projects that make headlines have a poor history of completing on time and meeting expected requirements – not hard to imagine, as IT has never stood still for five years. Vendors selecting the elements of Big Data they see as helpful and enhancing the capabilities of their products also makes sense and is something easily comprehended. New vendors building tools to enhance the experience of consumers interacting with mission critical real time applications is also easy for most of us to swallow.

Less big might be right up there with giant shrimp and user friendly, but to those who have witnessed change within IT over decades, breaking down a new technology into bite-size chunks makes sense. Big Data is, above all else, big! And as such, it demands transformation even as it favors those companies starting from scratch. However, when it comes to the NonStop community and the enterprises it serves, the risks associated with ripping and replacing aren’t attractive, and seeing Big Data arrive incrementally is a godsend!