
Moving forward - transformation and virtualization make testing of the business logic even more critical

When we think of virtualization and the coming of clouds, and as we consider all that may be involved in transforming to these hybrid combinations that incorporate the traditional with the very new, how often does the testing of our applications come to mind?

There have been times these past few days when events reminded me of practices and disciplines that dominated our discussions in former times. I had the misfortune of breaking things and working with insurance companies, and I was left without access to more modern methods of communication, to the point where I was asked if I could possibly find a fax machine so I could receive a fax.

It was in the early 1980s when the vendor who employed me back in Sydney, Australia, installed a fax machine in my office and I no longer had to take the long walk over to the telex machine, where I would then spend hours preparing a paper tape for transmission back to my head office in Richmond, Virginia. In many ways it was a sad occasion, as I had really mastered the telex machine, and yet it was progress, given how easy it became not only to transmit the written word, but pictures, charts and graphs as well!

Fast forward to today and the power of the mobile phone is undeniable. We can communicate with anyone we want to, at any time, about anything at all. In a couple of recent conversations the talk has led to consideration of whether the mobile phone was about to fade from the scene, to be replaced by even more spectacular technology and whether or not we were entering essentially an era of magic. How else can you explain away the knowledge so many businesses have about everything we do? And yet, even with the most advanced forms of communication there will still be a need for apps to support inquiries as well as many different models used for purchases and other financial transactions.

Point is – we still write code, and as much as AI continues to advance, there remains a need for humans to stay very much involved in stringing together the logic that drives decisions for success. When we talk about clouds we talk about the elasticity of provisioning that addresses both the needs we have for data storage and business logic. But here’s the rub – we are working diligently to be able to store vast amounts of data even as we continue to write logic practically unchanged from how we did it in the past, albeit a lot more quickly, of course.

Let me take you to an earlier time, decades ago. In fact, many decades ago, to when we first started coding the machines that marked the beginning of our adventure with computers. I was recruited by IBM on the campus of Sydney University at a time when I was becoming very bored with academic life. At the time I wasn’t really aware of the implications of my decision to participate in a series of tests the University sponsored, but it was only a matter of months before I found myself on another campus; this time, it was the operations center for a steelworks in Wollongong, Australia.

The year I was recruited was the southern hemisphere’s summer of 1969 and my first day on the job was in 1970, so effectively I have been looking at code for almost six decades. And the fundamentals haven’t changed, just the timeframes. Ambitions? Well, my first job was to develop applications in support of a new steelworks that was being built, but along the way I was tinkering with the operating system because, for a period of time, the IBM mainframes the steelworks purchased didn’t have enough main memory to run any IBM operating system, so we pretty much came up with our own – just a couple of lines of IBM 360 assembler code together with a bunch of macros.

Timeframes? Well, this is where the dramatic changes can be seen, perhaps more so than when it comes to chip power and Moore’s Law. I was writing just one application a year – perhaps a little bit more. I grabbed a coding pad and wrote assembler instructions for the logic I was pulling together to solve a business problem. Pages and pages of assembler code were then submitted to the data entry folks, who oftentimes took a week or more before they returned the coding pages to me along with a box of punched cards. I kept running these decks through the assembler until I got a clean assembly, at which time I took the object deck and began to test.

As a matter of practice, we always left an addressable piece of storage (of about 100 to 250 bytes) so that, if my logic went awry, I could branch to it, throw in a couple of correcting statements, and return to the mainline code. Ouch – yes, almost every production application was supported by a series of supplementary corrective cards that steered the logic back to where it needed to be without having to reassemble the whole application or, worse, send the coding pages back to the data entry team.

Testing? For my applications, which supported what we called the “online application”, I would often resort to booking solo time on the mainframe and dialing in “single cycle” so I could manually step through each instruction and watch the results via the console display lights that changed with the execution of each instruction. Productivity? Wow – I could debug my programs more quickly than others working with me who preferred to go home at the end of the day. The company had enough programmers to complete the implementation of the new application for the steelworks about to be commissioned, so it seemed reasonable to function this way. Looking back at what we did all those years ago, I am not surprised that applications often stopped but rather that any of them ran successfully at all!

Now let me fast forward to the practices of today – attempting to develop and test applications, and then ensure they are maintained, the same way as we did all those decades ago is not only not possible but runs contrary to the always-on, always-connected 24 x 7 world we live in as we remain tethered to our mobile devices plugging away at the latest app. Languages and development frameworks have changed. We don’t simply write code; we pull code from multiple sources and practically assemble a program that in turn is just a part of an application designed to address a specific business need.

Providing defect-free applications at a fair cost – particularly when these applications have to accommodate today’s multi-vendor and hybrid environments even as they have to be aware of the many regulatory and compliance mandates of each industry – needs something a whole lot more sophisticated than simple access to a system that can be set to single cycle! And I was reminded of this only a few days ago when I had a conversation with folks at Paragon Application Systems. These are the folks who have developed the premier testing solution for the payments industry.

“It’s all about continuous integration, continuous delivery and yes, continuous testing,” I was told by Paragon CEO, Jim Perry. Integration, delivery and testing are a never-ending cycle, for the life of the program and application, performed in a seamless manner whereby the state of the program or application is always current and correct. “The growth of our global economy has created payment systems that have grown too intricate and change too quickly for any organization to risk deployments without frequent, comprehensive regression testing. No company can hire enough people to manually perform the testing necessary in the time available within a release cycle. Automation of the software build and delivery cycle, as well as test execution and verification is required.”
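Jim Perry’s point about automation can be made concrete with a small sketch. The Python snippet below is purely illustrative – the authorize() function, its rules and its return codes are my own invention, not Paragon’s product or any real payments API – but it shows the shape of the thing: a regression suite of expected results, captured from the last known-good release, that a continuous-testing pipeline replays on every build so that any drift in the business logic fails the build before it ever reaches production.

```python
# Hypothetical sketch of automated regression testing in a CI/CD cycle.
# authorize() stands in for real payments business logic; the regression
# cases stand in for results captured from the last known-good release.

def authorize(amount_cents: int, balance_cents: int, daily_limit_cents: int) -> str:
    """Toy authorization rule: reject bad amounts, overdrafts and over-limit requests."""
    if amount_cents <= 0:
        return "INVALID"
    if amount_cents > balance_cents:
        return "DECLINED_NSF"      # non-sufficient funds
    if amount_cents > daily_limit_cents:
        return "DECLINED_LIMIT"    # over the daily limit
    return "APPROVED"

# (inputs) -> expected outcome, recorded from the previous release
REGRESSION_CASES = [
    ((5_000, 10_000, 50_000), "APPROVED"),
    ((15_000, 10_000, 50_000), "DECLINED_NSF"),
    ((60_000, 100_000, 50_000), "DECLINED_LIMIT"),
    ((0, 10_000, 50_000), "INVALID"),
]

def run_regression() -> bool:
    """Replay every recorded case; any behavioral change fails the build."""
    failures = [(args, want, authorize(*args))
                for args, want in REGRESSION_CASES
                if authorize(*args) != want]
    for args, want, got in failures:
        print(f"FAIL {args}: expected {want}, got {got}")
    return not failures

if __name__ == "__main__":
    # A CI pipeline would gate deployment on this exit status.
    raise SystemExit(0 if run_regression() else 1)
```

The suite here has four cases; the real systems Perry describes have thousands, which is exactly why no amount of manual effort can keep up within a release cycle.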

Manually perform testing? Grown too intricate? For the NonStop community there have always been concerns about the business logic bringing a NonStop system to a halt. And for good reason! Fault-tolerant systems have been designed to keep processing even when facing single points of failure, but oftentimes poorly implemented and tested business logic can get in the way! Unfortunately, it’s about to get a whole lot worse, as testing not only has to ensure the application is defect free but also that the underlying platform, now being virtualized, is configured in a way that NonStop applications can continue being NonStop.

We have virtualized networks and we have virtualized endpoints, and this has helped considerably with automating our test processes, but now the platform itself is being virtualized and this is a whole new ball game for many enterprise IT shops. And this makes the need to have something like Paragon on hand even more important – we have stopped manually checking anything these days, so we cannot start now. In the coming months, as we continue to look at the transformation to hybrid IT, to virtualization and to software-defined everything, I am planning on devoting more column inches to testing, as all too soon our inability to thoroughly test what we are turning on in production could bring many a data center crashing down.

If you haven’t as yet looked at Paragon, then you may want to visit the web site and download a couple of papers. I have to believe that for those of you in the NonStop community who are only vaguely familiar with how testing has changed, particularly when it comes to testing for payments solutions, it may very well be an opportunity to rethink just how comfortable we are with the processes we have in place today. And to wonder, too, how anything worked at all back in the days when it was all performed manually!
