
Hybrids, Connectivity, Analytics and Natural Language Processing … NonStop, of course!

What was on display at HPE Discover 2016 led me to take a closer look at some of the start-ups where Pyalla Technologies has a presence, and the results add further weight to the argument about just how far NonStop systems have come – who would have guessed that just a year or so ago!

Wandering the exhibition hall at this year’s HPE Discover, it was a treat for me to see as many cars on display as there were. Whether it was the HPE-sponsored Formula E car, what I recall was a Nissan Leaf, or even the BMW i3 pictured above, there seemed to be a lot more engagement with cars at this show. Of course, there was the opportunity to race a car against other attendees as part of the HPE Labs exhibit, but the “controls” were a tad artificial and not to my liking, so I wasn’t tempted to join in the fun.

The BMW i3 is a pure electric play, and HPE’s involvement had to do with gathering information from many sources in order to keep the car on course and, in practical terms, safe. The i3 communicated with the HPE IoT Platform, where the integration allowed “for rapid installation of new services or applications, as well as for communication with nearby connected devices.” What this integration covered was “advance warning of rain, high winds, and potholes using ‘swarm’ intelligence; smart home/infrastructure integration; and geo-fencing to alert drivers to restrictions as they cross international borders.”

Of course, this was all complementary technology to whatever autonomous car technology develops, but it was a graphic way to demonstrate a future where unprecedented volumes of data are routinely examined and potentially important data is communicated to where it’s most wanted. For the NonStop community all of this may be peripheral to the core function of processing transactions, but for many years we have heard presentations by the classic bricks-and-mortar retailer, Home Depot, on how it has integrated weather feeds from the internet directly into its transaction processing on NonStop systems in order to ensure the right merchandise is in the right store when a local weather crisis develops. And yes, this is only just the beginning.

Part of what Pyalla supports is a number of start-ups where my involvement has run the full gamut, from product management and marketing to business development to simply supporting media outreach programs. One start-up centers on NonStop while another includes NonStop. The third is not on NonStop, even as it is a huge user of HPE products. However, there is a synergy evident in these start-ups. They are all bridging the old with the new and helping us get a firm foothold in solutions that will allow us to tap vast resources, no matter how the data is captured, processed or stored. Given the renewed focus on NonStop within HPE, the associations I have with each of these start-ups are likely a foretaste of what NonStop users will become very interested in over the course of the next 18 to 30 months. And the catalyst for all of this interest has been the arrival of hybrid infrastructures, where NonStop X is playing its part.

My involvement with InfraSoft dates back to its earliest days as the company came together. Based in Sydney, Australia, it was only natural for Pyalla Technologies to remain engaged with a company that influenced the choice of the name Pyalla. Following considerable market success with its uLinga communications product suite, it has been InfraSoft’s deep port of Node.js that has really interested me. So many applications written in JavaScript have to do with processing data that it’s not surprising to see support for Node.js gain as much momentum as it has.

For those who participated in the presentation by HPE IT at HPE Discover 2016, we heard of the choice of JavaScript and Node.js underpinning their applications (with JDBC access to SQL/MX on NonStop), and this is only the beginning. Processing voluminous amounts of data and then doing something interesting with it is right at the center of the sweet spot for JavaScript, and with as much talk as there is of late about microservices, it’s good to see that there is a solution for NonStop X systems. And whereas Node.js may be thought of as a platform running on Linux as part of an HPE hybrid infrastructure, the addition of NonStop can only elevate the importance of Node.js for many users where NonStop has a presence.

When it comes to data and data analytics, however, in my opinion the premier vendor in this space that should be on the radar screens of every NonStop user is Striim. Originally called WebAction and established by former GoldenGate Software executives, its transition from simply supporting data integration, as GoldenGate did, to treating the data itself as the interesting element shouldn’t be a surprise to anyone in the NonStop community.

The first PoCs among NonStop users are being done, and shortly there will be news about a number of successful use-case scenarios everyone in the NonStop community will be able to relate to. Striim is synonymous with data stream analytics, and before long there will not be a transaction processing solution operating anywhere without the deep ties to its environment that Striim so effectively supports. Like InfraSoft, Striim is platform neutral, with Linux perhaps the more viable choice of platform, making it an even stronger candidate for running on HPE hybrid infrastructure.

Whereas my ties to InfraSoft are deeply rooted in my ties to all things Australian, and my ties to Striim are associated with my good times at GoldenGate, my connection with InkaBinka can be traced directly to time spent at Starbucks, Wood Ranch, Simi Valley. Chalk it up to unintended consequences or simply to serendipity, but from the time I met InkaBinka founder, Kevin McGushion, over a Starbucks latte, I was hooked. And today, what started out at InkaBinka as a news application – a neat way to read news summaries while standing in line for that latte – has developed into a serious piece of raw feed processing utilizing very advanced natural language processing.

Or, as Kevin recently explained it, “InkaBinka has created a state variable, neural network that can summarize vast amounts of information by writing about a subject, emulating human abstraction. This is especially powerful in internet search, where 300 million results are common for a single search term. This neural network, through learning and building of a summary, creates new ways of visualizing information while allowing rapid discovery of new information that would have been hidden by the sheer volume of content.”
 

For me it’s a case of InkaBinka developing an optimized neural network that learns over time and creates and adjusts relationships between ideas, something Kevin now calls “connectedness,” as those ideas evolve. InkaBinka then uses a form of neural network that relates more abstract ideas to summarize large volumes of data. “Artificial neural networks at a basic level emulate the function of neurons where, if new information is introduced, processes may be applied to learn about those new things, essentially allowing the system to become smarter,” said Kevin. 

And of course, from a high-level, macro perspective, “It’s hard to apply these concepts to abstract things such as language. Traditional NLP (natural language processing) works sequentially where a next state is dependent on a previous state and works well for things like spell checking, grammar checking, translation even code breaking. This is not how the human brain tends to work.”

Before diving even deeper into what is behind InkaBinka today, by way of explanation Kevin then suggested that I, “Take, for example what the human mind does when it begins to formulate a sentence, a series of sentences or a thought. It does not create each sentence with each word sequentially. Rather, it takes into account the main ideas to be covered and the order in which they are to be covered. This may depend on a number of things, such as most important concepts, most recent concepts, or even bias. The mind then takes those main ideas in a prescribed order and weaves them together with individual words, taking into consideration what has been said and what is left to be said.”

In the example below (now viewed many times on numerous social media sites), 10,024 websites may be learned from in order to create a summary and a connectedness map of the basic ideas covered in those sites. Any node of the connectedness map may be selected and a new summary created based on the relationship between these words. Information can be quickly discovered without the need to sort through page after page of internet search results representing a “wall of text.”
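InkaBinka’s neural-network approach is, of course, proprietary, but the underlying idea of a connectedness map – relationships between ideas derived from shared context – can be sketched in a few lines. The following toy example (my own illustration, not InkaBinka’s algorithm) simply counts how often pairs of terms co-occur in the same snippet of text; the heavier the edge, the stronger the connection:

```javascript
// Toy "connectedness" map: count how often pairs of terms appear
// together in the same snippet. A real system would learn weighted
// relationships; this only illustrates the basic co-occurrence idea.
function connectedness(snippets) {
  const edges = {}; // key "termA|termB" -> co-occurrence count
  for (const text of snippets) {
    // unique lowercase words in this snippet, sorted for stable keys
    const terms = [...new Set(text.toLowerCase().match(/[a-z]+/g) || [])].sort();
    for (let i = 0; i < terms.length; i++) {
      for (let j = i + 1; j < terms.length; j++) {
        const key = terms[i] + '|' + terms[j];
        edges[key] = (edges[key] || 0) + 1;
      }
    }
  }
  return edges;
}

// Usage: three short snippets stand in for thousands of websites.
const map = connectedness([
  'electric car battery range',
  'battery range anxiety',
  'electric car charging'
]);
console.log(map['battery|range']); // 2 – "battery" and "range" co-occur twice
```

Selecting a node in such a map and re-summarizing around its strongest edges is, at a very high level, the kind of navigation the InkaBinka visualization offers.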



It is often a mystery to a user how internet search engines relate or rank information, but now, with InkaBinka search, the connectedness is graphically understood and can be manipulated. As for many in the NonStop community, visualization works best, and the above goes a long way toward explaining the powerful transition under way inside InkaBinka.

InkaBinka runs completely on HPE Moonshot processors and is implemented using JavaScript and Node.js. And it’s heavily focused on data, search and visualization – all sounding rather familiar given the interests of Pyalla. When it comes to hybrid infrastructure, we have already seen SQL/MX on NonStop X being accessed by microservices on Linux (at HPE IT), so thinking in terms of NLP on Linux being accessed by transaction processing on NonStop doesn’t represent too big a stretch of our imagination.

With the looming presence of virtual NonStop, running on commercial, off-the-shelf hardware, where all that is required is Linux and KVM (think OpenStack), perhaps there will be those looking for an even lower-cost hardware option, where Moonshot may become an option to consider. If not Moonshot, how much attention is being paid to HPE CloudLine? Here, too, is another alternative for use with virtual NonStop. In other words, it would be unwise to place any fences around HPE and NonStop as to where their products will surface, or to rule out participation in a solution by any one product.

Microservices, data analytics and natural language processing all involve NonStop today. In many ways, too, they point to where NonStop is headed as well as to the types of platforms we will likely see NonStop becoming a part of. The support Pyalla is providing is neither accidental nor random, and even though these associations developed several years ago, when their likely impact on NonStop transaction processing may have seemed dubious at best and completely off-the-wall at worst, it’s symbolic in many ways of just how wide the net is now being cast when it comes to future possibilities for NonStop. HPE Discover 2016 opened many eyes, not the least being my own. The question now for the NonStop community is simply one of how open are your eyes!
