Let’s face it: decades-old myths die hard. The team at MythBusters gets an hour to make their case; there’s no way I can completely bust this myth in a few-hundred-word blog post. But I can introduce you to some new ways of thinking, and if you follow up on these threads, I bet you’ll eventually declare this myth busted on your own.
In Episode I I discussed business decision optimization opportunities that can only be realized by using real-time analytics. Unfortunately, most of today’s IT infrastructures are not ready to support real-time analytics because they were originally set up to support “offline” analytics: periodic reporting designed to inform human decision-making processes. Because analytics were not integral to operational business processes, operations and analytics developed as two very distinct IT lines—with operational data being copied and transferred to distributed systems for analytics processing.
This fragmentation of data and separation of processing architectures inhibits the analytics that automate time-sensitive decisions within real-time business processes. But don’t just take my word for it: I invite you to read a new report from Forrester Consulting that provides insights on analytics from over 200 executives in the financial services sector (you can also replay a webcast on this topic from Forrester’s Brian Hopkins).
The data from this report backs my view that most enterprises need to realign their IT so that they move from being an organization that supports operations and analytics to one that supports operational analytics. Here are five points to consider as you move forward.
Focus on the Source
All decisions are derived from a variety of data, but one source will often dominate. The Forrester report indicates that executives are most focused on their transactional systems and expect these systems to drive more data growth than any other source (social media, mobile and so on). If your source transactional systems are on the mainframe, doesn’t it make sense to begin your decision management journey there?
Location, Location, Location
Data warehouses are necessary to bridge the gap between row-aligned operational data and the columnar format best suited for analytics. You probably wouldn’t site a physical warehouse in an unsafe area just to save money, so why risk exposing your critical data by sending it outside the mainframe for processing? Perhaps because companies like Teradata, Oracle, EMC, Sybase and others have built their businesses by telling you that it’s too expensive to use the mainframe for warehousing data. Maybe this was true once, but the cost of operating a data warehouse on the mainframe is no longer prohibitive. Take a look at the IBM zEnterprise Analytics System, a highly competitive offering that lets you keep your warehouse safe, secure and in close proximity to your source data.
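The row-versus-columnar distinction above is the core technical reason warehouses exist at all, and it can be shown in a few lines. This is a purely illustrative sketch (the data and names are made up, and it is not tied to any IBM product): the same transactions stored record-at-a-time, as an operational system writes them, and column-at-a-time, as an analytics store organizes them.

```python
# Three transactions stored row-aligned (one record per transaction,
# the natural shape for OLTP inserts and updates).
rows = [
    {"acct": "A1", "amount": 120.0, "region": "EU"},
    {"acct": "A2", "amount": 75.5,  "region": "US"},
    {"acct": "A3", "amount": 210.0, "region": "EU"},
]

# The same data pivoted into a columnar layout: each attribute is a
# contiguous list, the natural shape for analytic scans and aggregates.
columns = {key: [row[key] for row in rows] for key in rows[0]}

# An aggregate over the columnar form touches only the one column it
# needs, instead of dragging every full record through memory.
total = sum(columns["amount"])
print(total)  # 405.5
```

The point of the sketch is the access pattern, not the syntax: keeping both layouts close to each other (and to the source data) is what avoids the copy-and-ship step the next paragraph prices out.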
And distributed analytics systems may not be as inexpensive as you think. The Forrester study found that 70 percent of the executives surveyed felt that separating transactional and analytic systems increased their costs, and 60 percent felt that sharing data across platforms was excessively expensive. As a small illustration, consider that an internal IBM study calculated the typical four-year cost just to transfer and house z/OS data on distributed systems for analysis at over eight million US dollars. House your warehouse right along with your source data; it’s affordable, and it eliminates the expense and risk of moving data.
Keep It Fresh!
Predictive decisions require statistical modeling and scoring. For a score to be accurate, both the modeler and the scoring engine need access to the freshest data possible. In 2012, IBM introduced advances in IBM SPSS Modeler V15 and DB2 V10 for z/OS that allow scores to be calculated directly within the DB2 z/OS database (this brief article gives a good overview). In-transaction scoring is currently unique to the mainframe, and we have already baked this technology into anti-fraud solutions for several industries; check out our System z enterprise solutions page for more details.
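To make the in-transaction scoring idea concrete, here is a toy sketch. Everything in it is hypothetical: the model weights, the `score_transaction` and `process_transaction` names, and the 0.5 threshold are all invented for illustration. The real mechanism is an SPSS Modeler model evaluated inside DB2 for z/OS, not application-side Python; the sketch only shows why scoring at transaction time, against current data, beats scoring against last night’s extract.

```python
import math

# Pretend fraud model: logistic regression over two transaction
# features. Weights and bias are made up for this illustration.
WEIGHTS = {"amount": 0.004, "foreign": 1.2}
BIAS = -3.0

def score_transaction(amount, foreign):
    """Return a fraud probability computed at transaction time,
    using the freshest data, rather than on a stale offline copy."""
    z = BIAS + WEIGHTS["amount"] * amount + WEIGHTS["foreign"] * foreign
    return 1.0 / (1.0 + math.exp(-z))

def process_transaction(txn):
    # Score inline, before the transaction completes, so the
    # approve/flag decision can act on the current event itself.
    risk = score_transaction(txn["amount"], txn["foreign"])
    return "flag" if risk > 0.5 else "approve"

print(process_transaction({"amount": 950, "foreign": 1}))  # flag
print(process_transaction({"amount": 40, "foreign": 0}))   # approve
```

In an offline architecture, the second step would run hours later against a copied extract; in-database scoring collapses the two steps into one, which is what makes the anti-fraud use case workable.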
Keep It Simple!
Since complex queries can bring an operational system to its knees, such queries are typically rejected, held for off-hours processing or moved to off-platform warehouses. Not only does this inhibit business effectiveness; it also introduces significant cost and planning overhead.
IBM’s Netezza appliances can help accelerate queries. But the IBM DB2 Analytics Accelerator takes this technology one step further by deeply integrating it with DB2 z/OS, providing a single integrated system for processing both normal and complex queries safely and efficiently. It’s a simple, cost-effective way to integrate operations and analytics.
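The "single integrated system" idea above amounts to one front door that sends short transactional lookups to the operational engine and heavy analytic work to the accelerator. The sketch below illustrates that shape only; the `route_query` function and its keyword heuristic are invented for this post and bear no resemblance to the actual DB2 optimizer logic that decides accelerator eligibility.

```python
def route_query(query):
    """Route a SQL statement using a crude, hypothetical complexity
    signal: the presence of analytic constructs in the text."""
    analytic_markers = ("GROUP BY", "JOIN", "AVG(", "SUM(")
    is_complex = any(marker in query.upper() for marker in analytic_markers)
    return "accelerator" if is_complex else "operational"

# A point lookup stays on the operational engine...
print(route_query("SELECT balance FROM accounts WHERE id = 42"))
# -> operational

# ...while an aggregation is offloaded, so it can't drag down
# transaction response times.
print(route_query("SELECT region, SUM(amount) FROM txns GROUP BY region"))
# -> accelerator
```

The value of the integrated approach is that this routing happens inside one system, transparently to applications, instead of being a data-movement project between two platforms.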
Consolidating your operational data, warehouse and query acceleration within the scope of the IBM System z gives you a unified foundation for decision management. It can also be cheaper than what you’re doing today; an internal study of a real customer environment showed that complex analytic queries performed against an IBM zEnterprise Analytics System, with the IBM DB2 Analytics Accelerator, measured 26x throughput and 33x price/performance improvements compared to their existing competitive environment.
Keep It Fast!
Many people believe that the RISC processor architecture is required for performing analytics. While this may be true for some forms of computation (think IBM Watson), the System z processor has evolved to the point where it can easily handle the sort of operational analytics required by decision-management systems.
At its announcement, the current-generation IBM zEnterprise EC12 featured the industry’s fastest chip: a six-core design with each core running at 5.5 GHz. That’s fast! This machine boasts 101 configurable cores, a second-generation out-of-order execution design, multilevel branch prediction for complex workloads, large caches and a host of features designed specifically to facilitate operational analytics. The zEC12 has established itself as a premier host for decision-management systems.
In this post I’ve only been able to scratch the surface in showing that yes, you can (and should!) do analytics on the mainframe. For more information and proof points, please check out the mainframe business analytics and data warehousing page.
Paul DiMarzio has 30+ years experience with IBM focused on bringing new and emerging technologies to the mainframe. He is currently responsible for developing and executing IBM’s worldwide z Systems big data and analytics portfolio marketing strategy. You can reach Paul on Twitter: @PaulD360.
To effectively compete in today’s changing world, it is essential that companies leverage innovative technology to differentiate from competitors. Learn how you can do that and more in the Smarter Computing Analyst Paper from Hurwitz and Associates.