The hybrid face of future transaction processing

Many consider transaction processing a dusty topic. Maybe it is, but it runs the business, and many organizations realize that. They recognize that their transaction processing capability delivers distinguishing business value and hence a key competitive advantage. In this blog post I look at transaction processing systems and examine technology advances that address the future need for more complex transaction processing at higher throughput.

New value from transactions

The massive increase in computing things—appliances, sensors, devices, tablets, smartphones and so on—produces large volumes of data. Social networks and mobile technologies are changing the way businesses create value. At the same time, new economies and massive new consumer markets are emerging. Organizations realize that finding new value in all this data will open up opportunities to serve new customer needs and address new markets.

To extract business value from such a huge volume of data and serve these new markets, businesses need high-capacity systems that integrate traditional transaction processing with deep analytics. These new systems will allow organizations to act on events in near real time, instead of a week or a month after the fact, and thus become more responsive to trends anywhere in the world as they develop. Through the marriage of transactions and deep analytics, new services can be offered to new customer segments, fulfilling demand that is currently ignored or recognized too late.

Transactions transformed

Transaction processing systems are being dusted off. They are not just critical for back-end reconciliations: their near real-time transactions can transform how customers, employees, businesses and partners influence one another in deciding what to offer, and in surfacing needs customers did not even know they had.

Hybrid business transactions

Business transactions are increasingly becoming hybrid transactions, both in terms of the businesses involved and in terms of the supporting information processing capabilities. Hybrid business transactions comprise interactions with diverse systems, from self-managed, on-premises IT facilities to cloud-based software as a service (SaaS) offerings.

Hybrid applications

Hybrid information technology architectures are also being realized at various levels. At the application level, architectures have evolved into composite architectures built upon loosely coupled components that are deployed in a geographically dispersed landscape and that span cloud and on-premises IT services. In this landscape of composite applications, the supporting middleware solutions, including transaction processing facilities, must support the business processes that span this hybrid middleware.

Furthermore, the middleware itself supports more business flexibility. Business events interact with transactions. Business rules are extracted from transaction systems and are now managed by the business, making it easier to change rules and to ease compliance. Incorporating more complex analytics into business transactions lowers business risk and enables better customer service.
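To make the idea of externalized business rules concrete, here is a minimal sketch. The rule format, the `rules` list and the `evaluate` function are all hypothetical, for illustration only; real business rule management systems are far richer, but the principle is the same: rules live as data outside the transaction code, so the business can change them without redeploying the transaction system.

```python
# Hypothetical externalized rule set: rules are data, not code baked into
# the transaction system, so the business can change them independently.
rules = [
    {"name": "large_amount", "test": lambda txn: txn["amount"] > 10_000, "action": "review"},
    {"name": "new_customer", "test": lambda txn: txn["customer_age_days"] < 30, "action": "review"},
]

def evaluate(txn):
    """Return the actions fired by the externalized rule set for one transaction."""
    return [r["action"] for r in rules if r["test"](txn)]

print(evaluate({"amount": 25_000, "customer_age_days": 400}))  # ['review']
print(evaluate({"amount": 100, "customer_age_days": 400}))     # []
```

Because the transaction code only calls `evaluate`, tightening the amount threshold or adding a compliance rule is a data change, not a software release.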

Hybrid computing architectures

Computing architectures are becoming workload optimized and more heterogeneous. The hybrid nature of server architectures is realized through a tighter integration of scale-up and scale-out architectures. Such an integrated operating environment allows diverse multitier workloads to be run on the hardware architecture that is the best fit for the task.

But transactions will demand more computing throughput. Emerging technologies address this by increasing the use of parallel algorithms and by employing special hardware called accelerators. Solid state technology developments will augment existing memory and storage architectures, increasing processing power from that side as well.

Microprocessors continue to evolve, and transaction processing workloads will benefit from on-chip accelerators such as coprocessors for encryption and compression. A notable feature for transaction processing systems in the latest chip designs is transactional memory. Transactional memory is a hardware technique for parallel programming that is generally a more efficient alternative to locks for managing access to shared resources by parallel threads. Instead of taking a lock, with transactional memory the state of the shared resource is checked before and after an atomic operation. If the resource has not changed, the operation is committed; if it has changed, the atomic operation is backed out and retried.
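The check-before-and-after, commit-or-retry pattern can be sketched in software. This is not hardware transactional memory, only an optimistic-concurrency emulation of its semantics: the `VersionedCell` class is hypothetical, and the small commit lock stands in for the hardware's atomic commit step.

```python
import threading

class VersionedCell:
    """Shared cell whose version number changes on every committed write."""
    def __init__(self, value):
        self.value = value
        self.version = 0
        self._commit_lock = threading.Lock()  # stands in for the hardware commit step

    def transactional_update(self, fn, max_retries=10_000):
        """Optimistically apply fn to the cell's value, retrying on conflict."""
        for _ in range(max_retries):
            # 1. Read value and version without taking a lock.
            seen_value, seen_version = self.value, self.version
            # 2. Compute the new value speculatively.
            new_value = fn(seen_value)
            # 3. Commit only if no other thread changed the cell meanwhile.
            with self._commit_lock:
                if self.version == seen_version:
                    self.value = new_value
                    self.version += 1
                    return new_value
            # 4. Conflict detected: back out (discard new_value) and retry.
        raise RuntimeError("transaction aborted too many times")

cell = VersionedCell(0)
threads = [
    threading.Thread(
        target=lambda: [cell.transactional_update(lambda v: v + 1) for _ in range(1000)]
    )
    for _ in range(4)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(cell.value)  # 4000: every increment survives despite concurrent access
```

Note that no thread ever blocks while computing; contention only costs a retry. That is the appeal for transaction processing: under low conflict rates, the common path pays none of the overhead of lock acquisition.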

Specialized processors are also designed to speed up transaction processing. The Field Programmable Gate Array (FPGA) is one such technology. An FPGA is a hardware component that can be programmed for a specific task, which it can then execute very quickly and efficiently. FPGAs are used today in the DB2 Analytics Accelerator and PureData System for Analytics.

Direct memory addressable accelerators such as the general-purpose GPU (GPGPU) are special processors that execute in parallel with the central processor to accelerate a specific function, and that have direct access to the memory shared with the central processor. Such processors will help manage the big data challenge by extracting patterns from raw multimodal data (image, sound, video, speech and so on).
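What makes a workload GPGPU-friendly is its data-parallel shape: the same small kernel applied independently to every element of a large data set. The sketch below only illustrates that shape on the CPU, with a thread pool standing in for the thousands of parallel kernel invocations a GPU would run; the `kernel` function and the toy signal data are invented for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

def kernel(sample):
    """Toy feature extraction: mean energy of one signal window.
    On a GPGPU, one such invocation would run per GPU thread."""
    return sum(x * x for x in sample) / len(sample)

# Sixteen independent 64-sample windows of a synthetic signal.
signals = [[i % 7 for i in range(start, start + 64)] for start in range(0, 1024, 64)]

# The pool applies the same kernel to every window independently --
# the data-parallel structure a GPGPU exploits at massive scale.
with ThreadPoolExecutor(max_workers=8) as pool:
    features = list(pool.map(kernel, signals))

print(len(features))  # 16: one feature per window
```

In practice this mapping would be expressed in CUDA or OpenCL and the windows would live in memory shared with the central processor, so no copies are needed before the accelerator starts.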

Compilers will be enhanced to take into account new and improved hardware capabilities, such as transactional memory, simultaneous multithreading and multimodal accelerators.

These advances in server technology will allow transaction processing systems to accelerate a wider range of functions and thus address the more complex processing requirements and higher transaction throughput requirements.

Acknowledgement: Much of this article is reworked and updated from information contained in the IBM Redguide “Transaction Processing: Past, Present, and Future.”


Niek de Greef is an Executive IT Architect working for IBM in The Netherlands. Niek has more than 20 years of experience in IT. His areas of expertise include technology strategy, enterprise architecture, application integration, software engineering, and infrastructure architecture. You can reach him on Twitter @NdeGreef1.
