Analytic applications and the need for a resilient infrastructure

Guest post by Jean Bozman, Research Vice President, Enterprise Servers, and Dan Vesset, Program Vice President, Business Analytics, IDC

In our first installment, we discussed infrastructure optimization as the key to big data success.

In a recent IDC survey of over 2,500 organizations, 22% of respondents indicated that if their business analytics solution were out of service for up to one hour, it would have a material negative impact on business operations. That’s why IT groups are paying ever closer attention to the resiliency of their analytics deployments.

Resiliency is defined by two key attributes: availability and flexibility. Production enterprise workloads must be highly available because they may support hundreds, or even thousands, of end users. Although some analytic applications are used by small groups of data scientists and business unit analysts, availability still matters to overall business success, because these analysts play an increasingly critical role in providing insight to other decision makers in the organization.
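To put those availability expectations in concrete terms, the short Python sketch below converts an availability target into the downtime it permits over a year. It is an illustration only; the targets shown (99%, 99.9%, 99.99%) are common industry figures, not data from the IDC survey.

# Illustration: convert an availability target into the downtime it allows per year.
# The example targets are common industry figures, not data from the IDC survey.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def allowed_downtime_minutes(availability_pct: float) -> float:
    """Maximum minutes of downtime per year for a given availability target."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for target in (99.0, 99.9, 99.99):
    minutes = allowed_downtime_minutes(target)
    print(f"{target}% availability allows roughly {minutes / 60:.1f} hours "
          f"({minutes:.0f} minutes) of downtime per year")

Even at 99.9 percent availability, an analytics platform can be down for nearly nine hours a year, which helps explain why a one-hour outage already registers as material for many of the organizations surveyed.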

Building a more agile infrastructure gives IT organizations the flexibility to better support analytics workloads. With that flexibility, organizations don’t need to rip and replace infrastructure. Instead, they are more likely to add net-new servers to existing infrastructure or to reorganize servers into clusters, grids, or arrays that run analytics software. (Read this IDC paper for a fuller discussion of matching analytics workloads with infrastructure, along with client deployment examples.)

Manufacturing, healthcare, telecommunications, public sector, and other organizations with a higher level of competency in, and more pervasive use of, business analytics solutions are distinguished by their focus not only on software functionality for information integration, monitoring, management, analysis, and visualization, but also on innovating the hardware infrastructure that enables successful business analytics projects and ongoing programs.

These organizations place a premium on the optimization and resiliency of server, storage, and network infrastructure to address expanding volumes of multi-structured data, users, and use cases.