Author Archives: Nagui Halim

About Nagui Halim

Nagui Halim is an IBM Fellow and currently director of InfoSphere Streams in Software Group's Information Management organization. Nagui has spent most of his 30-year career with IBM in the Research Division, in a series of positions ranging from software engineer, research staff member, and manager to senior manager, department group manager, and director. His areas of technical expertise include systems software, operating systems, transaction processing, fault-tolerant computing, distributed systems, programming languages, computer communications, and computer architecture. His most important contributions are in stream computing and large-systems clustering.

Nagui's technical vision and leadership launched the era of stream computing at IBM. In 2003, the United States Department of Defense (DoD) posed a Grand Challenge: create a new architecture for high-speed, adaptive stream processing and analytics. In response, Nagui recruited and led a large interdisciplinary research team that worked in close and novel collaboration with DoD to develop a new type of computing system, one able to manage and analyze massive volumes of continuous data streams, which became known as System S. As the technical lead on System S, Nagui developed the foundational concepts, designed the architecture, led the prototype effort, led the formal product development, and most recently has led the commercialization and client-enablement effort across numerous commercial segments, with the system now in production. Both the US and UK Departments of Defense have extensive deployments of this technology in their operations worldwide.

In previous years, Nagui played a pivotal role in the development of the System/390 Parallel Sysplex in a joint initiative with the Enterprise Systems (ES) LOB, leading the Coupling Facility Control Program (CFCP) design and engineering effort.
The CFCP was a hardware and software facility that turned an IBM mainframe Base Sysplex system into a Parallel Sysplex, a clustering architecture that allowed multiple smaller computers to operate in concert as one large image, while enabling data sharing with guaranteed integrity, extensive resource sharing, workload balancing, and continuous availability.
Smarter Computing Breakthroughs

Stream processing: analyze data in real time for a more agile response

Stream processing, implemented effectively, can empower organizations to derive insights from data far more quickly, and in far more ways, than ever before, even when those insights are needed in real time or very close to it.
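The core idea of stream processing is to compute results continuously as each item arrives, rather than storing the data first and querying it later. The following is a minimal illustrative sketch in plain Python, not the System S / InfoSphere Streams programming model; the `stream_average` operator and its window size are assumptions made for the example.

```python
from collections import deque

def stream_average(stream, window_size=3):
    """Emit a rolling mean over the last `window_size` readings,
    producing one result per incoming reading (no stored dataset)."""
    window = deque(maxlen=window_size)  # oldest reading is evicted automatically
    for reading in stream:
        window.append(reading)
        yield sum(window) / len(window)

# Simulated sensor readings arriving one at a time
readings = [10.0, 12.0, 11.0, 30.0, 29.0]
for avg in stream_average(readings):
    print(avg)
```

Because the operator is a generator, it consumes each reading exactly once and keeps only a bounded window in memory, which is what lets this style of computation keep pace with continuous, high-volume streams.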
