Can smarter computing improve public policy management?



 
 

In the US, the long campaign for President has begun in earnest, with daily messages of doom and despair and a few proposed magic solutions to problems that have plagued us for years, if not decades. As a result, I’ve been thinking about public policy a lot recently. I am, by nature, an optimist, and I refuse to believe that things are as good as they can be, or as bad as they might be. Since I am also actively investigating smarter computing opportunities, I decided to combine my interests and look at the potential for smarter computing technologies – and the processes we use to tune smarter computing environments – to improve public policy management.

Cloud computing is perhaps the most logical place to look for quick results, and cloud has already found favor in the US government. In fact, in the February 28, 2011 Federal Cloud Computing Strategy report, US CIO Vivek Kundra wrote that “an estimated $20 billion of the Federal Government’s $80 billion in IT spending is a potential target for migration to cloud computing solutions.” We will assume that the benefits of cloud deployment are well understood, and focus instead on the other keys to smarter computing: big data/analytics and workload optimization.

Governments already offer data to the private sector to foster the creation of new commercial products and services. For example, the US Government makes data from the Global Positioning System available for civilian purposes – accurate to a “worst case pseudorange accuracy of 7.8 meters at a 95% confidence level.” That has enabled a range of services from the navigation system in automobiles to location-based services on mobile phones like FourSquare.

Many municipalities also treat data, including big data, as an asset that should be leveraged by the private sector. For example, New York City makes data available on electric consumption by ZIP code, demographic information, and 311 (information) service requests via its NYCOpenData website. In these examples government either produces or aggregates the data, but the analytics and leverage comes from the private sector.
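As a hypothetical illustration of the kind of private-sector leverage described above, a few lines of Python can aggregate 311-style service requests by ZIP code. The records below are invented sample data, not actual NYCOpenData output; in practice the data would be downloaded from the city's open-data portal:

```python
from collections import Counter

# Hypothetical 311-style service-request records; real records would
# come from a source like the NYCOpenData website.
requests = [
    {"zip": "10001", "complaint": "Noise"},
    {"zip": "10001", "complaint": "Heat"},
    {"zip": "11201", "complaint": "Noise"},
]

# Count service requests per ZIP code -- a first step toward spotting
# neighborhoods that may need additional attention or resources.
by_zip = Counter(rec["zip"] for rec in requests)
print(by_zip.most_common())  # ZIP codes ranked by request volume
```

The same grouping-and-counting pattern scales, with the right tools, from a three-record sample to millions of rows, which is exactly the leverage the private sector applies to government-produced data.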

The question for governments is how to leverage data in ways that create a virtuous cycle of policy improvement. As we begin to see advances in areas like evidence-based medicine (e.g. diagnostic and treatment recommendations based on actual patient data, including outcome data), why not evidence-based policy management?

Many advances in education (from research to testing), security (crime detection to prosecution) and healthcare/life sciences (diagnostics, treatment, clinical trials…) are dependent on big data/analytics. For healthcare and security in particular, the rate of data growth is likely to surpass the rate of processor improvement, so workload optimization will be critical.

I chose education, security, and healthcare as proof points because they are areas of personal interest, but I believe the principles generalize to most areas where governments must deal with complex decisions regarding allocation of resources.

For education and healthcare, the goals are to expand availability to more of the population, improve outcomes, and manage costs. For security, the goals include better systems for crime detection (and ultimately prevention), prosecution, and offender management. Each of these domains generates large volumes of data that are currently underutilized in policy management, but which could be used to improve the allocation processes.

In the figure, we show a four-step cycle appropriate for these government investments. The frequency of iteration might be fixed (e.g. evaluate annually or monthly), or a review could be triggered by an event or by some value in the data (perhaps analytics reveal a downward trend in mortality linked to some government-sponsored initiative, or an increase in crime that could be offset by additional investment or reallocation of resources).
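One way to picture the event-triggered part of that cycle is a simple threshold check of the latest metric against a rolling baseline. This sketch uses invented function names and illustrative numbers, not any deployed system:

```python
# Sketch of an event-triggered policy review: compare the latest metric
# reading against the historical average and flag a review when the
# deviation exceeds a tolerance. All names and numbers are illustrative.

def needs_review(history, latest, tolerance=0.10):
    """Trigger a review if the latest value deviates from the
    historical average by more than the given fraction."""
    baseline = sum(history) / len(history)
    return abs(latest - baseline) / baseline > tolerance

# Hypothetical monthly crime-rate readings; a roughly 15% jump in the
# latest reading crosses the 10% tolerance and triggers a review.
monthly_rates = [100, 102, 98, 101]
print(needs_review(monthly_rates, 116))  # prints True
print(needs_review(monthly_rates, 103))  # prints False
```

A real system would of course use more robust statistics than a raw average, but the shape of the logic – measure, compare to baseline, trigger reallocation when warranted – is the heart of the cycle.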

Smarter computing leverages analytics and workload optimization to drive continuous improvement. If there’s a domain more in need of continuous improvement than government, I haven’t seen it. The good news is that, at the city level at least, leaders are taking notice. From Dubuque, Iowa where these technologies and processes are improving water, energy and traffic management to Corpus Christi, TX, Rio de Janeiro, Cambridge, Ontario and others where infrastructure and asset management are now being optimized using these techniques, there is cause for hope and celebration. And, while large nations have yet to fully embrace these techniques, they can look to islands of innovation like Malta – which is linking power and water systems in a common grid to support better resource management with variable pricing while shifting responsibility and control to the consumers/citizens – for a glimpse of the future.

 
 

One Response to Can smarter computing improve public policy management?

  1. Kelsey says:

    Great article Adrian! I’m always amazed at the smart things IBM customers are doing, but the stories that I have a natural affinity for are the ones in healthcare, government and education too. I think about organizations like Rice University (http://bit.ly/AccvAA), who are optimizing their systems to conduct cancer research 26x faster or SETMA (http://bit.ly/wtZAwZ), who are using analytics to identify healthcare risk factors before they become a sad reality. Thanks for sharing your insights.
