Honey, I Shrunk the Kids is an action-comedy film produced by Walt Disney Pictures in the late 1980s. As a child, I enjoyed the movie and watched it several times, and I have enjoyed watching it as a grown-up too. The story goes like this: a scientist father accidentally shrinks his own children and the neighbors’ kids to a quarter of an inch with his electromagnetic shrink ray. The kids endure a series of adventures before returning to their normal size.
A few months ago, I treated myself to some nice food and watched this movie again. Immediately afterward, something clicked in my mind. Over the last fifty years we have seen unprecedented improvements in computing, and at the same time a steady reduction in the size of computing equipment. While that reduction was not due to some “accidental electromagnetic shrink ray,” I still wanted to borrow the movie’s title for this blog post and show how technological advancements over time have delivered more computing power in a smaller form factor.
Evolution of computing systems
Computing systems are becoming smarter and more efficient day by day, and together with advanced communication systems they are paving the way for smarter computing. As we know, smarter computing is a journey, not a destination. Over time it will evolve to meet the demands arising from a multitude of factors: social and market trends, service-oriented architecture, Web 2.0, exponential growth in connected devices, collaboration and social networking. I will provide a few examples here to show how the capacity of devices has improved over time. Moore’s law states that the number of transistors on integrated circuits (ICs) doubles approximately every two years (some quote it as 18 months). Moore’s law has had a phenomenal impact on the semiconductor industry, and developments in this field have tracked Gordon Moore’s prediction remarkably closely.
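Moore’s law compounds quickly. As a back-of-the-envelope sketch (assuming an idealized, exact two-year doubling period, which real chips only approximate), fifty years of doubling multiplies the transistor count by roughly 33 million:

```python
# Idealized Moore's law projection: transistor count doubles
# every ~2 years (a simplifying assumption, not a physical law).
def moores_law_growth(years, doubling_period=2):
    """Return the multiplicative growth factor after `years`."""
    return 2 ** (years / doubling_period)

for span in (10, 20, 50):
    print(f"{span} years -> ~{moores_law_growth(span):,.0f}x more transistors")
```

The exponent is what matters: 25 doublings in 50 years yields a factor of 2^25, which is why the industry’s progress looks so dramatic in hindsight.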
Before I start my main discussion, I would like to briefly review the different phases of computing since the early 1960s. I presented six phases of computing in an earlier blog post, “Darwin’s theory of evolution or the Big Bang: which applies to cloud computing?” IBM has been at the forefront of computing technologies since the inception of the mainframe in the 1960s. Phase one covers the early days of mainframes, when many people shared powerful mainframes through dumb terminals; IBM pioneered mainframe services through time sharing in the 1960s. Phase two saw the rise of personal computers (PCs) and a shift from mainframe to PC. PCs were smaller but powerful enough to serve some customers’ needs. In phase three, people started to connect their computers within local area networks to share resources. In phase four, these local networks were interconnected to create a global network, heralding the birth of the Internet. In this phase users accessed remote applications and resources over the Internet, and the Internet became a commodity for business success. Any tech-savvy person will remember the dot-com bubble of this era. Phase five saw the birth of grid computing, the predecessor of cloud computing, in which compute power and storage were shared through distributed computing. Phase six introduced cloud computing to us, tapping into the technological advancements of the last 50 years.
How did we reach the juncture where we are today? Through the evolution of computing devices over the years, driven by research and development in both industry and academia. It is amazing to see the enormous computing power that we have at our fingertips. The smartphone that I am using today has more processing and storage capacity than my Pentium 1–based PC. Counting the cathode ray tube (CRT) monitor and casing, my Pentium 1 PC was a thousand times bigger in form factor, yet it was equipped with a processor running at a 133 MHz clock rate. Whenever I remember that and realize that my smartphone has a 1.5 GHz dual-core processor, I am amazed by the improvement in technology.
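Comparing raw clock rates alone is a crude yardstick (it ignores architectural, memory and instruction-level improvements, so it understates the real gap), but even this naive arithmetic on the figures above is striking:

```python
# Naive clock-rate comparison: Pentium 1 (133 MHz) versus a
# 1.5 GHz dual-core smartphone. Clock rate is only a rough proxy
# for performance; this just counts aggregate cycles per second.
pentium_mhz = 133
phone_mhz = 1500 * 2  # two cores at 1.5 GHz, counted naively

ratio = phone_mhz / pentium_mhz
print(f"~{ratio:.0f}x the aggregate clock cycles per second")
```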
Does size matter?
The other day, a 15-year-old cousin of mine burst into laughter when he heard that my Pentium 1 PC had a clock rate in the MHz range. That’s right. My smartphone can do more than I ever could with my Pentium 1. I can read books; I can browse the Internet; I can control my entertainment system and TV; I can monitor my home from anywhere in the world. I can store more MP3 songs on my phone than I ever could on my bulky PC. I can even control my laptops and PCs remotely from my phone. The possibilities are endless. We have come a long way.
These days we are demanding more and more computing power, storage and network bandwidth. Obviously a smartphone cannot cope with the requirements of data-crunching tasks on its own. However, it can act as a thin client to a cloud back end, which performs the heavy lifting of computing-intensive tasks and sends the results back to mobile and handheld devices for rendering in a graphical user interface (GUI). The bottom line is that the physical footprint of computing systems is getting smaller and smaller. We are seeing inventions that we could not imagine before. This reminds me of a quote by Henry George, the renowned American writer, politician and political economist:
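The thin-client pattern described above can be sketched in a few lines. This is purely illustrative: the `cloud_backend` function is a local stand-in for what would, in reality, be an HTTPS call to a remote service, and the names are hypothetical:

```python
# Minimal sketch of the thin-client pattern: the device ships a heavy
# task to a back end and only renders the result locally.
def cloud_backend(numbers):
    """Stand-in for the cloud: crunch the data 'server-side'."""
    return {"sum": sum(numbers), "max": max(numbers)}

def thin_client(numbers):
    result = cloud_backend(numbers)  # in reality: a network API call
    # The device's only job is to render the result in the GUI.
    return f"sum={result['sum']}, max={result['max']}"

print(thin_client([3, 1, 4, 1, 5]))
```

The design point is the division of labor: the handheld device stays small and cheap because the compute-intensive work happens elsewhere.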
“The march of invention has clothed mankind with powers of which a century ago the boldest imagination could not have dreamt.”
Compacting more into less
I could provide hundreds of examples here demonstrating technological evolution. Let me start with one related to tape storage, a technology IBM has pioneered and led. The IBM 726 tape unit, the first tape system in the world, introduced in 1952, had a capacity of 1.4 MB and a data rate of 10 KB/s. Now fast forward to 2006: one TS1120 tape has a capacity of 8 TB, offering 1033 gigabits per square centimeter. Since then, a 35 TB tape system has also been introduced.
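Using the figures quoted above (and decimal units throughout), the capacity growth from the 726 to the TS1120-era cartridge works out to a factor of several million:

```python
# Capacity growth of IBM tape, using the figures quoted above.
ibm_726_bytes = 1.4e6   # IBM 726 (1952): 1.4 MB
ts1120_bytes = 8e12     # TS1120 era (2006): 8 TB

factor = ts1120_bytes / ibm_726_bytes
print(f"Capacity grew roughly {factor:,.0f}-fold in 54 years")
```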
The image at the beginning of this post shows an IBM Microdrive next to a chick for comparison. It was the smallest hard disk drive as of 1999. This microdrive, the size of a postage stamp, was introduced by IBM in 1999 with a capacity of 170 MB, which grew to 8 GB by 2006 (source). It is fascinating to see technologies like this profoundly changing the IT landscape. Flash memory, solid-state disks and similar storage technologies are offering greater capacities day by day. Since these devices have no rotating mechanical parts like hard disk drives, they are more energy efficient and smaller in size.
Doing more with less
Since the inception of the mainframe, IBM has dominated the mainframe arena. With mainframes, more computing capacity can be packed into a smaller footprint. Data centers worldwide are challenged by floor space, and some are running out of it. Mainframe technology can consolidate the workload of hundreds of servers onto one machine, and it has started to become popular in the cloud computing domain. A cloud implementation on a mainframe can offer a huge reduction in operating costs and significant improvements in performance and efficiency. For example, IBM’s System z mainframe can simplify a data center by reducing its components by 90 percent through its unique virtualization capability and design principles. The data center of the future will shrink with smarter technology like this. IBM’s new zEnterprise EC12 offers 50 percent more capacity than its predecessor, and thousands of distributed systems can be consolidated onto a single zEnterprise EC12. The value propositions are reduced floor space and software costs, improved efficiency and productivity gains. I will stop here, although I could keep going with examples.
As you can see, these smarter and more efficient building blocks are the ingredients of smarter computing. I am glad that scientists and researchers have been working continuously to make computing devices smaller and smarter. This development will continue, and technological invention will keep making the world our oyster. I hope that, like me, you won’t mind if the crazy scientists keep shrinking our devices to pack more capacity into a smaller form factor. Whenever I watch the movie Honey, I Shrunk the Kids, it will remind me of this technological evolution.
Shamim Hossain is an experienced technical team leader and project manager leading a number of complex and global projects with involvement in the full project lifecycle ranging from planning, analysis, design, test and build through to deployment. He is an IBM Certified Cloud Solution Advisor and Cloud Solution Architect. You can reach him on Twitter @shamimshossain.
To effectively compete in today’s changing world, it is essential that companies leverage innovative technology to differentiate from competitors. Learn how you can do that and more in the Smarter Computing Analyst Paper from Hurwitz and Associates.