[dropcap]O[/dropcap]ver the past 50 years, computing has come a long way: from the simple integrated circuits of the '60s to the VLSI chips of the '90s; from multi-core processors to processor clusters; through parallel, networked and distributed computing, all the way to computing grids and clouds.
But what next? Welcome to Jungle Computing!
The Ever-Increasing Need for Computing Power
Imagine modelling the climate, an inherently chaotic system with a huge number of variables, inter-dependencies, sensitivities and uncertainties. Collecting relevant data on a global scale, managing it, processing it, and solving the vast sets of equations needed for acceptable forecasting accuracy in near real time is a task that sends shivers down the spine of even the most muscular computational facility. The hunger for computational power is insatiable, because the computational tasks are boundless.
Jungle Computing to the Rescue
How to handle climate modelling and other similar tasks?
The solution is to use all the available computing and processing muscle simultaneously, i.e. to harness a large number of diverse, distributed, high-performance computing resources working in unison: multi-core processors, GPUs, FPGAs, and more traditional supercomputers, all connected by high-speed networks. This is what is termed Jungle Computing.
Jungle Computing essentially relies on the ability to use multiple diverse platforms and systems simultaneously. Whether it is climate modelling, drug design, remote sensing, astronomy or neuroinformatics, the jungle welcomes them all.
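To make the idea concrete, here is a minimal sketch of the core pattern: independent tasks fanned out simultaneously over a heterogeneous pool of resources. The resource names and the `run_on` function are purely illustrative stand-ins; a real Jungle Computing system would dispatch work through each platform's own middleware rather than local threads.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for diverse resources in the "jungle";
# in practice these could be a GPU cluster, an FPGA board, a grid
# site or a cloud VM, each reached through its own middleware.
RESOURCES = ["local-cpu", "gpu-cluster", "fpga-board", "cloud-vm"]

def run_on(resource, task):
    # Placeholder: a real system would submit `task` to `resource`
    # remotely; here we simply simulate the completed computation.
    return f"{task} done on {resource}"

tasks = [f"task-{i}" for i in range(8)]

# Fan independent tasks out over all available resources at once.
with ThreadPoolExecutor(max_workers=len(RESOURCES)) as pool:
    futures = [pool.submit(run_on, RESOURCES[i % len(RESOURCES)], t)
               for i, t in enumerate(tasks)]
    results = [f.result() for f in futures]

print(results)
```

The essential point is not the thread pool itself but the round-robin dispatch: every class of resource is kept busy at the same time, which is exactly what distinguishes the jungle from a single homogeneous cluster.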
It is expected that the new beast known as 'Big Data' will have many of its species and relations inhabiting the jungle.
Vive la jungle!