Data Tsunami

The coming data tsunami (aka "Big Data") threatens to swamp enterprises that are ill-prepared to manage and analyze massive data sets. Indeed, the very units used to describe data volumes are shifting: gigabytes and terabytes are rapidly giving way to petabytes, exabytes, and zettabytes (1 ZB = 10^21 bytes).
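
To put these prefixes in perspective, the following short sketch (in Python, using decimal SI multiples, so each step is a factor of 1,000) prints each unit's size in bytes and as a multiple of a gigabyte:

    # Decimal (SI) byte units; each step is a factor of 1,000 larger than the last.
    units = {
        "gigabyte (GB)":  10**9,
        "terabyte (TB)":  10**12,
        "petabyte (PB)":  10**15,
        "exabyte (EB)":   10**18,
        "zettabyte (ZB)": 10**21,
    }

    for name, size in units.items():
        # Show each unit in bytes (as a power of ten) and as a multiple of a gigabyte.
        print(f"1 {name} = 10^{len(str(size)) - 1} bytes = {size // 10**9:,} GB")

A single zettabyte, in other words, is a trillion gigabytes.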

Petaflop- and exaflop-scale processing speeds, together with advances in storage technology, are enabling companies to explore new business models, such as genomics and personalized drug dosing in the life sciences.

A multitude of information technologies are increasingly being adopted by CIOs in an effort to stay afloat in this sea of data. Enterprise Data Warehouse (EDW) solutions, for example, have become major growth businesses for firms such as Teradata. Over the past year, cloud services and distributed computing technologies have begun to offer enterprises the ability to leverage scalable compute and storage resources on a virtualized basis. With cloud-based storage and compute services, the notion of owning IT infrastructure (i.e., servers and data centers) may come to seem quaint, if not somewhat bizarre, in the years ahead. Yet for all the promise of cloud computing, a potential barrier looms large: the performance inadequacies of typical enterprise networks and public Internet connections, which lack the bandwidth needed to fully unleash the technology's potential.
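
To make the bandwidth constraint concrete, here is a rough back-of-envelope sketch (the link speeds are illustrative assumptions, not measurements) of how long it would take to move a single petabyte over common network links:

    # Back-of-envelope: time to move 1 PB over links of various (assumed) speeds.
    DATA_BITS = 10**15 * 8          # 1 petabyte expressed in bits

    link_speeds_bps = {
        "100 Mbps office uplink":   100 * 10**6,
        "1 Gbps enterprise LAN":      1 * 10**9,
        "10 Gbps data-center link":  10 * 10**9,
    }

    for link, bps in link_speeds_bps.items():
        days = DATA_BITS / bps / 86_400   # 86,400 seconds in a day
        print(f"1 PB over a {link}: ~{days:,.1f} days of continuous transfer")

At 1 Gbps, moving a petabyte takes roughly three months of continuous transfer; even at 10 Gbps it takes more than a week. Under these assumptions, the limiting factor for cloud adoption is often the network, not storage or compute.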