Growth in computer power is being dwarfed by a tremendous increase in data. Genome sequencing data alone has increased more than 10⁵-fold in less than six years. Large-scale simulations are producing petabytes and even exabytes of data that simply cannot be handled effectively, or in a reasonable time, by traditional methods. Managing this data tsunami is one of the most pressing challenges of our time.
At Argonne, we are tackling this challenge by developing new techniques and tools for storing, transferring, accessing, visualizing, and analyzing extremely large datasets. Our aim is to enable researchers to make use of their “big data” for “big science,” leading to new discoveries and breakthroughs.