Size, Speed of Data Transfer Increasing

Indiana University (IU) is looking to widen lanes and raise the speed limit on the information superhighway — it recently introduced the Data Capacitor, a file system designed to store and manipulate large data sets.


The Data Capacitor has a single-client transfer rate of 977 megabytes per second across the TeraGrid network, an open scientific discovery infrastructure that combines large computing resources at nine sites partnered with the National Science Foundation, such as supercomputer centers and universities, into a geographically diverse computational resource.


Work on the Data Capacitor began with a $1.72 million grant from the National Science Foundation in late 2005. Steven Simms, Data Capacitor project leader, said the idea behind the system was to create a facility that would do three things: provide large-capacity storage, provide fast storage, and give researchers a way to find large data sets after transfer.


“The premise is that digital instruments these days, including machines that are producing simulation data, produce that data at an alarming rate,” Simms said. “I like to call it the ‘data fire hose.’ If you’re going to capture the data, that means you’ve got to be able to ingest that data quickly, and if your simulations are running for a long time, you’ve got to have hundreds of terabytes of space to accommodate multiple streams of this kind of data from different departments. So, we set down this path: We started mounting the file system on multiple locations, tying local…

