TeraByte

1,000 GigaBytes.

A stupidly large amount of data.

Areas where such amounts commonly pile up include render farms, weather services, car crash simulations, nuclear chain reaction simulations, and large-scale scientific simulation in general. These commonly run iterative computations on a very large three-dimensional model, digitized as a high (spatial) resolution grid. A single iteration typically produces many GigaBytes of data, and the iterations are usually run at high (temporal) resolution. As a result, you quickly have to deal with insanely huge piles of data, as the sketch below illustrates.
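
To see how fast this adds up, here is a back-of-the-envelope sketch in Python. The grid size, the number of state variables per cell, and the step count are made-up illustrative assumptions, not figures from any particular simulation.

  # Rough estimate of the output volume of a grid-based simulation.
  # All concrete numbers below are illustrative assumptions.
  grid_points = 1000 ** 3    # 1000 x 1000 x 1000 spatial grid
  variables = 5              # e.g. density, pressure, 3 velocity components
  bytes_per_value = 8        # double-precision floats

  bytes_per_step = grid_points * variables * bytes_per_value
  print(f"per iteration: {bytes_per_step / 1e9:.0f} GB")           # 40 GB

  steps = 1000               # high temporal resolution
  print(f"whole run:     {bytes_per_step * steps / 1e12:.0f} TB")  # 40 TB

Under these assumptions, each iteration writes 40 GigaBytes, so a modest 1,000-step run already produces 40 TeraBytes of output.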