Annotated edit history of TeraByte version 2, including all changes.
1,000 [GigaByte]s.

A stupidly large amount of data.

Areas where it is common to deal with datasets of this size include rendering farms, weather services, car crash simulations, nuclear chain reaction simulations, and scientific simulation in general. These commonly run iterative computations on a very large three-dimensional model, digitized as a high (spatial) resolution grid. A single iteration typically produces many [GigaByte]s of data, and the iterations are usually run at high (temporal) resolution. As a result, you quickly have to deal with ridiculously huge piles of data.
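A back-of-the-envelope calculation makes the scale concrete. The grid size, field count, and number of saved time steps below are hypothetical, chosen only to illustrate how a high-resolution iterative simulation reaches terabyte territory:

```python
def snapshot_size_bytes(points_per_axis, fields, bytes_per_value=8):
    """Size of one iteration's output: a cubic 3D grid where each
    grid point stores `fields` double-precision values."""
    return points_per_axis ** 3 * fields * bytes_per_value

# Hypothetical example: a 1024^3 grid storing 5 physical fields
# (say density, pressure, and three velocity components).
per_iteration = snapshot_size_bytes(1024, 5)

# Saving only 25 time steps already approaches a terabyte.
total = per_iteration * 25

print(per_iteration / 2**30, "GiB per iteration")  # 40.0 GiB
print(total / 2**40, "TiB total")
```

With 1,024 points per axis, one snapshot is 1,024³ × 5 × 8 bytes = 40 GiB, so even a modest run at high temporal resolution piles up terabytes quickly.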