1,000 GigaBytes.
A stupidly large amount of data.
Areas where it's common to deal with datasets of this size include rendering farms, weather services, car crash simulations, nuclear chain reaction simulations, and scientific simulations in general. These commonly run iterative computations on a very large three-dimensional model digitized as a high (spatial) resolution grid. A single iteration typically produces many GigaBytes of data, and the iterations are usually run at high (temporal) resolution. As a result, you quickly end up with ridiculously huge piles of data.
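To get a feel for how fast that adds up, here's a back-of-the-envelope sketch. The grid size, number of fields per cell, and iteration count are plausible assumptions picked for illustration, not figures from any particular simulation:

```python
# Back-of-the-envelope estimate of simulation output size.
# All numbers below are assumptions chosen for illustration.

cells = 1024 ** 3          # a 1024 x 1024 x 1024 spatial grid
fields_per_cell = 5        # e.g. density, pressure, and three velocity components
bytes_per_value = 8        # double-precision floats
iterations = 100           # number of time steps written to disk

bytes_per_iteration = cells * fields_per_cell * bytes_per_value
total_bytes = bytes_per_iteration * iterations

print(f"per iteration: {bytes_per_iteration / 1e9:.1f} GB")   # ~42.9 GB
print(f"total:         {total_bytes / 1e12:.1f} TB")          # ~4.3 TB
```

Even this modest setup blows past a TeraByte within a few dozen time steps, and real simulations often run at far higher resolution for far longer.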