
Differences between current version and previous revision of TeraByte.


Newer page: version 2 Last edited on Thursday, October 30, 2003 11:48:11 am by AristotlePagaltzis
Older page: version 1 Last edited on Thursday, October 30, 2003 8:11:52 am by AristotlePagaltzis
@@ -1,5 +1,5 @@
 1,000 [GigaByte]s. 
  
 A stupidly large amount of data. 
  
-Areas where it's common to pile up such amounts are rendering farms, weather services, car crash simulations, nuclear chain reaction simulations, and generally any high scientific similation. These commonly run iterative computations on a very large three dimensional model digitized as a high (spatial) resolution grid. A single iteration typically produces many [GigaByte]s of data, and the iterations are usually run at high (temporal) resolution. As a result, you quickly have to deal with insanely huge piles of data.
+Areas where it's common to deal with datasets of such dimensions are rendering farms, weather services, car crash simulations, nuclear chain reaction simulations, and generally any scientific simulation. These commonly run iterative computations on a very large three dimensional model digitized as a high (spatial) resolution grid. A single iteration typically produces many [GigaByte]s of data, and the iterations are usually run at high (temporal) resolution. As a result, you quickly have to deal with ridiculously huge piles of data.
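
To put "many [GigaByte]s per iteration" into perspective, here is a rough back-of-the-envelope sketch in Python; the grid dimensions, field count and iteration count are illustrative assumptions, not figures from the page above.

    # Storage footprint of an iterative grid simulation run.
    # Grid size, field count and iteration count are made-up but plausible numbers.
    grid_points = 1000 ** 3      # 1000 x 1000 x 1000 spatial grid
    fields = 5                   # e.g. pressure, temperature, three velocity components
    bytes_per_value = 8          # double-precision float

    per_iteration = grid_points * fields * bytes_per_value   # bytes written per snapshot
    iterations = 10_000                                       # high temporal resolution

    print(per_iteration / 1e9, "GB per iteration")            # 40.0 GB
    print(per_iteration * iterations / 1e12, "TB per run")    # 400.0 TB

Even with these modest assumptions, a single run lands in the hundreds of [TeraByte]s, which is how such fields end up with the piles of data described above.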