A strategy for reducing the cost of [Cache] misses.

On a cache miss, more data is read from the backend than was actually requested, which increases the likelihood of data already being in the [Cache] by the time it is needed. This is (perhaps surprisingly) effective because truly random access patterns usually occur only within tightly confined areas (such as a function's local variables on the stack, in the case of the [CPU] memory [Cache]), while accesses outside already-cached areas tend to be long, sequential reads or writes.
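As a rough illustration of the mechanism, here is a minimal Python sketch for a block cache: on a miss, it fetches a whole window of blocks starting at the requested one, rather than just that one block. The class name, the READ_AHEAD_BLOCKS constant and the backend.read_block() interface are invented for this example; this is not how any particular cache (such as an operating system's page cache) actually implements read-ahead.

    READ_AHEAD_BLOCKS = 8  # size of the read-ahead window, in blocks

    class ReadAheadCache:
        def __init__(self, backend):
            self.backend = backend  # assumed to expose read_block(n) -> bytes
            self.cache = {}         # block number -> data

        def read_block(self, n):
            if n in self.cache:     # hit: the backend is not touched at all
                return self.cache[n]
            # Miss: fetch the requested block plus the blocks following it,
            # betting that the caller is reading sequentially.
            for i in range(n, n + READ_AHEAD_BLOCKS):
                self.cache[i] = self.backend.read_block(i)
            return self.cache[n]

A real implementation would normally also check that the access pattern really is sequential (and scale the window accordingly) rather than reading ahead unconditionally as this sketch does.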
The [Cache] may even serve the requested piece of data while the rest of the ReadAhead is still arriving, which reduces the latency of responses to cache misses.
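Extending the hypothetical sketch above, the following variant illustrates that behaviour: the requested block is returned as soon as the backend delivers it, and the rest of the read-ahead window is filled in from a background thread. The names and the use of a thread are again purely illustrative, and a real implementation would need proper locking around the shared cache.

    import threading

    class AsyncReadAheadCache(ReadAheadCache):
        def read_block(self, n):
            if n in self.cache:
                return self.cache[n]
            data = self.backend.read_block(n)  # fetch only what was asked for
            self.cache[n] = data
            # Fill the rest of the read-ahead window in the background, so
            # the caller is not kept waiting for data it never asked for.
            def fill_window():
                for i in range(n + 1, n + READ_AHEAD_BLOCKS):
                    if i not in self.cache:
                        self.cache[i] = self.backend.read_block(i)
            threading.Thread(target=fill_window, daemon=True).start()
            return data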