Comp Sci: terminology: what is a buffer?

The tutor finally defines what a buffer means in computer science; he has long wondered.

I’ve always had a notion that a buffer means an allocation of memory (RAM) devoted to a specific purpose (a process). Maybe the process is using the buffer, maybe not – but it’s available to the process when needed. The buffer itself is finite, but probably sufficient for instantaneous purposes. Overflow goes to disk, which is slower. That’s always been my conception of a buffer in comp sci.

Looking up the term, I’m apparently right – but not as articulate as Perchik, a contributor to Stack Overflow.

The advantage of a buffer arises from the fact that every call for data on disk is expensive. Therefore, rather than calling (the OS or other resource) for the data one character at a time, it’s more efficient to request a large chunk of data once in a while, which is stored in the buffer until needed.
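The idea above can be sketched in Python, which buffers file reads by default. This is a minimal illustration, not a benchmark: the file name and contents are made up for the example. Both loops read the same bytes one at a time, but the buffered version serves most of those reads from an in-memory chunk rather than going back to the disk each time.

```python
import os
import tempfile

# Create a small sample file (hypothetical contents, just for illustration).
path = os.path.join(tempfile.gettempdir(), "buffer_demo.txt")
with open(path, "w") as f:
    f.write("hello buffered world\n" * 1000)

# Unbuffered: buffering=0 means each read(1) is a separate request
# to the operating system for a single byte.
unbuffered_count = 0
with open(path, "rb", buffering=0) as f:
    while f.read(1):
        unbuffered_count += 1

# Buffered (the default): Python fetches a large chunk at once and
# hands out bytes from memory, so most read(1) calls never touch the disk.
buffered_count = 0
with open(path, "rb") as f:
    while f.read(1):
        buffered_count += 1

# Same data either way - the difference is how many expensive calls were made.
assert unbuffered_count == buffered_count
print(buffered_count)
os.remove(path)
```

Running it prints the total byte count; the payoff of the buffered version is fewer trips to the operating system, exactly the trade-off described above.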

Buffers can be used, for example, when loading graphics, reading files, or downloading files.

Source:

docs.oracle.com

Jack of Oracle Tutoring by Jack and Diane, Campbell River, BC.
