Understanding when to buffer in Java and when NOT to

So I'm studying for my upcoming Java exam, and in the tutorial one of the short-answer questions is about when to use a buffer and when not to. I did a fairly comprehensive search on this topic and didn't come up with anything specific, so I figured I'd ask for my own benefit, as well as for others who might have a similar question.

From what I've gathered, using a buffer is generally preferred, as it is less resource-intensive than reading from disk byte-by-byte (which is what happens without a buffer). Now, my question is: when would you NOT prefer to use a buffer when processing files in Java? My best guess would be if the file were extremely small, which would make the buffer somewhat redundant, but I'm not entirely sure about that.

Also, a quick rundown of what buffering actually is would be awesome (another short-answer question). I read that it is just a space in memory where the data being read or written is stored in large chunks, rather than being handled directly on disk. Is this the correct description? Perhaps too simplistic? A clarification here would also be awesome. Thanks :)



1 answer


Buffering happens between one type of medium (say, RAM, which is fast but limited in size) and another (the hard drive, which is large but, since everything has a trade-off, slow).

For the hard disk controller, it makes little difference whether it writes one byte or one kilobyte. So, instead of doing, say, 1024 writes of 1 byte each, it is much faster to do one write of 1024 bytes.
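
As a rough Java sketch of that difference (the file names are placeholders): the unbuffered stream below issues one OS-level write per byte, while the buffered wrapper collects the bytes in memory and hands them to the OS in large chunks:

```java
import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class BufferDemo {
    public static void main(String[] args) throws IOException {
        // Unbuffered: every write(b) goes straight to the OS -- 1024 calls.
        try (FileOutputStream out = new FileOutputStream("unbuffered.bin")) {
            for (int i = 0; i < 1024; i++) {
                out.write(i & 0xFF);
            }
        }

        // Buffered: the bytes accumulate in an in-memory array and are
        // flushed to the OS in one large chunk.
        try (BufferedOutputStream out =
                 new BufferedOutputStream(new FileOutputStream("buffered.bin"))) {
            for (int i = 0; i < 1024; i++) {
                out.write(i & 0xFF);
            }
        } // close() flushes whatever is still sitting in the buffer
    }
}
```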

I think using a buffer can be inefficient (even harmful) when you need to persist data as quickly as possible, without waiting for the buffer to fill: for example, when writing some kind of log. (I know of a case where this made debugging a Linux kernel panic harder: because of buffering, the log was never completely written to the file, so the developer couldn't see what had happened.)
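
In Java, the usual fix is to call flush() after each critical record (or to skip the buffer entirely). A minimal sketch, with the file name and log message made up:

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class CrashLog {
    public static void main(String[] args) throws IOException {
        try (BufferedWriter log = new BufferedWriter(new FileWriter("crash.log", true))) {
            log.write("entering risky section");
            log.newLine();
            // Push the buffered bytes out to the OS *now*, so the entry
            // survives even if the process dies before the buffer fills.
            log.flush();
            riskyOperation();
        }
    }

    private static void riskyOperation() {
        // stand-in for code that might crash the process
    }
}
```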

Also, the buffer itself takes up space. Still, buffering is usually the preferred option; the only thing you really need to decide is the size of the buffer. Even an unbuffered stream can be thought of as using a buffer of size 1 (or should I say zero?).
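
In Java, that decision is just a constructor argument. A small sketch ("data.bin" is a placeholder; the default buffer size is 8192 bytes in current JDKs, and the 64 KiB below is only an example value, not a recommendation):

```java
import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.IOException;

public class BufferSizes {
    public static void main(String[] args) throws IOException {
        // Default buffer size (8192 bytes in current JDKs):
        try (BufferedInputStream in =
                 new BufferedInputStream(new FileInputStream("data.bin"))) {
            in.read();
        }

        // Explicit 64 KiB buffer -- an example value, tune it to your workload:
        try (BufferedInputStream in =
                 new BufferedInputStream(new FileInputStream("data.bin"), 64 * 1024)) {
            in.read();
        }
    }
}
```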



You can think of buffering as a concept/pattern for connecting media of different speeds. You collect data in a buffer and move it to or from the slow medium in bulk. Since that transfer is slow, you can use the time it takes to collect the next batch of data. Ideally, the size of your buffer matches the throughput you want to achieve. This asynchronous operation can significantly increase the transfer rate compared to unbuffered, blocking operations.
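
Here is a minimal sketch of that overlap (all names are made up; a BlockingQueue plays the buffer, a background thread plays the slow medium, and sleep() stands in for device latency):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class AsyncBufferSketch {
    public static void main(String[] args) throws InterruptedException {
        // The buffer sitting between the fast producer and the slow "medium".
        BlockingQueue<byte[]> buffer = new ArrayBlockingQueue<>(4);

        // Consumer thread: drains chunks and "writes" them slowly.
        Thread writer = new Thread(() -> {
            try {
                while (true) {
                    byte[] chunk = buffer.take();
                    if (chunk.length == 0) break; // empty chunk = end-of-data marker
                    Thread.sleep(50);             // simulate the slow device
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        writer.start();

        // Producer: prepares the next chunk while the writer is still busy.
        for (int i = 0; i < 16; i++) {
            buffer.put(new byte[1024]); // blocks only when the buffer is full
        }
        buffer.put(new byte[0]);        // signal end of data
        writer.join();
    }
}
```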

Hope this sheds some light on the topic.

ADDED: This answer wouldn't be complete without a Wikipedia reference, so here it is: Data Buffer
