How to reduce java heap usage when reading 50MB files?

I am developing a web application that reads very large files (> 50 MB) and displays their contents. The Java Spring backend reads these files and serves their content using CXF. My problem is that after reading a 50 MB file, the used heap size increases by 500 MB. I was reading these files as a String, and that String is then posted to the frontend. Are there any ideas how to reduce Java heap usage? I tried NIO and Spring's Resource class, but nothing helped.



1 answer


The quick-and-dirty way is to have the Spring @Controller method take an argument of type OutputStream or Writer: the framework will pass in the raw HTTP response output stream, and you can write directly into it. This bypasses all the nice content-type management logic, etc.
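As a sketch of that approach (the class and method names here are my own, not from the original answer), the controller body reduces to a fixed-buffer copy loop; in Spring, the OutputStream would be the response stream the framework injects as a method parameter:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class FileStreamer {

    /**
     * Copies a file into the given stream through one fixed-size buffer,
     * so heap usage stays constant regardless of the file's size.
     */
    public static void copyTo(Path file, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192]; // single reusable 8 KB buffer
        try (InputStream in = Files.newInputStream(file)) {
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }
}
```

In a controller this would look like `public void download(OutputStream out)` with a call to `FileStreamer.copyTo(path, out)` in the body; you then have to set the Content-Type and other headers yourself.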

The nicer way is to define a custom type to return from the controller method, plus a corresponding HttpMessageConverter that (for example) uses the information in that object to open the appropriate file and write its contents to the response output stream.
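A minimal sketch of that converter, assuming Spring MVC's AbstractHttpMessageConverter is on the classpath; the FileDownload value type and FileDownloadConverter name are hypothetical, invented for illustration:

```java
// Sketch only: requires Spring MVC on the classpath.
// FileDownload is a hypothetical value object holding the path to serve.
public class FileDownload {
    private final java.nio.file.Path path;
    public FileDownload(java.nio.file.Path path) { this.path = path; }
    public java.nio.file.Path getPath() { return path; }
}

public class FileDownloadConverter
        extends org.springframework.http.converter.AbstractHttpMessageConverter<FileDownload> {

    public FileDownloadConverter() {
        super(org.springframework.http.MediaType.APPLICATION_OCTET_STREAM);
    }

    @Override
    protected boolean supports(Class<?> clazz) {
        return FileDownload.class.isAssignableFrom(clazz);
    }

    @Override
    protected FileDownload readInternal(Class<? extends FileDownload> clazz,
            org.springframework.http.HttpInputMessage input) {
        throw new UnsupportedOperationException("download only");
    }

    @Override
    protected void writeInternal(FileDownload download,
            org.springframework.http.HttpOutputMessage output) throws java.io.IOException {
        // Stream the file straight into the response body; Files.copy
        // uses a small internal buffer, never the whole file in memory.
        java.nio.file.Files.copy(download.getPath(), output.getBody());
    }
}
```

Register the converter with your MVC configuration (for example via configureMessageConverters in a WebMvcConfigurer) and simply return a FileDownload from the controller method; Spring then picks the converter and handles content negotiation for you.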



In both cases, you won't be reading the whole file into memory; you will use a single fixed-size buffer to transfer the data directly from disk to the HTTP response.
