How to write a large amount of data to a text file using JDBC without running out of memory

I am trying to create a dump file from a database using JDBC. The file should be about 300 MB in size and contain 1.2 to 1.5 million entries across ten columns, but I am running out of memory at around 250,000 rows.

My question is: does Java keep the entire ResultSet in memory? I set the ResultSet to be forward-only and read-only in the hope that rows I had already written out would be released from memory, but that doesn't seem to be the case.

Any help would be greatly appreciated.

+2




4 answers


You should call setFetchSize on the Statement object. The following example fetches only 1000 records at a time from the ResultSet:

// Ask the driver to retrieve rows from the server in batches of 1000
// instead of materializing the whole result set in memory at once.
Connection con = DriverManager.getConnection("jdbc:my_subprotocol:my_subname");
Statement stmt = con.createStatement();
stmt.setFetchSize(1000);
ResultSet rs = stmt.executeQuery("SELECT * FROM your_table");

      



Here's the Javadoc:

http://java.sun.com/javase/6/docs/api/java/sql/Statement.html#setFetchSize(int)
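One caveat, which is driver-specific: some drivers treat the fetch size only as a hint. With MySQL Connector/J, for example, streaming a large result set requires a forward-only, read-only statement and a fetch size of Integer.MIN_VALUE. A minimal sketch, reusing the same placeholder table:

Statement stmt = con.createStatement(ResultSet.TYPE_FORWARD_ONLY,
                                     ResultSet.CONCUR_READ_ONLY);
stmt.setFetchSize(Integer.MIN_VALUE); // Connector/J then streams rows one at a time
ResultSet rs = stmt.executeQuery("SELECT * FROM your_table");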

+7




If you fetch all the data first and only write it to a file as a second step, at some point the entire result set will be in memory at once. If you stream the data to the file as you read it, that never happens. That is, instead of

data = ...
while (rs.next()) {
    // add rs values to data
}
// write data to file

      



something like

file = ...
while (rs.next()) {
    // write rs values straight to the file
}
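
As a rough sketch of that streaming approach (the connection URL, table, and output file name are just placeholders), each row is written out as soon as it is read, so only one row is held in memory at a time:

try (Connection con = DriverManager.getConnection("jdbc:my_subprotocol:my_subname");
     Statement stmt = con.createStatement();
     ResultSet rs = stmt.executeQuery("SELECT * FROM your_table");
     BufferedWriter out = new BufferedWriter(new FileWriter("dump.txt"))) {

    int columnCount = rs.getMetaData().getColumnCount();
    while (rs.next()) {
        StringBuilder line = new StringBuilder();
        for (int i = 1; i <= columnCount; i++) {
            if (i > 1) {
                line.append('\t');   // tab-separated columns
            }
            line.append(rs.getString(i));
        }
        out.write(line.toString());
        out.newLine();               // one row per line, nothing accumulates
    }
}

Combine this with setFetchSize from the answer above so the driver does not buffer the whole result set on its side either.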

      

+1




The buffering / deallocation behavior depends on the JDBC driver as well as the JVM garbage collector. Are you doing any buffering of your own before writing to the FileOutputStream? That could keep the data reachable and prevent it from being garbage collected.

Try adjusting the Java heap size with the -Xms and -Xmx arguments to the java runtime. Example:

java -Xms1024M -Xmx1024M com.mypkg.MyResultSetReader

The above command makes 1 GB of heap space available to your application. If this is a one-off tool, or you don't expect the result set to grow in size, this might be acceptable as a permanent solution.
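If it is not obvious where the memory is going, a quick way to watch heap usage while the dump runs is to log it every few thousand rows (a diagnostic sketch only, not part of the fix):

Runtime rt = Runtime.getRuntime();
long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
System.out.println("Heap in use: " + usedMb + " MB");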

0




Setting the fetch size should help, but it really depends on the JDBC driver. Alternatively, you can read the data iteratively, in small chunks keyed on the id column:

PreparedStatement stm = conn.prepareStatement("...where id > ? order by id");
stm.setMaxRows(100);

int lastId = 0;
while (true) {
    stm.setInt(1, lastId);
    ResultSet results = stm.executeQuery();

    boolean gotRows = false;
    while (results.next()) {
        gotRows = true;
        // process the row and remember where this chunk ended
        lastId = results.getInt("id");
    }
    results.close();

    if (!gotRows) {
        break;   // no more rows to read
    }
}
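
One thing to check, since it varies by driver: setMaxRows may be enforced on the client side, in which case the server still produces more rows per chunk than you need. Putting the limit into the SQL itself, e.g. ...where id > ? order by id limit 100 (syntax varies by database), is a safer way to bound each round trip.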

      

0








