OutOfMemoryError java jdbc prepareStatement GC overhead limit exceeded

I am writing a tool that migrates data from an old schema to a new schema in an Oracle database. The details of the schema mapping are not important here.

There are about twenty tables in my database. Only two of them are big, with up to four million records each; the others are small (maybe ten thousand to one hundred thousand records).

Currently I use one thread to process all the small tables sequentially, and I divide each big table into chunks of one million records, with one worker thread per chunk.

Now I have a problem. When I run the program, everything is fine at first, but after it has been running for a while I get this error:

Exception in thread "Thread-8" java.lang.OutOfMemoryError: GC overhead limit exceeded
at oracle.jdbc.driver.OracleBlobInputStream.needBytes(OracleBlobInputStream.java:168)
at oracle.jdbc.driver.OracleBufferedStream.readInternal(OracleBufferedStream.java:178)
at oracle.jdbc.driver.OracleBufferedStream.read(OracleBufferedStream.java:147)
at oracle.jdbc.driver.OracleBufferedStream.read(OracleBufferedStream.java:137)
at oracle.jdbc.driver.BlobAccessor.getBytes(BlobAccessor.java:249)
at oracle.jdbc.driver.OracleResultSetImpl.getBytes(OracleResultSetImpl.java:714)
at oracle.jdbc.driver.OracleResultSet.getBytes(OracleResultSet.java:1625)
at datatransfer.processor.CProcessor.write(CProcessor.java:111)
at datatransfer.processor.Processor.process(Processor.java:77)
at datatransfer.thread.CThread.run(CThread.java:37)


I have checked my program: there is no infinite loop, and I do close every Statement and ResultSet. Each thread has its own Connection.

How can I find out what is consuming the memory in my program? And is there a way to solve this problem?

    ResultSet rsSrc = statement.executeQuery(sql);   // read from the old schema
    int count = 0;
    long start = System.currentTimeMillis();
    while (rsSrc.next()) {
        preStatement.setString(1, rsSrc.getString(1));
        preStatement.setString(2, rsSrc.getString(2));
        preStatement.setString(3, rsSrc.getString(3));
        preStatement.setString(4, rsSrc.getString(4));
        preStatement.setString(5, rsSrc.getString(5));
        preStatement.setString(6, rsSrc.getString(6));
        preStatement.addBatch();
        count++;
        if (count % batchSize == 0) {
            preStatement.executeBatch();
            preStatement.clearBatch();
            // committing here as well would keep the open transaction small
        }
    }
    preStatement.executeBatch();   // flush the last partial batch
    preStatement.clearBatch();
    writeConn.commit();
    long end = System.currentTimeMillis();


statement and preStatement are created from two different Connections: one for the old schema, the other for the new schema.

Is there something wrong with my code?
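The stack trace points at OracleResultSet.getBytes reading a BLOB column, so one likely cause is materializing whole BLOBs in memory with getBytes and never releasing the LOB locators. Below is a minimal sketch of the alternative: copy the value as a stream in small chunks, and free the locator after each row. The column index, variable names and the use of executeUpdate instead of batching are assumptions for illustration; only the chunked copy helper is runnable here.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class BlobCopy {
    /** Copies a stream in fixed-size chunks so only one small buffer is live
     *  at a time, instead of materializing the whole BLOB via rs.getBytes(). */
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    // In the JDBC loop this would look roughly like (hypothetical names):
    //
    //   java.sql.Blob blob = rsSrc.getBlob(6);
    //   try (InputStream in = blob.getBinaryStream()) {
    //       preStatement.setBinaryStream(6, in, blob.length());
    //       preStatement.executeUpdate();
    //   } finally {
    //       blob.free();   // release the LOB locator; otherwise it can hold
    //   }                  // resources until the connection is closed

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[100000];
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        System.out.println(copy(new ByteArrayInputStream(data), sink)); // prints 100000
    }
}
```

Blob.free() requires JDBC 4 (Java 6+); on older drivers the Oracle-specific temporary-LOB cleanup would be needed instead.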





3 answers


Try analyzing the creation of instances/objects in memory with jvisualvm; it usually tells you right away whether you are leaking or not. (It is a GUI, don't panic ;-))

Doc -> https://docs.oracle.com/javase/6/docs/technotes/tools/share/jvisualvm.html

It's a profiler, so it will show you where you are spending your time, how many instances of your classes you have, and basically what happens in your application while it is running.



It is installed by default with the official Oracle JDK on Linux!

If the memory usage is nearly constant but close to the limit, try increasing the heap (e.g. -Xmx2G).
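If you cannot attach a GUI profiler to the running migration, a crude alternative is to log heap usage from the code itself, for example once per processed chunk; a "used" value that climbs steadily even after full GCs suggests a leak. A minimal sketch using only java.lang.Runtime (class and method names are illustrative):

```java
public class HeapLogger {
    /** Currently used heap in bytes (total allocated minus free). */
    static long usedHeap() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) {
        // Call this e.g. after every executed batch in the migration loop.
        System.out.printf("used=%d max=%d%n",
                usedHeap(), Runtime.getRuntime().maxMemory());
    }
}
```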



One possible solution to the problem is simply to increase the size of the heap available to Eclipse. You can do this by opening the eclipse.ini file located in the eclipse installation folder.

After opening the file, you can add -Xmx2048M, which will give you 2GB of heap for your Eclipse.

This solution depends on how powerful your system is and how much of the total heap you can give to Eclipse.





Another way to deal with this problem is to process the large tables in smaller chunks.
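Smaller chunks only require a way to split the key range; a sketch of that split, assuming the big tables have a numeric primary key (the class name, the bounds and the `WHERE id >= lo AND id < hi` query shape are illustrative assumptions):

```java
public class Chunker {
    /** Splits the key range [minId, maxId] into half-open chunks
     *  [lo, hi) of at most chunkSize keys each. */
    static long[][] chunks(long minId, long maxId, long chunkSize) {
        int n = (int) ((maxId - minId) / chunkSize) + 1;
        long[][] out = new long[n][2];
        for (int i = 0; i < n; i++) {
            long lo = minId + i * chunkSize;
            out[i][0] = lo;
            out[i][1] = Math.min(lo + chunkSize, maxId + 1); // exclusive bound
        }
        return out;
    }

    public static void main(String[] args) {
        // Four million ids in chunks of one million -> 4 chunks; the worker
        // for a chunk would then run: SELECT ... WHERE id >= lo AND id < hi
        for (long[] c : chunks(1, 4000000, 1000000)) {
            System.out.println(c[0] + ".." + c[1]);
        }
    }
}
```

With each worker reading only one chunk at a time, the per-thread memory footprint is bounded by the chunk size rather than the table size.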


If you want to dig deeper and find the specific cause of the OOM, you can create a heap dump (or several heap dumps) and analyze it with the Eclipse Memory Analyzer, http://www.eclipse.org/mat/ , which is free and was originally contributed by SAP and IBM. It is a really powerful tool.



In my case, I had to push hundreds of thousands of INSERT statements from Matlab into the database. I also got a GC overhead exception:

java.sql.SQLException: java.lang.OutOfMemoryError: GC overhead limit exceeded


My solution was to close the DB connection every few thousand (in this case 2000) INSERT statements, delete and clear the object from the Matlab workspace, and then open a new connection.

classdef MySqlService < handle    
    properties
        db;
        counter = 0;   % queries executed so far
        dblimit = 0;   % counter value at the last reconnect
    end    
    methods
        function x = executeQuery(obj, query)
            obj.counter = obj.counter + 1;
            if (obj.counter > obj.dblimit + 2000)
                % every 2000 queries: drop the connection and open a fresh one
                obj.dblimit = obj.counter;

                delete(obj.db);
                clear obj.db;                
                import lib.queryMySQL.src.edu.stanford.covert.db.MySQLDatabase;
                obj.db = MySQLDatabase('localhost:3306', 'fani_dev', 'root', 'dev1');
            end
            obj.db.prepareStatement(query);
            x = obj.db.query();
        end
        function obj = MySqlService()
            import lib.queryMySQL.src.edu.stanford.covert.db.MySQLDatabase;
            obj.db = MySQLDatabase('localhost:3306', 'fani_dev', 'root', 'dev1');
        end
    end    
end


The script now runs without problems, and CPU and RAM usage look good.







