Java VM suddenly exits for no apparent reason

I have a problem with my Java program exiting suddenly, without throwing any exception or terminating normally.

I am writing a program to solve Project Euler problem 14. This is what I have:

import java.util.HashMap;
import java.util.Map;

private static final long TARGET = 1000000L;
private static final int INITIAL_CACHE_SIZE = 30000;
private static Map<Long, Integer> cache = new HashMap<Long, Integer>(INITIAL_CACHE_SIZE);

public static void main(String... args) {
    long number = 0;
    int maxSize = 0;

    for (long i = 1; i <= TARGET; i++) {
        int size = size(i);
        if (size > maxSize) {
            maxSize = size;
            number = i;
        }
    }
}
private static int size(long i) {
    if (i == 1L) {
        return 1;
    }
    final int size = size(process(i)) + 1;
    return size;
}

private static long process(long n) {
    return n % 2 == 0 ? n/2 : 3*n + 1;
}


This works fine and finishes correctly after about 5 seconds with TARGET = 1,000,000.

I wanted to optimize by adding a cache, so I changed the size method:

private static int size(long i) {
    if (i == 1L) {
        return 1;
    }
    if (cache.containsKey(i)) {
        return cache.get(i);
    }
    final int size = size(process(i)) + 1;
    cache.put(i, size);
    return size;
}


Now when I run it, it just stops (the process exits) when it gets to 555144, the same number every time. No exception, error, or JVM crash report; nothing is thrown at all.

Changing the initial size of the cache has no effect either, so how can introducing the cache cause this?

If I use the cache size not only as the initial size but also as a cap on which values get cached:

    if (i < CACHE_SIZE) {
        cache.put(i, size);
    }


the error no longer occurs. Edit: when I set the cache size to 2M, the error shows up again.

Can anyone reproduce this and maybe even give a suggestion as to why this is happening?

+2




7 replies


It's just an OutOfMemoryError that doesn't get printed. The program runs fine if I set the heap size large; otherwise it exits with a silent OutOfMemoryError (easy to see in a debugger, though).

You can verify this and get a heap dump (plus a printout of the OutOfMemoryError) by passing this argument to the JVM and re-running your program:

-XX:+HeapDumpOnOutOfMemoryError



With that set, it will print something to this effect:

java.lang.OutOfMemoryError: Java heap space
Dumping heap to java_pid4192.hprof ...
Heap dump file created [91901809 bytes in 4.464 secs]

Increase your heap size with, say, -Xmx200m and you shouldn't have a problem, at least not for TARGET = 1,000,000.
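For example, the full invocation might look like this (Euler14 is just a placeholder for whatever your main class is called):

java -Xmx200m -XX:+HeapDumpOnOutOfMemoryError Euler14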

+8




It sounds like the JVM itself is crashing (that's the first thought when a program dies without even a hint of an exception). The first step with such a problem is to upgrade to the latest revision of your platform. On a crash, the JVM should write a crash log (.log file) in the directory where it was started, provided your user has permission to write to that directory.

That being said, some OutOfMemoryErrors are not reported on the main thread, so unless you wrap everything in try { ... } catch (Throwable t) and look, it's hard to be sure you aren't really just running out of memory. The fact that it only uses 100MB might just mean the JVM isn't configured to use more. You can change that with the startup option -Xmx1024m, which gives it a gigabyte of memory, to see if the problem goes away.

The try/catch code should look something like this:



public static void main(String[] args) {
    try {
        MyObject o = new MyObject();
        o.process();
    } catch (Throwable t) {
        t.printStackTrace();
    }
}


And do all the work in a process() method, and don't keep your cache in a static field. That way, if an error is caught, the object has gone out of scope and can be garbage collected, freeing enough memory for the stack trace to be printed. No guarantees that it works, but it gives it the best shot.
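A minimal sketch of that restructuring, reusing the MyObject and process() names from the snippet above (TARGET and the Collatz logic are taken from the question; the printout is my own addition):

import java.util.HashMap;
import java.util.Map;

public class MyObject {

    private static final long TARGET = 1000000L;

    // Instance field, not static: once the instance goes out of scope,
    // the whole cache becomes garbage-collectible.
    private final Map<Long, Integer> cache = new HashMap<Long, Integer>();

    public static void main(String[] args) {
        try {
            MyObject o = new MyObject();
            o.process();
        } catch (Throwable t) {
            // `o` is no longer in scope here, so its cache can be
            // reclaimed, leaving memory free to print the trace.
            t.printStackTrace();
        }
    }

    private void process() {
        long number = 0;
        int maxSize = 0;
        for (long i = 1; i <= TARGET; i++) {
            int size = size(i);
            if (size > maxSize) {
                maxSize = size;
                number = i;
            }
        }
        System.out.println(number + " has chain length " + maxSize);
    }

    private int size(long i) {
        if (i == 1L) {
            return 1;
        }
        if (cache.containsKey(i)) {
            return cache.get(i);
        }
        final int size = size(i % 2 == 0 ? i / 2 : 3 * i + 1) + 1;
        cache.put(i, size);
        return size;
    }
}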

+3




One significant difference between the two implementations of size(long i) is the number of objects you create.

The first implementation creates no Objects. In the second you do a lot of autoboxing: a new Long for every lookup in your cache, plus one more Long and an Integer for every insertion.

That explains the increased memory usage, but not the absence of an OutOfMemoryError. Increasing the heap lets it complete for me.

From this Sun article on autoboxing:

The performance ... is likely to be poor, as it boxes or unboxes on every get or set operation. It is plenty fast enough for occasional use, but it would be folly to use it in a performance-critical inner loop.
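To make the point concrete (my own sketch, not from this answer): one way to avoid the boxing entirely is a primitive int[] cache indexed by the starting number, caching only values below a fixed bound, much like the i < CACHE_SIZE guard from the question:

// Sketch only: a cache with no Long/Integer boxing. CACHE_SIZE is an
// assumed bound; Collatz intermediates can exceed it and are then
// simply recomputed rather than cached. 0 means "not yet computed".
private static final int CACHE_SIZE = 1000000;
private static final int[] sizes = new int[CACHE_SIZE];

private static int size(long i) {
    if (i == 1L) {
        return 1;
    }
    if (i < CACHE_SIZE && sizes[(int) i] != 0) {
        return sizes[(int) i];
    }
    final int size = size(process(i)) + 1;
    if (i < CACHE_SIZE) {
        sizes[(int) i] = size;
    }
    return size;
}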

+1




If your Java process dies suddenly, it may have exhausted some resource, such as memory. Try setting a higher maximum heap size.

0




Do you see a heap dump generated after the crash? The file should be in the current working directory of your JVM; that's where I would look for more information.

0




I am getting an OutOfMemoryError at cache.put(i, size);

To see the error, run the program in Eclipse in debug mode; it shows up in the Debug view. It does not produce a stack trace in the console.

0




The recursive size() method is probably not the right place for the caching. I moved the call to cache.put(i, size); into the for-loop in main() and it is much faster. Otherwise I also get an OOM error (Java heap space).

Edit: Here is the source. Cache retrieval happens in size(), but storing happens in main().

public static void main(String[] args) {
    long num = 0;
    int  maxSize = 0;

    long start = new Date().getTime();
    for (long i = 1; i <= TARGET; i++) {
        int size = size(i);
        if (size >= maxSize) {
            maxSize = size;
            num = i;
        }
        cache.put(i, size);
    }

    long computeTime = new Date().getTime() - start;
    System.out.println(String.format("maxSize: %4d on initial starting number %6d", maxSize, num));
    System.out.println("compute time in milliseconds: " + computeTime);
}

private static int size(long i) {
    if (i == 1L) {
        return 1;
    }

    if (cache.containsKey(i)) {
        return cache.get(i);
    }

    return size(process(i)) + 1;
}


Note that by removing the cache.put() call from size(), it no longer caches every computed size, but it also avoids re-caching previously computed sizes. That doesn't change the number of HashMap operations, but, as akf points out, it avoids the autoboxing/unboxing operations that your heap killer comes from. I also tried an if (!cache.containsKey(i)) { cache.put(...); } variant inside size(), but unfortunately that also ran out of memory.

0








