JProfiler: trying to find a memory leak
My application needs about 10 GB of RAM for a specific input, whereas for normal inputs about 1 GB is sufficient. A closer analysis with JProfiler shows that (after GC) quite a lot of memory is used by the standard classes from java.util: LinkedHashMap$Entry, HashMap$Entry[], LinkedHashMap, HashMap$KeySet, HashMap$EntrySet, LinkedHashSet, TreeMap$Entry and TreeMap (in this order) and related classes. The next entry already refers to a class from my own code, where the number of instances and the amount of memory used seem very reasonable.
In detail, with a total heap of about 900 MB, I see the following entries in the Size column of the All Objects view:
- LinkedHashMap$Entry: 418 MB
- HashMap$Entry[]: 178 MB
- LinkedHashMap: 124 MB
- HashMap$KeySet: 15 MB
The memory used by LinkedHashMap seems to be too high, even considering that each LinkedHashSet is backed by a LinkedHashMap.
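As a rough cross-check of that intuition, the following small program (not from the original question; the class name, element count and measurement approach are my own) estimates the per-instance cost of many small LinkedHashSets by comparing the used heap before and after allocating them. It is only a sketch: Runtime-based measurements are approximate and System.gc() is best-effort.

    import java.util.ArrayList;
    import java.util.LinkedHashSet;
    import java.util.List;

    // Rough estimate of the per-instance overhead of small LinkedHashSets.
    // Each LinkedHashSet is backed by its own LinkedHashMap, so the map,
    // its entry table and its entries all count against the heap.
    public class LinkedHashSetOverhead {

        public static void main(String[] args) {
            int count = 1_000_000;                   // arbitrary sample size

            long before = usedHeap();
            List<LinkedHashSet<String>> sets = new ArrayList<>(count);
            for (int i = 0; i < count; i++) {
                LinkedHashSet<String> set = new LinkedHashSet<>();
                set.add("element-" + i);             // one entry per set
                sets.add(set);
            }
            long after = usedHeap();

            System.out.printf("~%d bytes per one-element LinkedHashSet%n",
                    (after - before) / count);
            System.out.println(sets.size());         // keep the sets reachable
        }

        private static long usedHeap() {
            System.gc();                             // best-effort, not exact
            Runtime rt = Runtime.getRuntime();
            return rt.totalMemory() - rt.freeMemory();
        }
    }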
I recorded object allocations in JProfiler and looked at the Allocation Hot Spots for LinkedHashMap. There I see entries that I don't understand:
- The third entry shows a hot spot (with 6.5% of the allocated memory) named X.<init>, where X is a class in my own code. The constructors of this class have nothing to do with LinkedHashMap. Below this entry, a slow decline from 6.5% to 5.8% is shown, ending at Thread.run. What is the problem with my code in X? Why is it shown here?
- About 8% of the allocated memory is attributed to a hot spot named java.util.HashSet.iterator. Following this entry along the path with the highest percentage (first entry: 2.8%), I pass through several methods of my own code until finally java.lang.Thread.run is shown (with 2.8%). What does that mean? As far as I know, the method Thread.run does not create LinkedHashMap instances. What is its relationship to the iterator method? (A small sketch of how such an entry can arise follows this list.)
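One possible explanation, sketched below with made-up class and variable names and assuming the JDK 7 collection classes mentioned in the question: LinkedHashSet does not override iterator(), so every enhanced for loop over a LinkedHashSet dispatches through the inherited java.util.HashSet.iterator, and the iterator it returns for a LinkedHashSet is an inner class of LinkedHashMap. Repeated iteration in a hot loop therefore produces many short-lived allocations that a profiler can attribute to HashSet.iterator, with Thread.run as the entry point of the backtrace.

    import java.util.Iterator;
    import java.util.LinkedHashSet;
    import java.util.Set;

    // Illustration only: iterating a LinkedHashSet goes through the inherited
    // HashSet.iterator method, which allocates a fresh iterator object on every
    // call. In a tight loop this shows up as allocations credited to
    // java.util.HashSet.iterator in an allocation hot spots view.
    public class IteratorAllocation {

        public static void main(String[] args) {
            Set<String> names = new LinkedHashSet<String>();
            names.add("a");
            names.add("b");

            long visited = 0;
            for (int i = 0; i < 1_000_000; i++) {
                // The enhanced for loop desugars to names.iterator(), i.e. one
                // new iterator object per outer iteration.
                for (String name : names) {
                    visited += name.length();
                }
            }
            System.out.println(visited);

            // Print the concrete iterator class to see which type is allocated
            // (on JDK 7 it is an inner class of LinkedHashMap).
            Iterator<String> it = names.iterator();
            System.out.println(it.getClass().getName());
        }
    }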
In general, how do I find the code that holds references to (many) LinkedHashMap objects? Using the Heap Walker, I can only see a lot of instances, but I can't see any pattern (even when looking at the paths to the GC roots). In my experiments, all instances appear to be fine.
Possibly important things:
- My application builds up a result (for further processing), and the high memory consumption occurs for this structure. The design is constantly creating objects, so waiting for a stable point and then inspecting each created LinkedHashMap object is impossible.
- I have good machines for debugging (up to 48 cores and 192 GB of RAM, maybe even more).
- java version "1.7.0_13" (Java(TM) SE Runtime (build 1.7.0_13-b20), Java HotSpot(TM) 64-Bit Server VM (build 23.7-b01, mixed mode))
- JProfiler 7.2.2 (Build 7157) Licensed
In general, how do I find code that contains references to (many) LinkedHashMap objects?
In the heap walker, select "LinkedHashMap" and create a new object set. Then switch to the References view and show the Cumulative Incoming References. There you can analyze the references for the entire object set.
Regarding your question about the allocation hot spots and why the Thread.run method is shown: these are backtraces; they show how the hot spot was reached, and all the numbers on the nodes are contributions to the hot spot at the top. The deepest node is always the entry point, usually the Thread.run method or the main method.
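To make that concrete, here is a small, made-up example (the classes BacktraceExample and X below are hypothetical and only stand in for the code described in the question): a constructor that stores its data in a LinkedHashSet allocates a LinkedHashMap indirectly, so X.<init> would appear as a node in the allocation backtrace for LinkedHashMap even though it never mentions that class, with Thread.run as the entry point at the deepest node.

    import java.util.Arrays;
    import java.util.Collection;
    import java.util.LinkedHashSet;
    import java.util.Set;

    // Hypothetical call chain that would produce a backtrace roughly like
    //   LinkedHashMap <- LinkedHashSet.<init> <- X.<init> <- Runnable.run <- Thread.run
    // in the allocation hot spots view: X's constructor never names
    // LinkedHashMap, but the LinkedHashSet it creates is backed by one.
    public class BacktraceExample {

        static class X {
            private final Set<String> keys;

            X(Collection<String> input) {
                // Allocates a LinkedHashSet, which internally allocates a LinkedHashMap.
                this.keys = new LinkedHashSet<String>(input);
            }

            int size() {
                return keys.size();
            }
        }

        public static void main(String[] args) throws InterruptedException {
            Thread t = new Thread(new Runnable() {
                @Override
                public void run() {
                    // Thread.run -> this run() -> X.<init> -> LinkedHashSet -> LinkedHashMap
                    X x = new X(Arrays.asList("a", "b", "c"));
                    System.out.println(x.size());
                }
            });
            t.start();
            t.join();
        }
    }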