Prevent HashMap from doubling its current size
We have a HashMap with a capacity of one million. We need to store 1,000,100 objects, but we do not want the HashMap to double its size (from 1 million to 2 million) just to hold 100 extra objects.
EDIT: I want to optimize the HashMap's resizing. Doubling means allocating room for another million entries just to store 100 more objects, which wastes memory.
How can we overcome this problem?
A HashMap's capacity is always a power of two, so if 2^20 (1,048,576) isn't enough for you, you'll have to go up to 2^21 (2,097,152).
EDIT:
Actually, you can control when resizing happens by specifying a higher load factor.
The HashMap doubles its capacity once the number of entries reaches capacity * loadFactor. So if the capacity is 1,048,576 and you don't want it to expand to 2,097,152 while holding 1,000,100 entries, you need a load factor of at least 1,000,100 / 1,048,576 ≈ 0.954.
So, initializing an instance with the following constructor should do the trick:
HashMap<String, Integer> map = new HashMap<>(1048576, 0.954f);
Relevant code (JDK 6):
public HashMap(int initialCapacity, float loadFactor) {
    ...
    // Find a power of 2 >= initialCapacity
    int capacity = 1;
    while (capacity < initialCapacity)
        capacity <<= 1;

    this.loadFactor = loadFactor;
    threshold = (int) (capacity * loadFactor);
    table = new Entry[capacity];
    ...
}
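As a sanity check, here is a minimal sketch (class and method names are ours, not from the JDK) that reproduces the constructor's threshold arithmetic, confirming that a 0.954 load factor on a 2^20 table keeps the resize threshold above the 1,000,100 entries we need:

```java
public class ThresholdDemo {
    // Same arithmetic as the JDK constructor above: resize triggers at capacity * loadFactor
    static int threshold(int capacity, float loadFactor) {
        return (int) (capacity * loadFactor);
    }

    public static void main(String[] args) {
        int capacity = 1 << 20; // 1,048,576
        int t = threshold(capacity, 0.954f);
        // roughly 1,000,341 -- comfortably above the 1,000,100 entries we need
        System.out.println("resize threshold = " + t);
    }
}
```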
and
void addEntry(int hash, K key, V value, int bucketIndex) {
    Entry<K,V> e = table[bucketIndex];
    table[bucketIndex] = new Entry<K,V>(hash, key, value, e);
    if (size++ >= threshold) // this is what you want to avoid
        resize(2 * table.length);
}
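Putting it together, a quick sketch (class name ours) that constructs the map with the tuned capacity and load factor and fills it with 1,000,100 entries; the map's internal capacity isn't observable without reflection, but by the threshold arithmetic above no resize should occur:

```java
import java.util.HashMap;
import java.util.Map;

public class NoResizeDemo {
    static Map<Integer, Integer> fill(int count) {
        // Capacity 2^20 plus a 0.954 load factor keeps the resize
        // threshold (about 1,000,341) above the entry count we insert.
        Map<Integer, Integer> map = new HashMap<>(1 << 20, 0.954f);
        for (int i = 0; i < count; i++) {
            map.put(i, i);
        }
        return map;
    }

    public static void main(String[] args) {
        System.out.println(fill(1_000_100).size()); // prints 1000100
    }
}
```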
Split the key into two parts and use a nested map:
Map<Key1, Map<Key2, Value>>
Use TreeMap for the inner maps. A TreeMap allocates one node per entry, so it uses only as much memory as it actually holds, while the outer map can be a HashMap, perhaps with a high loadFactor (the second constructor parameter). This arrangement also handles collisions better.
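A minimal sketch of this idea, assuming an int key split into its high and low 16 bits (the class name and split scheme are ours, chosen purely for illustration):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;

public class SplitKeyDemo {
    // Outer HashMap keyed on the high bits; inner TreeMaps keyed on the low bits
    // allocate per entry, so no large backing array needs to be doubled.
    static final Map<Integer, Map<Integer, String>> outer = new HashMap<>();

    static void put(int key, String value) {
        outer.computeIfAbsent(key >>> 16, k -> new TreeMap<>())
             .put(key & 0xFFFF, value);
    }

    static String get(int key) {
        Map<Integer, String> inner = outer.get(key >>> 16);
        return inner == null ? null : inner.get(key & 0xFFFF);
    }

    public static void main(String[] args) {
        put(1_000_099, "last");
        System.out.println(get(1_000_099)); // prints last
    }
}
```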
Alternatively, you can write your own Map implementation with a growth policy that suits your needs.