The relationship between the hashCode and equals methods and memory problems is not obvious at first, but it becomes clearer once we think about hash maps. An object's hash code is used to insert and find objects in a HashMap. The hash code is not unique, however; it only selects a bucket, which can potentially contain multiple objects. The equals method is then used to identify the correct object within that bucket. If the hashCode method is broken, so that otherwise-equal objects produce different hash codes, we will never find an object in a HashMap after it has been inserted. The consequence is often that the application inserts the same object again and again, and the map keeps growing.
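A minimal sketch of how this plays out (the SessionKey class and the "session data" values are hypothetical, chosen only to illustrate the pattern): equals is overridden but hashCode is not, so equal keys almost always land in different buckets and every lookup misses.

```java
import java.util.HashMap;
import java.util.Map;

public class BrokenKeyDemo {

    static final class SessionKey {
        private final String userId;

        SessionKey(String userId) {
            this.userId = userId;
        }

        @Override
        public boolean equals(Object o) {
            return o instanceof SessionKey
                    && ((SessionKey) o).userId.equals(userId);
        }
        // hashCode() is NOT overridden: two equal keys get different
        // identity hash codes and (almost certainly) different buckets.
    }

    public static void main(String[] args) {
        Map<SessionKey, String> cache = new HashMap<>();
        for (int i = 0; i < 3; i++) {
            SessionKey key = new SessionKey("user-42");
            if (!cache.containsKey(key)) {      // lookup misses every time
                cache.put(key, "session data"); // ...so we insert again
            }
        }
        System.out.println(cache.size()); // prints 3, not 1 -- the map grows
    }
}
```

In a long-running application this loop is a cache or registry that is consulted on every request, and the map grows until the heap is exhausted.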
Although the growing collection can easily be identified by most tools, the root cause is not obvious in a heap dump. I have seen this case over and over through the years; one extreme case led the customer to run his JVMs with 40 GB of memory, and the JVM still needed to be restarted once a day to avoid out-of-memory errors. We fixed the problem, and the application now runs quite stably at 800 MB!
A heap dump rarely helps in this case, even if complete information on the objects is available; one simply would have to analyze too many objects to identify the problem. The best approach is to be proactive and unit-test the equals and hashCode methods automatically. A few free frameworks, such as EqualsVerifier, ensure that both methods conform to their contract.
Use EqualsVerifier for Java unit tests
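Such a test is only a few lines. A sketch with JUnit 5, assuming EqualsVerifier is on the test classpath and reusing the hypothetical SessionKey class from above:

```java
import nl.jqno.equalsverifier.EqualsVerifier;
import org.junit.jupiter.api.Test;

class SessionKeyTest {

    @Test
    void equalsAndHashCodeHonorTheContract() {
        // Fails the build as soon as equals and hashCode drift apart,
        // long before the leak ever shows up in production.
        EqualsVerifier.forClass(SessionKey.class).verify();
    }
}
```

Running this against the broken SessionKey above would fail immediately, pointing at the missing hashCode override instead of leaving it to surface as a memory leak.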
This section is taken from http://www.dynatrace.com/en/javabook/other-java-memory-issues.html