[Java] Memory leak with com.newrelic.weave.java.sql.Preparedstatement835070183_nr_ext

Hi, we are using the New Relic Java APM agent in our web app. We noticed heap swelling, so we took a heap dump of the application. One of the leak suspects reported by the memory analyzer was as below.

The culprit reported was a memory leak with com.newrelic.weave.java.sql.Preparedstatement835070183_nr_ext

com.newrelic.agent.deps.com.google.common.cache.LocalCache$Segment[] loaded by com.newrelic.bootstrap.BootstrapAgent$JVMAgentClassLoader

Please let me know whether this is a known issue. We are on an older version of the New Relic agent, so if this has been fixed in a later release, should we be updating?

Hi @chida.jeyabalan,

Hopefully you have already come across this blog post on how to troubleshoot memory leaks, but I'm sharing it in case you haven't:

To answer your question about upgrading: yes, that is a very good idea. We constantly make changes and improvements in later versions, so you may find that an upgrade is an easy fix.


This may also help, though it needs some input from New Relic (what their maximum number of prepared statements is set to). You may see this message in a heap dump analysis if the configured maximum number of prepared statements is too high, e.g., 100 or more.

I found this out the hard way. Thinking "more is better" while doing some tuning, I had increased the prepared statement cache limit in the JVM to 125 and hit a similar issue. Reading up on it, I found I could lower the limit by inspecting the number of prepared statements actually in use at my peak load levels, then setting the maximum just a small amount above that level, e.g., rounded up to the nearest multiple of 5. For example, if you observe 51 prepared statements in use, round up to 55 for your max setting. In most cases, I ended up lowering the value from 125 to between 25 and 30. Hope this helps.
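The sizing heuristic above (observe peak usage, then round up to the nearest multiple of 5) can be sketched as a tiny Java helper. This is only an illustration of the arithmetic, not a New Relic API; the class and method names are hypothetical.

```java
// Hypothetical helper illustrating the cache-sizing heuristic described above:
// take the peak number of prepared statements observed under load and round
// up to the nearest multiple of 5 to get a suggested cache maximum.
public class StatementCacheSizer {

    static int recommendedMax(int observedPeak) {
        // Integer arithmetic: (n + 4) / 5 * 5 rounds n up to the
        // nearest multiple of 5 (e.g., 51 -> 55, 26 -> 30).
        return ((observedPeak + 4) / 5) * 5;
    }

    public static void main(String[] args) {
        System.out.println(recommendedMax(51)); // 55, matching the example above
    }
}
```

With an observed peak of 51 in-flight prepared statements, this suggests a maximum of 55, far below a speculative setting like 125, which keeps the statement cache from dominating the heap.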

@chida.jeyabalan Let us know how this all works out for you!