I’m pretty new to New Relic, and I’m recording a custom metric in Java to measure how long the SQS API takes to put a message on the queue, like so:
NewRelic.recordMetric("Custom/bpregSqsSdkCallTime", toIntExact(afterSqsSendTimestamp - beforeSqsSendTimestamp));
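For reference, the surrounding timing logic looks roughly like this. The SQS send and the New Relic agent call are stubbed out so the snippet stands alone; apart from the metric name and `recordMetric` signature, the names here are placeholders, not my real code:

```java
// Sketch of the timing pattern around the SQS send. The real code calls the
// AWS SDK's sendMessage and com.newrelic.api.agent.NewRelic.recordMetric;
// both are stubbed here so the snippet compiles on its own.
public class SqsTimingSketch {

    // Stand-in for sqsClient.sendMessage(...) from the AWS SDK.
    static void sendMessage() throws InterruptedException {
        Thread.sleep(10); // simulate network latency
    }

    // Stand-in for NewRelic.recordMetric(String name, float value).
    static void recordMetric(String name, float value) {
        System.out.println(name + " = " + value + " ms");
    }

    public static void main(String[] args) throws InterruptedException {
        long beforeSqsSendTimestamp = System.currentTimeMillis();
        sendMessage();
        long afterSqsSendTimestamp = System.currentTimeMillis();

        // Report the elapsed wall-clock time, in ms, as the metric value.
        recordMetric("Custom/bpregSqsSdkCallTime",
                (float) (afterSqsSendTimestamp - beforeSqsSendTimestamp));
    }
}
```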
After calling the method a few times, my dashboard displays something like this:
If the metric is recorded 2 times within 1 minute (call count = 2) with values of 8 (min) and 88 (max), I would expect the std dev to be 40 ( https://www.calculator.net/standard-deviation-calculator.html?numberinputs=8%2C+88&ctype=p&x=87&y=18).
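To show where my expected value of 40 comes from, here is the calculation for the two recorded values (40 is the population standard deviation, which is what the linked calculator computes with ctype=p; the sample version is also included for comparison):

```java
// Population vs. sample standard deviation for the two metric values 8 and 88.
public class StdDevCheck {
    public static void main(String[] args) {
        double[] values = {8, 88};

        double mean = (values[0] + values[1]) / 2.0; // (8 + 88) / 2 = 48

        // Sum of squared deviations from the mean: (-40)^2 + 40^2 = 3200
        double sumSq = 0;
        for (double v : values) {
            sumSq += (v - mean) * (v - mean);
        }

        double population = Math.sqrt(sumSq / values.length);       // sqrt(1600) = 40
        double sample = Math.sqrt(sumSq / (values.length - 1));     // sqrt(3200) ~ 56.57

        System.out.println("population std dev = " + population);
        System.out.println("sample std dev     = " + sample);
    }
}
```

Neither of these is anywhere near the 36600 ms the dashboard shows.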
For some reason, the std dev in this particular case is 36600 ms. Since the unit here is ms, I assume this value represents something entirely different from the standard deviation of the recorded metric values. Can you please help me understand what this value actually represents, and whether it is possible for New Relic to display the actual standard deviation for my metric?
Thank you for any help.