
Feature Idea: Postgres - to find number of rows inserted per second

postgres

#1

Hi,

I am trying to write NRQL that gives me the number of rows inserted per second. I am capturing tup_inserted from the Postgres internal statistics relation at a regular interval and storing it in a table I can query with NRQL.

I was looking for a standard function to find the number of rows inserted per second; I tried rate() but it did not work. If I can somehow find the differences over time, that should serve my purpose.

Any idea how to achieve this using NRQL?

Thanks,


New Relic edit

  • I want this, too
  • I have more info to share (reply below)
  • I have a solution for this

0 voters

We take feature ideas seriously and our product managers review every one when planning their roadmaps. However, there is no guarantee this feature will be implemented. This post ensures the idea is on the table and open for discussion, so please vote and share your extra details with our team.


#2

Can you provide a sample of the data in your events?


#3

Sure, thanks for looking into it.
Here you go. These are the total tuples inserted at each point in time. I would like to create a graph that shows tuples_inserted/s.

30 Aug 13:18:10 245,169,776
30 Aug 13:17:39 245,168,512
30 Aug 13:17:09 245,167,104
30 Aug 13:16:39 245,165,696
30 Aug 13:16:09 245,164,304
30 Aug 13:15:39 245,162,960
30 Aug 13:15:09 245,161,408
30 Aug 13:14:38 245,160,128
30 Aug 13:14:08 245,158,928
30 Aug 13:13:38 245,157,600

Thanks again.


#4

Do you want to count the Insights rows/events per second, or sum the value in one of those columns? Can you provide the column names?


#5

The column names are Timestamp and tup_inserted.

Timestamp          tup_inserted
30 Aug 13:18:10    245,169,776
30 Aug 13:17:39    245,168,512

So between those two timestamps, (245,169,776 - 245,168,512) tuples or rows were inserted. I would like to graph the number of tup_inserted per second.
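Working the numbers from those two samples: 245,169,776 - 245,168,512 = 1,264 rows over the 31 seconds between 13:17:39 and 13:18:10, i.e. roughly 41 rows per second.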

I hope that explains it.

Thanks.


#6

Unless I am mistaken, you may need to ensure the event values are numbers and not strings.

Also, if you want per-second granularity, your event intervals do not provide it; in the example given, the samples look to be roughly 30 seconds apart.


#7

Yes, we capture metrics at 30-second intervals, and the tup_inserted column in the source database is defined as an integer. I thought there should be a way to derive a per-second metric using some formula.

Also, let's assume I only need granularity at whatever interval it captures, be that 29 or 30 seconds; what would the NRQL be?

Thanks again for continuing to look into this for me.


#8

As a test, try to sum the field. If you can, then you can use rate() or do your own math.
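For example, something like the following, where PostgresSample stands in for whatever your event type is actually called:

SELECT sum(tup_inserted) FROM PostgresSample SINCE 30 minutes ago

If that returns a number, the attribute is numeric and you can build on it.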


#9

Yes, I tried that already before posting here. Something like this :slight_smile:

SELECT rate(sum(tup_inserted), 30 seconds) FROM .

But it is giving me results in the 250M range, whereas, as you can see from the data above, the difference between consecutive 30-second samples is only a thousand or so rows.

Thanks,


#10

SELECT rate(sum(...)) is going to give you the average value of the attribute. There is no way in NRQL to take the difference between the latest and second-latest values of tup_inserted, short of calculating that difference on your end and including it as its own custom attribute before sending the event to New Relic.
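For anyone who ends up taking that route, here is a minimal sketch of the idea, assuming a small collector script samples pg_stat_database every 30 seconds and posts events to the Insights insert API. The ACCOUNT_ID, INSERT_KEY, connection string, and the PostgresStats event type and attribute names are all placeholders, not anything New Relic ships out of the box:

    # Illustrative sketch only (not an official New Relic integration):
    # poll pg_stat_database, compute the per-interval delta of tup_inserted,
    # and send it as a custom event through the Insights insert API.
    import time

    import psycopg2
    import requests

    ACCOUNT_ID = "YOUR_ACCOUNT_ID"   # placeholder
    INSERT_KEY = "YOUR_INSERT_KEY"   # placeholder
    INSIGHTS_URL = (
        "https://insights-collector.newrelic.com/v1/accounts/"
        f"{ACCOUNT_ID}/events"
    )
    INTERVAL = 30  # seconds, matching the existing 30-second collection

    conn = psycopg2.connect("dbname=mydb user=monitor")  # placeholder DSN
    conn.autocommit = True  # so each poll sees fresh statistics


    def read_tup_inserted():
        """Return the cumulative tup_inserted counter for the current database."""
        with conn.cursor() as cur:
            cur.execute(
                "SELECT tup_inserted FROM pg_stat_database "
                "WHERE datname = current_database()"
            )
            return cur.fetchone()[0]


    previous = read_tup_inserted()
    while True:
        time.sleep(INTERVAL)
        current = read_tup_inserted()
        delta = current - previous  # rows inserted during this interval
        previous = current
        event = {
            "eventType": "PostgresStats",     # placeholder event type
            "tup_inserted_total": current,    # cumulative counter, as before
            "tup_inserted_delta": delta,      # pre-computed difference
            "tup_inserted_per_sec": delta / INTERVAL,
        }
        requests.post(INSIGHTS_URL, json=[event],
                      headers={"X-Insert-Key": INSERT_KEY})

Once the delta arrives as its own attribute, the chart becomes something like SELECT average(tup_inserted_per_sec) FROM PostgresStats TIMESERIES.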


#11

Sad to hear that. The source relation only has one row at any point in time.
That is the whole purpose of having a monitoring tool: collect the metrics at regular intervals and provide the graphs. If this is something I have to do outside of New Relic, then in my opinion it defeats the purpose. We have other monitoring tools that produce the required result from the same feed.

Thanks for your time.


#12

Curious, which tools?


#13

Like Datadog and Zabbix.


#14

We also use Datadog.

I think there are four plugins available for New Relic that support Postgres in various ways. They might be worth a look.


#15

@sparikh I think your use case would make a great feature request, so I’ve added it to our feature request section. This will allow other users to contribute to the conversation, share their use cases, and vote on the idea. Personally, I would love the ability to query on latest, latest-1, latest-10, etc.


#16

Is that time? What would you do with that?


#17

-1 would be the second latest event
-2 would be the third latest event
and so on…

This would make it easy to calculate differences in attribute values between events.
@6MM