Exporting/copying all Insights data for use locally



Continuing the discussion from Insights? Locked in Data? how to export to csv/ excel or google spreadsheets?:

We have a larger data store locally that could be used to retain a history of the data we generate. It would be helpful to retain data longer and to issue overview queries over long intervals for senior management using local tools, instead of relying on NRQL or New Relic accounts to do this.

I can see how we can get a Query Key and issue NRQL queries to accomplish this, as long as we query all records over short intervals of time so that we don’t exceed the response-size limits. I was wondering if there is a better way to get larger volumes of data per query, though. Is there a way to download an hour of data at a time without hitting limits on response sizes? Please offer advice on this process. I’m basically looking to have a running process regularly pick up all data as it is acquired, so that we can query it and maintain history offline.
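For reference, the windowed-query approach described above could be sketched roughly as follows. This assumes the Insights Query API endpoint (`insights-api.newrelic.com`) with an `X-Query-Key` header; the account ID, query key, and event type shown are placeholders, not real values:

```python
import urllib.parse
from datetime import datetime, timedelta, timezone

# Placeholders -- substitute your own account details.
ACCOUNT_ID = "12345"
QUERY_KEY = "YOUR_QUERY_KEY"
EVENT_TYPE = "Transaction"

def build_query_url(account_id, nrql):
    """Build an Insights Query API URL for the given NRQL string."""
    base = f"https://insights-api.newrelic.com/v1/accounts/{account_id}/query"
    return base + "?nrql=" + urllib.parse.quote(nrql)

def window_nrql(event_type, start, end):
    """NRQL selecting all events in [start, end), using epoch-second timestamps."""
    return (f"SELECT * FROM {event_type} "
            f"SINCE {int(start.timestamp())} UNTIL {int(end.timestamp())}")

def minute_windows(start, end):
    """Yield (window_start, window_end) pairs, one minute at a time."""
    t = start
    while t < end:
        nxt = min(t + timedelta(minutes=1), end)
        yield t, nxt
        t = nxt

# Example: cover one hour in one-minute slices, issuing one query per slice
# with the headers {"X-Query-Key": QUERY_KEY, "Accept": "application/json"}.
hour_start = datetime(2020, 1, 1, tzinfo=timezone.utc)
for s, e in minute_windows(hour_start, hour_start + timedelta(hours=1)):
    url = build_query_url(ACCOUNT_ID, window_nrql(EVENT_TYPE, s, e))
    # ... fetch url, parse JSON, append events to local storage ...
```

Keeping each window small (a minute or less, depending on event volume) is what keeps each response under the result-count limit; the trade-off is simply more requests.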

Thanks in advance,


Hi @Jeremy_Buch

Data retention in Insights is tied to pricing and based on your subscription level. I’m afraid we do not have any suggestions for bulk data exports or avoiding restrictions on the Query API beyond what you’ve already described.

If you’re encountering areas where NRQL and Insights are not sufficient, we’d be very interested in learning more about how we can make our product better serve your needs.

If you’d like to provide some details, I may be able to offer alternate workarounds; if not, we’ll still have a great basis for submitting a feature request and getting this feedback to our product management team.


Hello Andy,

Basically we are looking to integrate into an existing executive dashboard. It would be great to have historical queries and data without having to pay New Relic to keep a large backlog of data. We have terabytes of storage available and could start holding data long-term in order to run historical queries without the cost of having that data hosted. We would probably aggregate historical data to match our query needs (on a daily or other basis), but pulling down the data in order to maintain this history (without the cost of storing non-aggregated data in New Relic) would be beneficial.

At the moment it looks like querying a second’s or a minute’s worth of data at a time wouldn’t overwhelm the query results. Iterating backward through time would allow this process to pull all the data. I still don’t see a way to clean up historical data after we have pulled it down, but I would expect that it can be aged out in some way.
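On the local-storage side of the process described above, a minimal sketch might parse each query response and append the events to a CSV file. This assumes the `{"results": [{"events": [...]}]}` response shape that `SELECT *` queries return from the Insights Query API; the field names used are illustrative:

```python
import csv
import io

def extract_events(response_json):
    """Pull the event list out of an Insights 'SELECT *' response body.

    Assumes the {"results": [{"events": [...]}]} shape returned by
    SELECT * queries against the Insights Query API.
    """
    return response_json["results"][0]["events"]

def write_events(fileobj, events, fieldnames):
    """Append events (a list of dicts) as CSV rows.

    Writes a header row first if the file is empty; fields not listed
    in `fieldnames` are silently ignored.
    """
    writer = csv.DictWriter(fileobj, fieldnames=fieldnames,
                            extrasaction="ignore")
    if fileobj.tell() == 0:
        writer.writeheader()
    writer.writerows(events)
```

Each query window’s events would be appended to the same file (or to a per-day file, if you are rolling data up daily as described), giving a local history that can be queried or aggregated with whatever tools you already use.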

Please advise,


At this time we have no plans to add support for this type of mass data export from our system.

However, I’ve fully documented your intended use case and filed a feature request on your behalf in order to pass this feedback along to our product management team.

For now, you’ll need to continue working as you have been within the limits of our existing Query API functionality.

As cost seems to be one of the concerns here, the only other suggestion I can think of is that you may want to reach out and get in touch with your account representative. They will be best equipped to discuss pricing terms with you.


+1 on desire to download insights tables for working locally. The restriction to 99 events is extremely limiting.


We’ve got your +1 logged, @eric.taylor. Thanks for making your voice heard!


Hey Andy,

It’s been over a year since you last said:

At this time we have no plans to add support for this type of mass data export from our system.

We would also like to be able to download insights data for working locally, so I thought I’d revive this topic to see if New Relic’s plans have changed.

Best Regards,


@netheroth or @AndyC - Have you had the opportunity to review my message?


I agree, there must be a way for us to archive our own data to produce reports going back beyond 90 days.


I too wish the dashboard provided a trigger for exporting the data currently being viewed. That would avoid having to build an API integration that hits limits from requesting too much, and would export only the relevant data.

An update on New Relic’s stance on this functionality would be nice. I don’t necessarily support the ask to simply expand the API limits for free usage; I get that this interferes directly with the New Relic business model.



+1 on desire to download insights tables for working locally. The restriction to 99 events is extremely limiting.


+1 same here, data download would be handy. For the bulk of the data we don’t need multi-year retention, but for rolled-up usage numbers it would be useful to download them and load them into, e.g., Segment or other analytics tools.