I'm curious about the status of this feature request, as I have a somewhat fringe use case.
I'm attempting some bulk data analysis on in-house data, using the API to pull related data out of New Relic. Originally, I tried to pull a 24-hour set of data that I could parse on my local machine (working from a Bash script). The problem is that 24 hours of New Relic data is typically in excess of 3,000,000 records, so obviously that can't currently be done.
I was able to rework my logic so that I only pull a handful of records at a time, but now, rather than one large pull up front, I'm doing hundreds of thousands of small pulls. Having to wait on the API every time means that processing 24 hours of data currently takes me over two weeks, which is wholly unacceptable.
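For context, the reworked loop looks roughly like the sketch below. `fetch_page` is a hypothetical stand-in for the real curl call against the API (here it just serves a few fake pages), since the point is the loop structure: hundreds of thousands of these round trips is where all the time goes.

```shell
#!/bin/sh
# Sketch of the paginated pull loop. fetch_page is a mock standing in
# for the real API request; it returns fake records for pages 1-2 and
# an empty result after that.
fetch_page() {
  case "$1" in
    1) echo "record-a record-b" ;;
    2) echo "record-c" ;;
    *) echo "" ;;                     # empty page signals end of data
  esac
}

page=1
total=0
while :; do
  records=$(fetch_page "$page")
  [ -z "$records" ] && break          # stop when a page comes back empty
  for r in $records; do
    total=$((total + 1))              # real script parses each record here
  done
  page=$((page + 1))
done
echo "pulled $total records"
```

With a real endpoint, every iteration pays a full network round trip, which is why the per-request overhead dominates once the record count gets large.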
If I could pull all my data in one shot, even if that pull took an hour, my script would only need four or five hours to parse through the local copy.
I understand the logic behind limiting the length of the returned data, but an override would be nice for the odd occasion when someone needs to do some heavy lifting. (Or maybe a separate API for bulk pulls?)