Parsing rules seem not to be applied to logs

I’m trying to parse logs from Postgres. I configured the New Relic infrastructure agent and logs are properly pushed to New Relic, but the log parsing rules don’t seem to be working at all. I created a parsing rule and set a parsing regex (it works in the Grok Debugger):

%{DATESTAMP:timestamp} %{TZ} \[%{DATA:user_id}\](?:\s%{WORD}@%{WORD})? %{WORD:level}:%{GREEDYDATA:message}

which works in the “Parsed log messages” preview (all attributes are parsed and visible in the preview):

Parsing results

2020-11-24 09:51:51.660 UTC [3472558] LOG:  listening on IPv4 address "", port 5432

{
  "level": "LOG",
  "message": "  listening on IPv4 address \"\", port 5432",
  "DATE_EU": "20-11-24",
  "user_id": "3472558",
  "timestamp": "20-11-24 09:51:51.660"
}

But newly received logs are not parsed (the attributes are not visible in the log table).
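To sanity-check the Grok pattern itself, here is a rough Python-regex equivalent of it (my own translation of the Grok macros DATESTAMP, TZ, DATA, WORD, and GREEDYDATA into plain regexes; the exact expansions Grok uses may differ slightly):

```python
import re

# Approximate translation of:
# %{DATESTAMP:timestamp} %{TZ} \[%{DATA:user_id}\](?:\s%{WORD}@%{WORD})? %{WORD:level}:%{GREEDYDATA:message}
PG_LOG = re.compile(
    r'(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+) '  # DATESTAMP-like
    r'(?P<tz>[A-Z]{2,5}) '                                       # TZ-like
    r'\[(?P<user_id>[^\]]*)\]'                                   # [%{DATA:user_id}]
    r'(?:\s\w+@\w+)? '                                           # optional user@db
    r'(?P<level>\w+):(?P<message>.*)'                            # WORD:GREEDYDATA
)

line = ('2020-11-24 09:51:51.660 UTC [3472558] LOG:'
        '  listening on IPv4 address "", port 5432')
m = PG_LOG.match(line)
print(m.groupdict())
```

The pattern does match the sample line, which is consistent with the preview succeeding; the failure appears to be in rule application, not the pattern.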

I also noticed that the predefined logtype=nginx-error ruleset stops working when nginx logs include an upstream field. This message is not parsed:

2020/11/24 08:38:36 [error] 3532#3532: *53025 recv() failed (104: Connection reset by peer) while reading response header from upstream, client:, server:, request: "GET /.git HTTP/1.1", upstream: "fastcgi://unix:/var/run/php/php7.3-fpm.sock", host: "", referrer: ""

but when I remove the upstream: "fastcgi://unix:/var/run/php/php7.3-fpm.sock", part from the log, parsing works correctly.

Should I do some additional steps to make it work?


@stat Sorry you have been waiting awhile for a response from our community. I’m going to bring this back to the attention of our support team. Thanks for your patience!

Neal Mc

Hi stat!

Our engineering team found that the service that publishes the parsing rules lost connection with our database which caused the issue you saw. They have re-established the connection and it should be working as expected.

If you are still experiencing unexpected behavior, please try refreshing affected pages and let me know what you are continuing to see.


I believe I am also experiencing this issue. Can you confirm it is still working?

Actually, it appears to be working; it takes a while, and it only applies to new incoming logs, not old ones. Sorry for the false alarm.


Thanks for letting us know @pboyd!

I appear to be having the same problem. I started with a built-in ruleset and parsing worked fine (I was able to see the parsed columns through “Add Attribute”). Then I went to Manage Parsing and added a custom parser. The “Test Grok” worked great, so I enabled the rule. But now I’m not seeing any of the parsed columns when I try to “Add Attribute”.

Any thoughts?

Hi @pete14, maybe @hzurek could take a look into this for you?


Hi, any news? I’m having the same problem.

Also having the same issue. The parsing rule validates in the rule editor (shows green checkmarks next to all matching logs), but after activating it I don’t see it being applied to new logs.

I’m trying to parse Heroku router logs, and this is the rule I’m using:

at=%{LOGLEVEL:log_level} method=%{WORD:http_verb} path="%{URIPATHPARAM:http_path}" host=%{HOSTNAME:http_host} request_id=%{UUID:request_id} fwd="%{IP:ip_address}" dyno=%{DATA:dyno} connect=%{DATA:connect_duration} service=%{DATA:service_duration} status=%{NUMBER:http_status} bytes=%{NUMBER:bytes} protocol=%{WORD:http_protocol}

Is there anything I’m missing? When I click on logs that were generated by the Heroku router I don’t get any of these attributes.
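As a workaround while waiting on a fix: Heroku router lines are logfmt-style key=value pairs, so they can also be parsed generically without one long Grok expression. A minimal sketch in Python (the sample line below is made up; the attribute names come from the router format itself):

```python
import re

# Match key=value pairs where the value is either quoted or a bare token.
PAIR = re.compile(r'(\w+)=("[^"]*"|\S+)')

def parse_logfmt(line):
    """Return a dict of key=value pairs, stripping surrounding quotes."""
    return {k: v.strip('"') for k, v in PAIR.findall(line)}

sample = ('at=info method=GET path="/health" host=example.herokuapp.com '
          'request_id=01234567-89ab-cdef-0123-456789abcdef fwd="203.0.113.9" '
          'dyno=web.1 connect=1ms service=12ms status=200 bytes=512 '
          'protocol=https')
attrs = parse_logfmt(sample)
print(attrs['method'], attrs['status'], attrs['path'])
```

This confirms the line itself is parseable, so if the same attributes don’t appear in New Relic, the issue is on the rule-application side rather than in the pattern.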

I seem to have run into the same problem as @jared3. I’m using the Heroku data source provided by NR, and this is the rule I created:

at=%{LOGLEVEL:level} method=%{NOTSPACE:method} path="%{URIPATH:path}(?:%{URIPARAM:params})?" host=%{HOSTNAME:host} request_id=%{UUID:request_id} fwd=%{QS:route} dyno=%{NOTSPACE:dyno} connect=%{INT:connect_ms}ms service=%{INT:response_ms}ms status=%{INT:status} bytes=%{INT:response_bytes} protocol=%{URIPROTO:protocol}

(very similar to jared3’s)

I get lots of green check marks when I test it in the editor, but I don’t seem to be getting structured logs.

@gar I heard from a NR employee that this is a known bug that they are working on, and it relates specifically to Heroku logs coming into NR. I wasn’t given an ETA for a fix, unfortunately.


Oh, awesome - thanks for the reply @jared3 !!

@gar and anyone else trying to parse logs coming out of Heroku (+ probably all syslog drains) – rough ETA on the fix is sometime in June.


Is this still roughly on track for some time this month?

+1 - super super interested in this!


Is this resolved? I am also forwarding logs from Heroku. Like others, I also created a custom grok pattern that doesn’t work.

at=%{DATA:at} method=%{DATA:method} path=%{DATA:path} host=%{DATA:host} request_id=%{DATA:request_id} fwd=%{QS} dyno=%{DATA:dyno} connect=%{DATA:connect} service=%{DATA:service} status=%{INT:status} bytes=%{INT:bytes} protocol=%{GREEDYDATA:protocol}

Can someone from New Relic confirm the status of this functionality?

If it is a known bug, can you let developers know?

If it is not a bug, can you provide additional documentation on how to get Heroku logs parsed?

Is there any update on this issue being resolved?

I have the same issue, but not with Heroku. I have an OpenWRT router that sends firewall logs to an Rsyslog server, which then forwards them to New Relic over TCP using the syslog-rfc5424 format. I want to parse the message field of the syslog message, as that is where all the usable data is located. I created a custom Grok pattern for the parsing rule which gives green ticks for all my logs in the parsing editor. After applying the parsing rule, newly ingested logs do not have the additional attributes parsed.
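For reference, the RFC 5424 envelope can be split off before the firewall-specific pattern is applied to the MSG field. A minimal splitter sketch in Python (the sample frame below is invented for illustration):

```python
import re

# RFC 5424 header: <PRI>VERSION TIMESTAMP HOSTNAME APP-NAME PROCID MSGID SD MSG
RFC5424 = re.compile(
    r'<(?P<pri>\d+)>(?P<version>\d+) (?P<ts>\S+) (?P<host>\S+) '
    r'(?P<app>\S+) (?P<procid>\S+) (?P<msgid>\S+) '
    r'(?P<sd>-|\[.*\]) (?P<msg>.*)'
)

frame = '<134>1 2020-11-24T09:51:51Z openwrt kernel - - - DROP IN=eth0 SRC=203.0.113.5'
m = RFC5424.match(frame)
print(m.group('host'), '->', m.group('msg'))
```

In New Relic the forwarder normally does this split already and exposes the payload as the `message` attribute, which is what the custom Grok rule should target.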

It still seems not to be working. I created a working parsing rule in the debugger; the New Relic parsing test shows the custom attributes, but logs are not parsed.

The last update I received from an NR rep (on Aug 30) was that this feature has no ETA. I think they had it on their upcoming feature list for a few months over the summer, and then backlogged it at the end of the summer.