
[Receiver] Panic on calling LogRecordCount() in receiver #10625

Open
grandwizard28 opened this issue Jul 16, 2024 · 6 comments
Labels
bug Something isn't working

Comments

@grandwizard28

Describe the bug
Calling LogRecordCount() after ConsumeLogs returns in a receiver causes the receiver to panic with a nil pointer dereference.
A redacted version of the receiver looks like this:

...
logs, err := receiver.parser.Parse(body)
if err != nil {
	writeError(w, err, http.StatusBadRequest)
	return
}

// At this point, the receiver has accepted the payload
ctx := receiver.obsreport.StartLogsOp(req.Context())
err = receiver.nextConsumer.ConsumeLogs(ctx, logs)
receiver.obsreport.EndLogsOp(ctx, metadata.Type.String(), logs.LogRecordCount(), err)

if err != nil {
	writeError(w, err, http.StatusInternalServerError)
	return
}
...

The redacted stack trace looks like this:

2024/07/16 15:52:25 http: panic serving 10.52.8.48:57874: runtime error: invalid memory address or nil pointer dereference
goroutine 181262 [running]:
net/http.(*conn).serve.func1()
	/opt/hostedtoolcache/go/1.22.5/x64/src/net/http/server.go:1903 +0xbe
panic({0x1792160?, 0x2d2a0f0?})
	/opt/hostedtoolcache/go/1.22.5/x64/src/runtime/panic.go:770 +0x132
go.opentelemetry.io/collector/pdata/plog.ResourceLogsSlice.At(...)
	/home/runner/go/pkg/mod/go.opentelemetry.io/collector/[email protected]/plog/generated_resourcelogsslice.go:56
go.opentelemetry.io/collector/pdata/plog.Logs.LogRecordCount({0xc0018347c8?, 0xc00288a8fc?})

Steps to reproduce
The error happens consistently, regardless of throughput (observed at close to 60K log records).

What did you expect to see?
Panic not to happen

What did you see instead?
Panic

What version did you use?

go.opentelemetry.io/collector v0.103.0
go.opentelemetry.io/collector/component v0.103.0
go.opentelemetry.io/collector/pdata v1.10.0

Additional context
#10402

@grandwizard28 grandwizard28 added the bug Something isn't working label Jul 16, 2024
@atoulme
Contributor

atoulme commented Jul 16, 2024

What does your pipeline look like? That is, which processors and exporters run after this receiver?

@grandwizard28
Author

The logs pipeline looks like this:

    logs:
      receivers: [otlp, customreceiver]
      processors: [batch]
      exporters: [kafkaexporter]

@grandwizard28
Author

grandwizard28 commented Jul 16, 2024

I have a hunch that the following change:

logs, err := receiver.parser.Parse(body)
if err != nil {
  writeError(w, err, http.StatusBadRequest)
  return
}
numLogs := logs.LogRecordCount()

// At this point, the receiver has accepted the payload
ctx := receiver.obsreport.StartLogsOp(req.Context())
err = receiver.nextConsumer.ConsumeLogs(ctx, logs)
receiver.obsreport.EndLogsOp(ctx, metadata.Type.String(), numLogs, err)

will fix the issue. This is basically how it's being done in the otlpreceiver.
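To illustrate why counting before ConsumeLogs matters: once ConsumeLogs returns, ownership of the pdata has passed to the next consumer (here the batch processor, which after #10402 can consume synchronously and mutate or move the data), so any later read of logs races with downstream mutation. A minimal, self-contained Go sketch of the hazard — the logs type and consumeLogs function below are illustrative stand-ins for plog.Logs and the real consumer chain, not collector APIs:

```go
package main

import "fmt"

// logs is a stand-in for plog.Logs: a handle over backing data that a
// downstream consumer may take ownership of.
type logs struct{ records []string }

func (l *logs) LogRecordCount() int { return len(l.records) }

// consumeLogs models a mutating downstream consumer (like the batch
// processor): it moves the records out, so the caller must not read
// the logs handle afterward.
func consumeLogs(l *logs) {
	l.records = nil // ownership transferred downstream
}

func main() {
	l := &logs{records: []string{"a", "b", "c"}}

	// Safe pattern: snapshot any derived values BEFORE handing off.
	numLogs := l.LogRecordCount()
	consumeLogs(l)

	fmt.Println(numLogs)            // 3: the snapshot taken before consume
	fmt.Println(l.LogRecordCount()) // 0: reading after consume is unreliable
}
```

In the real collector the post-consume read is worse than a stale count: it is a data race with another goroutine, which is what surfaces as the nil pointer panic above.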

@atoulme
Contributor

atoulme commented Jul 16, 2024

That will fix your issue, for sure.

@grandwizard28
Author

This fixed it @atoulme.
Do you think we could call this out somewhere in the documentation?

@crobert-1
Member

Looks like a relatively common issue, as shown in open-telemetry/opentelemetry-collector-contrib#29274. It would be good to document this somewhere.
