
[pkg/stanza/fileconsumer] Emit logs in batches #35455

Open
andrzej-stencel opened this issue Sep 27, 2024 · 1 comment
Labels
enhancement New feature or request needs triage New item requiring triage pkg/stanza/fileconsumer

Comments

andrzej-stencel (Member) commented Sep 27, 2024

Component(s)

pkg/stanza/fileconsumer

Is your feature request related to a problem? Please describe.

This issue is created as a result of discussion in #31074.

The Stanza adapter's LogEmitter has a 100-log buffer that is a source of data loss during non-graceful collector shutdown. One solution is to remove this buffer, but doing so will likely come with a performance impact. In the case of the Filelog receiver, that impact could be alleviated by implementing batching earlier in the Stanza pipeline, in the file consumer.
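For illustration, here is a minimal sketch of the kind of buffered emitter behavior at issue. The type and field names are illustrative assumptions, not the actual LogEmitter code: entries accumulate in an in-memory slice and are flushed when the batch fills (or, in the real emitter, on a timer), so anything still in the slice is lost if the collector is killed before a flush.

```go
package sketch

import (
	"context"
	"sync"

	"github.com/open-telemetry/opentelemetry-collector-contrib/pkg/stanza/entry"
)

// bufferedEmitter is an illustrative stand-in for the LogEmitter's buffering;
// names and fields do not match the real implementation.
type bufferedEmitter struct {
	mu       sync.Mutex
	batch    []*entry.Entry // held only in memory; lost on non-graceful shutdown
	maxBatch int            // the artificial limit, currently 100 in LogEmitter
	consume  func(ctx context.Context, entries []*entry.Entry) error
}

// Process buffers a single entry and flushes once the buffer is full.
// A real emitter also flushes on a timer; that detail is omitted here.
func (e *bufferedEmitter) Process(ctx context.Context, ent *entry.Entry) error {
	e.mu.Lock()
	e.batch = append(e.batch, ent)
	full := len(e.batch) >= e.maxBatch
	e.mu.Unlock()
	if full {
		return e.flush(ctx)
	}
	return nil
}

// flush hands the buffered entries to the next stage and clears the buffer.
func (e *bufferedEmitter) flush(ctx context.Context) error {
	e.mu.Lock()
	batch := e.batch
	e.batch = nil
	e.mu.Unlock()
	if len(batch) == 0 {
		return nil
	}
	return e.consume(ctx, batch)
}
```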

Describe the solution you'd like

From #31074 (comment):

Consider whether it's feasible to modify the file consumer to send logs in batches. It seems natural (to me) for the file consumer to emit all logs read from one file in one pass as a single batch. Having the file consumer emit batches of entries, as opposed to single entries, could alleviate the performance impact of removing batching from LogEmitter (possibly even improving the situation, as the "natural" batches could be larger than the current artificial 100-log limit). This would require modifying all Stanza operators to operate on batches of Stanza entries rather than a single entry at a time. It seems like a simple change to add a ProcessBatch method beside the existing Process method, with a for loop calling Process inside the ProcessBatch function.
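A rough sketch of that ProcessBatch idea, assuming a hypothetical operator type (the actual operator interface in pkg/stanza may differ): the default implementation simply loops over the batch and delegates to the existing Process method, so operators that can handle whole batches more efficiently could later override it.

```go
package sketch

import (
	"context"

	"github.com/open-telemetry/opentelemetry-collector-contrib/pkg/stanza/entry"
)

// someOperator stands in for any existing Stanza operator; the type is hypothetical.
type someOperator struct {
	handle func(context.Context, *entry.Entry) error
}

// Process is the existing single-entry path.
func (o *someOperator) Process(ctx context.Context, e *entry.Entry) error {
	return o.handle(ctx, e)
}

// ProcessBatch is the proposed addition: by default it simply loops over the
// batch and delegates each entry to the existing Process method.
func (o *someOperator) ProcessBatch(ctx context.Context, entries []*entry.Entry) error {
	for _, e := range entries {
		if err := o.Process(ctx, e); err != nil {
			return err
		}
	}
	return nil
}
```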

and further in #31074 (comment):

In my opinion this could be a really nice improvement regardless of any other changes. The data coming out of the receiver will be more organized and intuitive. In terms of implementation, I think we can add a []string buffer to the Reader struct, and then emit the entire buffer, either when it reaches a max size (maybe configurable at a later time) or whenever we hit EOF. We can maintain a tentative offset which is saved only when this buffer is actually emitted.
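A sketch of the buffering approach described in that comment, with the caveat that the struct and field names are assumptions rather than the actual fileconsumer Reader: tokens read from the file accumulate in a slice, the batch is emitted when it reaches a maximum size or at EOF, and only after a successful emit is the tentative offset promoted to the saved offset.

```go
package sketch

import "context"

// readerSketch illustrates the buffering idea; it does not match the actual
// fileconsumer Reader struct.
type readerSketch struct {
	buf             [][]byte // tokens read from the file but not yet emitted
	maxBatchSize    int      // emit when the buffer reaches this size (possibly configurable later)
	offset          int64    // last saved (checkpointed) offset
	tentativeOffset int64    // offset after the tokens currently held in buf
	emit            func(ctx context.Context, tokens [][]byte) error
}

// add appends a token and emits the whole buffer once it is full.
func (r *readerSketch) add(ctx context.Context, token []byte, newOffset int64) error {
	r.buf = append(r.buf, token)
	r.tentativeOffset = newOffset
	if len(r.buf) >= r.maxBatchSize {
		return r.flush(ctx)
	}
	return nil
}

// flush is called when the buffer is full and again at EOF; the tentative
// offset is only promoted to the saved offset after the batch is handed off.
func (r *readerSketch) flush(ctx context.Context) error {
	if len(r.buf) == 0 {
		return nil
	}
	if err := r.emit(ctx, r.buf); err != nil {
		return err
	}
	r.offset = r.tentativeOffset
	r.buf = nil
	return nil
}
```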

Describe alternatives you've considered

  • Remove the buffering in LogEmitter and live with the performance impact.
  • Instead of removing the buffering, persist it so that the data is not lost during a non-graceful shutdown.
  • Change the logic of marking logs as sent, so that logs are only marked as sent once they have actually been delivered successfully to the next consumer in the collector pipeline (see the sketch after this list).
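A minimal sketch of the last alternative, assuming a hypothetical helper and saveOffset callback (not existing collector APIs): the offset is persisted only after the next consumer in the pipeline has accepted the logs, so a failed or interrupted send leaves the offset untouched and the logs can be re-read.

```go
package sketch

import (
	"context"

	"go.opentelemetry.io/collector/consumer"
	"go.opentelemetry.io/collector/pdata/plog"
)

// emitAndCheckpoint illustrates marking logs as sent only on success.
// The function and the saveOffset callback are hypothetical.
func emitAndCheckpoint(ctx context.Context, next consumer.Logs, ld plog.Logs, saveOffset func() error) error {
	if err := next.ConsumeLogs(ctx, ld); err != nil {
		// Not marked as sent; the logs can be re-read from the last saved offset.
		return err
	}
	// Only after a successful hand-off is the offset persisted.
	return saveOffset()
}
```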

Additional context

See #31074 (comment) and the following comments.

@andrzej-stencel andrzej-stencel added enhancement New feature or request needs triage New item requiring triage labels Sep 27, 2024
Contributor

Pinging code owners:

See Adding Labels via Comments if you do not have permissions to add labels yourself.
