
logging: best practices for batching log entries #2402

@JoshFerge

Hello, I am working on implementing a logging solution and have a few questions that I could use help with. Here is some context:

  • We would like to log every request that our Node.js server processes to Stackdriver Logging.
  • Each request is in JSON format and contains many fields.
  • Each of our servers handles a high volume of requests per second (RPS).
  • Latency and performance are crucial to the service.
  • The documentation states, "While you may write a single entry at a time, batching multiple entries together is preferred to avoid reaching the queries per second limit."
  • It seems like a common pattern would be to wrap the logging API and only write entries once a certain number have accrued (see the sketch after this list).

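To clarify what I mean, here is a rough sketch of the wrapper I have in mind, assuming the @google-cloud/logging Node.js client. The class name, batch size, and flush interval are placeholders rather than recommendations (hence my last question below):

```ts
import { Logging, Log, Entry } from '@google-cloud/logging';

class BatchLogger {
  private buffer: Entry[] = [];
  private timer: NodeJS.Timeout | null = null;
  private readonly log: Log;

  constructor(
    logName: string,
    private readonly maxBatchSize = 100,     // placeholder threshold
    private readonly flushIntervalMs = 1000  // placeholder interval
  ) {
    this.log = new Logging().log(logName);
  }

  // Queue a JSON payload; flush once the buffer reaches maxBatchSize.
  write(payload: object): void {
    this.buffer.push(this.log.entry({ resource: { type: 'global' } }, payload));
    if (this.buffer.length >= this.maxBatchSize) {
      void this.flush();
    } else if (!this.timer) {
      // Make sure low-traffic periods still flush within flushIntervalMs.
      this.timer = setTimeout(() => void this.flush(), this.flushIntervalMs);
    }
  }

  // Write all buffered entries in a single API call.
  async flush(): Promise<void> {
    if (this.timer) {
      clearTimeout(this.timer);
      this.timer = null;
    }
    if (this.buffer.length === 0) return;
    const entries = this.buffer;
    this.buffer = [];
    await this.log.write(entries); // log.write accepts an array of entries
  }
}
```

The idea is that log.write() accepts an array of entries, so one API call covers the whole batch, while the timer keeps entries from sitting in the buffer indefinitely during low-traffic periods.
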
I have searched through the docs but haven't found solid answers to these questions:

  • Will gcloud-winston or gcloud-bunyan handle batching log entries?
  • Is there an example of batching log entries in a high throughput scenario?
  • What is the recommended number of entries to batch before writing?

Thank you for your help.
