Closed
Labels
- api: logging (issues related to the Cloud Logging API)
- api: logging-bunyan
- api: logging-winston
- type: question (request for information or clarification; not an issue)
Description
Hello, I am working on implementing a logging solution, and have a few questions that I could use help with. Here is some context:
- We would like to log every request that our Node.js server processes to Stackdriver Logging.
- This request is in JSON format and contains many fields.
- Each of our servers handles a high volume of requests per second (RPS).
- Latency and performance are crucial to the service.
- The documentation states, "While you may write a single entry at a time, batching multiple entries together is preferred to avoid reaching the queries per second limit."
- It appears a common pattern is to have a wrapper around the logging API that only writes entries once a certain number of them has accumulated.
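To make the wrapper pattern above concrete, here is a minimal sketch of the kind of thing we have in mind. The `writeFn` callback, the batch size of 50, and the flush interval are all placeholder assumptions for illustration, not values taken from the docs; in practice `writeFn` would be something like `log.write(entries)` from `@google-cloud/logging`.

```javascript
// Sketch of a batching wrapper around a bulk write function.
// Threshold and interval values are illustrative, not recommendations.
class BatchingLogger {
  constructor(writeFn, { maxEntries = 50, flushMs = 1000 } = {}) {
    this.writeFn = writeFn;       // e.g. (entries) => log.write(entries)
    this.maxEntries = maxEntries; // flush once this many entries accumulate
    this.buffer = [];
    // Also flush on a timer so entries still ship during low traffic.
    this.timer = setInterval(() => this.flush(), flushMs);
  }

  log(entry) {
    this.buffer.push(entry);
    if (this.buffer.length >= this.maxEntries) this.flush();
  }

  flush() {
    if (this.buffer.length === 0) return;
    const entries = this.buffer;
    this.buffer = [];
    this.writeFn(entries); // one API call for the whole batch
  }

  close() {
    clearInterval(this.timer);
    this.flush(); // drain anything left in the buffer
  }
}
```

The timer-based flush matters for our use case: a size-only trigger could hold the last few entries indefinitely on a quiet server.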
I have searched through the docs and haven't found a solid answer to these questions:
- Will @google-cloud/logging-winston or @google-cloud/logging-bunyan handle batching log entries?
- Is there an example of batching log entries in a high throughput scenario?
- What is the recommended number of entries to batch before writing?
Thank you for your help.