Conversation
Overall package size

Self size: 4.35 MB

Dependency sizes:

| name | version | self size | total size |
|------|---------|-----------|------------|
| import-in-the-middle | 1.15.0 | 127.66 kB | 856.24 kB |
| dc-polyfill | 0.1.10 | 26.73 kB | 26.73 kB |

🤖 This report was automatically generated by heaviest-objects-in-the-universe
Codecov Report ✅ All modified and coverable lines are covered by tests.

```
@@            Coverage Diff             @@
##           master    #7006      +/-   ##
==========================================
- Coverage   84.77%   84.76%   -0.01%
==========================================
  Files         521      521
  Lines       22149    22151       +2
==========================================
  Hits        18776    18776
- Misses       3373     3375       +2
```
Benchmarks

Benchmark execution time: 2025-12-17 12:38:44. Comparing candidate commit 2a18882 in PR branch. Found 0 performance improvements and 0 performance regressions! Performance is the same for 291 metrics, 29 unstable metrics.
…payloads

Replace the crude "delete all captures" approach with an intelligent pruning algorithm that selectively removes the largest and deepest leaf nodes while preserving the schema structure.

The algorithm prunes like so:

- Parses snapshots into a tree structure tracking JSON object positions
- Uses a priority queue to select nodes for pruning based on:
  1. Presence of `notCapturedReason: 'depth'`
  2. Depth level (deeper nodes pruned first)
  3. Presence of any `notCapturedReason`
  4. Size (larger nodes pruned first)
- Only prunes nodes at level 6 or deeper (`locals`)
- Promotes parent nodes when all children are pruned to reduce overhead
- Iteratively prunes if needed to reach target size
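The priority ordering described above can be sketched as a comparator. This is an illustrative sketch only, not the actual dd-trace-js implementation: the function name `pruneOrder` and the `level`/`size` fields on candidate nodes are assumptions, and a plain sort stands in for the priority queue.

```javascript
// Illustrative pruning priority (assumed node shape, not real internals):
// 1. nodes already truncated due to depth, 2. deeper nodes, 3. nodes with
// any notCapturedReason, 4. larger nodes. Negative return = `a` prunes first.
function pruneOrder (a, b) {
  const aDepth = a.notCapturedReason === 'depth'
  const bDepth = b.notCapturedReason === 'depth'
  if (aDepth !== bDepth) return aDepth ? -1 : 1      // 1. depth-truncated first
  if (a.level !== b.level) return b.level - a.level  // 2. deeper nodes first
  const aNC = a.notCapturedReason !== undefined
  const bNC = b.notCapturedReason !== undefined
  if (aNC !== bNC) return aNC ? -1 : 1               // 3. any truncation next
  return b.size - a.size                             // 4. larger nodes first
}

const candidates = [
  { name: 'a', level: 7, size: 100 },
  { name: 'b', level: 8, size: 50, notCapturedReason: 'depth' },
  { name: 'c', level: 7, size: 500 },
  { name: 'd', level: 9, size: 10, notCapturedReason: 'collectionSize' }
]
console.log(candidates.sort(pruneOrder).map(n => n.name)) // → [ 'b', 'd', 'c', 'a' ]
```

Note that the depth criterion (2) outranks the generic `notCapturedReason` criterion (3), which is why `d` is pruned before `c` despite both criteria applying to `d`.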

What does this PR do?
Implements an intelligent snapshot pruning algorithm for Dynamic Instrumentation/Live Debugger that selectively removes the largest and deepest leaf nodes from oversized payloads while preserving the overall schema structure. This replaces the previous approach which simply deleted all captured variables when a snapshot exceeded the 1MB size limit.
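The before/after behavior can be sketched roughly as follows. All names here (`handleOversized`, `MAX_SNAPSHOT_SIZE`, the callback shape) are hypothetical stand-ins for illustration, not the actual dd-trace-js internals; only the 1MB threshold comes from the PR description.

```javascript
// Hypothetical sketch: oversized snapshots are now pruned instead of
// having all their captures deleted.
const MAX_SNAPSHOT_SIZE = 1024 * 1024 // 1 MB limit from the PR description

function handleOversized (snapshot, serializedLength, pruneFn) {
  if (serializedLength <= MAX_SNAPSHOT_SIZE) return snapshot
  // Old behavior (replaced by this PR): delete all captures and
  // surface an error message to the user.
  // New behavior: selectively prune the largest/deepest leaf nodes.
  return pruneFn(snapshot)
}

const snap = { captures: { lines: {} } }
const out = handleOversized(snap, 2 * 1024 * 1024, s => ({ ...s, pruned: true }))
console.log(out.pruned) // → true
```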
Motivation
Align with how the other tracers prune large snapshots.
Previously, when a snapshot payload exceeded 1MB, we would delete all captures entirely and show an error message to users. This was a poor user experience because users would lose all captured variable data, even though most of the snapshot was likely still valuable.
With this new pruning algorithm, we can intelligently reduce the size of oversized snapshots by removing only the least valuable data (deepest, largest nodes, and nodes that were already truncated due to depth limits), allowing users to still see most of their captured variables even when dealing with large data structures.
Additional Notes
The pruning algorithm:

- Prioritizes nodes marked `notCapturedReason: 'depth'` (highest priority, since that data is already truncated)
- Then prioritizes deeper nodes over shallower ones
- Then prioritizes nodes with any `notCapturedReason` (any truncated data)
- Then prioritizes larger nodes over smaller ones
- Only prunes nodes at level 6 or deeper (`locals` and below)
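The iterative "prune until under the target size" step can be sketched like this. This is a minimal sketch under stated assumptions: `prune`, the stub-size estimate, and the node shape are all illustrative, and the candidate list is assumed pre-sorted by the pruning priority described above.

```javascript
// Hedged sketch of iterative pruning (not the real implementation):
// remove the highest-priority candidates one at a time, accounting for a
// small replacement stub, until the estimated payload fits the target.
function prune (sortedCandidates, totalSize, targetSize) {
  const STUB_SIZE = 40 // assumed rough size of a "pruned" placeholder node
  const removed = []
  for (const node of sortedCandidates) {
    if (totalSize <= targetSize) break
    totalSize -= node.size - STUB_SIZE // leaf replaced by a small stub
    removed.push(node.name)
  }
  return { totalSize, removed }
}

const result = prune(
  [
    { name: 'big', size: 600 },
    { name: 'mid', size: 300 },
    { name: 'small', size: 100 }
  ],
  1500, // estimated current payload size
  900   // target size
)
console.log(result) // → { totalSize: 680, removed: [ 'big', 'mid' ] }
```

Because pruning largest-first converges quickly, most oversized snapshots should need only a few removals before fitting, and `small` survives untouched here.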