Description
At the moment, the measurement aggregation frequency (step) is derived automatically from the time range and the number of datapoints requested. This mostly works, but it forces the requester to do a bit of arithmetic when they simply want data aggregated at a specific frequency.
Example use case: merit-nt data is collected at one-minute intervals. It can be quite noisy, so we want to plot 5-minute aggregated values on the UI graphs to smooth the time series a bit. Ideally, we would set a minStep parameter to 300 and it would just work. Instead, we currently have to take the time range of the request, divide it by 300 to get the number of points to ask for, and set the maxDatapoints parameter to that value.
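The current workaround can be sketched like this (the function name and signature are illustrative, not the service's actual API):

```python
def max_datapoints_for_step(start_ts: int, end_ts: int, desired_step: int) -> int:
    """Derive the maxDatapoints value that yields roughly the desired
    aggregation step (in seconds) for a given time range.

    Asking for at most range/step points forces the backend to
    aggregate at (at least) the desired frequency.
    """
    range_seconds = end_ts - start_ts
    return range_seconds // desired_step

# One hour of merit-nt data, aggregated at 5-minute (300 s) steps:
print(max_datapoints_for_step(0, 3600, 300))  # 12
```

This is exactly the arithmetic a minStep parameter would let the requester skip.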
minStep can, of course, be an optional parameter, and we may need some extra logic to handle cases where the user sets both minStep and maxDatapoints in a conflicting way.