From a1435691dbcd098e3d4cd522154e1987ff30db3b Mon Sep 17 00:00:00 2001 From: Dathan Liblik Date: Tue, 18 Aug 2020 13:38:34 -0400 Subject: [PATCH 1/6] Initial submission of read-cached-last-value RFC --- text/0000-read-cached-last-value.md | 157 ++++++++++++++++++++++++++++ 1 file changed, 157 insertions(+) create mode 100644 text/0000-read-cached-last-value.md diff --git a/text/0000-read-cached-last-value.md b/text/0000-read-cached-last-value.md new file mode 100644 index 0000000000..7afc6adfea --- /dev/null +++ b/text/0000-read-cached-last-value.md @@ -0,0 +1,157 @@ +- Start Date: 2020-08-18 +- Relevant Team(s): Ember.js +- RFC PR: +- Tracking: + +# Read Previous Cached Value by Convention from within @cached Getters + +## Summary + +Allow a conventional way for RFC 566's @cached getters to retrieve +the last-returned value from within the next execution of the getter +to reduce computation load for complex results that are incremental +in nature, such as this pattern: + +```js +import { tracked, CACHED } from '@glimmer/tracking'; + +class OnlineItemAuction { + /** + Bidders who join this particular item auction. + */ + @tracked bidders = []; + + /** + Each bidder is a source of bids with history that + must be maintained for the duration of the auction. + */ + @cached + get bidSources() { + // Available by convention for any @cached getter - has + // same key as the getter. + const _prevs = this[CACHED].bidSources || []; + const bidders = this.bidders; + + // Return prior ones, plus any additional items, each + // of which must maintain prior state. + return bidders.map(bidder => { + return _prevs.find(prev => prev.bidder === bidder) || + // expensive as it downloads initial state from server + new BidSource(bidder); + }); + } +} +``` + +## Motivation + +This is not an uncommon use case: in particular, incremental arrays +of complex-to-create instances are very common, especially where +the feeder array (bidders) increments over time like above. 
Despite this appearing at first glance to be a more advanced
use case, I have found empirically over the past 7 years
that this is one of a (shrinking) list of cases that arises quite
often and creates frustration among solid-JS but new-to-Ember
developers.
In this case, their experience leads them to recognize the need to
memoize, but there is no conventional solution. This seems
counterintuitive (to me as well) as frameworks like Ember are all
about optimizing computation and rendering of UI-bound data.

Perhaps more problematically, junior developers come up with some
pretty varied and hard-to-debug versions of this pattern and
that can really slow adoption and confidence (not to mention
code reviews and QA).

All levels would be helped tremendously by an out-of-the-box
conventional solution, and furthermore, this sits very well with
the @cached decorator concept (in fact, if @cached is in place,
the implementation is 2 extra lines of code if done as above).

## Detailed design

A getter that is marked for memoization using `@cached` can access
its last-returned value by reading a Symbol-driven (or otherwise
conventional) property on its own instance using the same key
as the property's name.
At each read of the getter, which is intercepted already by
@cached, the value returned by the @cached descriptor's handling
is assigned to the object instance, more or less as follows:

### Updates to @cached getter wrapper

- Existing step: Read value to return from cache or getter, as
  the case may be
- __New step:__ Store value in a simple hash stored at
  `this[CACHED]` in a key matching the name of the property
  getter:
  `this[CACHED][key] = value`
- Existing step: Return the value

### Resource Leaks

There is very little risk, but if necessary, `this[CACHED]` could
be wrapped in a WeakMap with one key (the instance) to break a
potential cycle if a cached value references its instance, but
that's generally a problem already - this doesn't make it worse.

Beyond that, the only overhead is the hash since the data itself is
presumably referenced elsewhere as well (i.e. garbage collection is
unlikely to be delayed on the data since it was gotten to be used).

## How we teach this

An extension of the `@cached` documentation is the natural place to
put this, using a strong and broadly-relatable anecdote
(like bidding in the example).
Building up an example of such a use-case from scratch would
certainly make the need for this feature very evident.

Assuming a knowledge of `@cached` (or any experience with it), it
doesn't take long to end up needing this additional feature as
`@cached` itself is designed to limit re-computation. Combine that
with an array-type property and you have a very likely demand
for this pattern.

Existing users are apt to pursue memoization topics if they have had
anything more than a trivial exposure to Ember (though perhaps
not by this unfortunately-less-than-colloquial name for a concept
that is itself quite practically simple).
Across the 20+ production-grade apps we've built on Ember, I can't
think of a single one where this wasn't important for performance
in at least some circumstance of the application.

No additional reorganization is needed for the guides.

## Drawbacks

Direct prior-cached-value-reading may come across as an exposed
internal working of the platform required to offset a design
flaw.
However this is a specialized case and therefore merits
a more specific solution.
Review of this RFC may lead to better ergonomics to address this.

Use of this as a crutch for an absence of better coding practices
may also arise, but this approach is not so ergonomic as to be
easier than "the right way" to handle a different use case.
In fact, it only applies if `@cached` is in use, limiting the
domain of potential misuse.

## Alternatives

- Let the developer continue to manage this on a case-by-case
  basis
- Provide a base class that implements this 'automagically';
  however, that's exactly the opposite direction Ember and those
  like it are moving in.

## Unresolved questions

- Is this the best ergonomic implementation?
- Is there an issue with the possibility that one getter can access
  another getter's prior value (since `this[CACHED]` can
  have any key read - is this actually a beneficial
  side-effect?)
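For concreteness, the Detailed design steps above can be sketched as follows. This is an illustrative approximation only: the `cached` function below is a stand-in for RFC 566's real `@cached` decorator (its autotracking-based memoization is omitted), `CACHED` is assumed to be a module-level Symbol, and the wrapper is applied manually via `Object.defineProperty` so the sketch runs without a decorator transform.

```js
// Illustrative sketch only - not the real @glimmer/tracking code.
const CACHED = Symbol('cached-last-values');

function cached(target, key, descriptor) {
  const getter = descriptor.get;
  return {
    ...descriptor,
    get() {
      // Existing step (simplified): obtain the value to return.
      const value = getter.call(this);
      // New step: store it in a simple hash under the property's key.
      if (this[CACHED] === undefined) {
        this[CACHED] = {};
      }
      this[CACHED][key] = value;
      // Existing step: return the value.
      return value;
    },
  };
}

// Manual application, standing in for `@cached get double() { ... }`:
class Counter {
  constructor() {
    this.n = 2;
  }
  get double() {
    // A real use case could consult this[CACHED].double here.
    return this.n * 2;
  }
}
Object.defineProperty(
  Counter.prototype,
  'double',
  cached(
    Counter.prototype,
    'double',
    Object.getOwnPropertyDescriptor(Counter.prototype, 'double')
  )
);
```

Reading `counter.double` now both returns the computed value and records it at `counter[CACHED].double`, which is the hash a subsequent getter run would consult.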
From a82b8d98b6da74bad53abad8a4a1d37c1e57c4bb Mon Sep 17 00:00:00 2001 From: Dathan Liblik Date: Tue, 18 Aug 2020 13:43:35 -0400 Subject: [PATCH 2/6] Add PR link to RFC header --- text/0000-read-cached-last-value.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/text/0000-read-cached-last-value.md b/text/0000-read-cached-last-value.md index 7afc6adfea..e7e5d3d5f2 100644 --- a/text/0000-read-cached-last-value.md +++ b/text/0000-read-cached-last-value.md @@ -1,6 +1,6 @@ - Start Date: 2020-08-18 - Relevant Team(s): Ember.js -- RFC PR: +- RFC PR: https://github.com/emberjs/rfcs/pull/656 - Tracking: # Read Previous Cached Value by Convention from within @cached Getters From 5b2ab8a07b27ed7f5f27ac94ab9b40661618c252 Mon Sep 17 00:00:00 2001 From: Dathan Liblik Date: Tue, 18 Aug 2020 15:15:02 -0400 Subject: [PATCH 3/6] Refine example and some text cleanup --- text/0000-read-cached-last-value.md | 26 +++++++++++++++++--------- 1 file changed, 17 insertions(+), 9 deletions(-) diff --git a/text/0000-read-cached-last-value.md b/text/0000-read-cached-last-value.md index e7e5d3d5f2..ef8f54280f 100644 --- a/text/0000-read-cached-last-value.md +++ b/text/0000-read-cached-last-value.md @@ -10,30 +10,38 @@ Allow a conventional way for RFC 566's @cached getters to retrieve the last-returned value from within the next execution of the getter to reduce computation load for complex results that are incremental -in nature, such as this pattern: +in nature, such as this example of an online auction where each +participant can see the bids sourced from the others: ```js -import { tracked, CACHED } from '@glimmer/tracking'; +import { tracked } from '@glimmer/tracking'; +import { cached, CACHED } from '...source module?'; class OnlineItemAuction { /** - Bidders who join this particular item auction. + Bidders who have joined this particular item auction - this array + is set from elsewhere and grows item by item over time. 
*/ @tracked bidders = []; /** - Each bidder is a source of bids with history that - must be maintained for the duration of the auction. + Each bidder becomes a source of bids for this auction + with history that is displayed locally for the duration + of the auction and which initially comes from the server + which is expensive, and so this array result should + not fully regenerate, but incrementally regenerate instead + as new bidders arrive. */ @cached get bidSources() { - // Available by convention for any @cached getter - has - // same key as the getter. + // Read the previously returned set of bid sources + // so that only new ones need be generated for newly + // arrived bidders const _prevs = this[CACHED].bidSources || []; const bidders = this.bidders; - // Return prior ones, plus any additional items, each - // of which must maintain prior state. + // Generate a BidSource from each bidder, but avoid doing + // it more than once per bidder. return bidders.map(bidder => { return _prevs.find(prev => prev.bidder === bidder) || // expensive as it downloads initial state from server From 662cc762ca5b580dfb883b2bd5f1052749588512 Mon Sep 17 00:00:00 2001 From: Dathan Liblik Date: Thu, 20 Aug 2020 10:06:21 -0400 Subject: [PATCH 4/6] Update to reflect `@cacheFor(propName)` approach --- text/0000-read-cached-last-value.md | 98 ++++++++++++++--------------- 1 file changed, 46 insertions(+), 52 deletions(-) diff --git a/text/0000-read-cached-last-value.md b/text/0000-read-cached-last-value.md index ef8f54280f..eb83fb18dd 100644 --- a/text/0000-read-cached-last-value.md +++ b/text/0000-read-cached-last-value.md @@ -7,23 +7,36 @@ ## Summary -Allow a conventional way for RFC 566's @cached getters to retrieve -the last-returned value from within the next execution of the getter -to reduce computation load for complex results that are incremental -in nature, such as this example of an online auction where each -participant can see the bids sourced from the others: 
Introduce the `@cacheFor(propName)` decorator to provide
+a conventional way for RFC 566's `@cached` getters to retrieve
+the last-returned value from within the next execution of the getter,
+enabling computation load reduction for complex results that are
+incremental in nature. Consider this example of an online auction where
+participants arrive slowly and each can see the bids sourced from the
+others (showing the proposed `@cacheFor` feature):

```js
import { tracked } from '@glimmer/tracking';
-import { cached, CACHED } from '...source module?';
+import { cached, cacheFor } from '...';

class OnlineItemAuction {
  /**
-    Bidders who have joined this particular item auction - this array
-    is set from elsewhere and grows item by item over time.
+    Bidders who have joined this particular item auction - this
+    array is set from elsewhere and grows item by item over time.
  */
  @tracked bidders = [];

+  /**
+    This (read-only) property returns the most recently-returned
+    result of the bidSources getter below, or an initial value
+    if the getter has not been called yet. This property is not
+    intended to be bound elsewhere - it is 'volatile' in that it is
+    calculated on demand when called by simply reading the last
+    value in the cache for the desired property (or the initial
+    value provided).
+  */
+  @cacheFor('bidSources') _bidSources = [];
+
  /**
    Each bidder becomes a source of bids for this auction
    with history that is displayed locally for the duration
    of the auction and which initially comes from the server
    which is expensive, and so this array result should
    not fully regenerate, but incrementally regenerate instead
    as new bidders arrive.
  */
  @cached
  get bidSources() {
    // Read the previously returned set of bid sources
    // so that only new ones need be generated for newly
    // arrived bidders
-    const _prevs = this[CACHED].bidSources || [];
+    const _prevs = this._bidSources;
    const bidders = this.bidders;

    // Generate a BidSource from each bidder, but avoid doing
    // it more than once per bidder.
    return bidders.map(bidder => {
      return _prevs.find(prev => prev.bidder === bidder) ||
        // expensive as it downloads initial state from server
@@ -73,41 +86,29 @@ that can really slow adoption and confidence (not to mention
code reviews and QA).
All levels would be helped tremendously by an out-of-the-box
-conventional solution, and furthermore, this sits very well with
-the @cached decorator concept (in fact, if @cached is in place,
-the implementation is 2 extra lines of code if done as above).
+conventional solution, and furthermore, this sits very well on top
+of the @cached decorator concept (in fact, if @cached is in place,
+the implementation is a few extra lines of code if done as above).

## Detailed design

A getter that is marked for memoization using `@cached` can access
-its last-returned value by reading a Symbol-driven (or otherwise
-conventional) property on its own instance using the same key
-as the property's name.
+its last-returned value by reading a property on its own instance
+identified with the proposed `@cacheFor(propName)` decorator.

-At each read of the getter, which is intercepted already by
-@cached, the value returned by the @cached descriptor's handling
-is assigned to the object instance, more or less as follows:
+The `@cacheFor(propName)` decorator defines a volatile getter
+that simply returns the last value by reading the cache. This is
+done on demand and no state is stored in the getter itself. If the
+cache is not found to exist yet for the related cached property,
+then the initial value (if provided) is returned instead.

-### Updates to @cached getter wrapper
-
-- Existing step: Read value to return from cache or getter, as
-  the case may be
-- __New step:__ Store value in a simple hash stored at
-  `this[CACHED]` in a key matching the name of the property
-  getter:
-  `this[CACHED][key] = value`
-- Existing step: Return the value
+> To Do: Put in sample implementation code later today

### Resource Leaks

-There is very little risk, but if necessary, `this[CACHED]` could
-be wrapped in a WeakMap with one key (the instance) to break a
-potential cycle if a cached value references its instance, but
-that's generally a problem already - this doesn't make it worse.
- -Beyond that, the only overhead is the hash since the data itself is -presumably referenced elsewhere as well (i.e. garbage collection is -unlikely to be delayed on the data since it was gotten to be used). +This is a very clean and lightweight solution - the cache exists +already (or not, if never created), but in either case, this +decorator does not create additional resource load or references. ## How we teach this @@ -118,34 +119,27 @@ Building up an example of such a use-case from scratch would certainly make the need for this feature very evident. Assuming a knowledge of `@cached` (or any experience with it), it -doesn't take long to end up needing this additional feature as +doesn't take long to end up looking for the proposed functionality as `@cached` itself is designed to limit re-computation. Combine that with an array-type property and you have a very likely demand -for this pattern. +for the pattern addressed by `@cacheFor(propName)`. Existing users are apt to pursue memoization topics if they have had -anything more than a trivial exposure to Ember (though perhaps -not by this unfortunately-less-than-colloqial name for a concept -that is itself quite practically simple). +anything more than a trivial exposure to Ember. Across the 20+ production-grade apps we've built on Ember, I can't think of a single one where this wasn't important for performance in at least some circumstance of the application. +Unfortunately, implementation of solutions for this issue can be +highly varied as Ember (currently) leaves incremental state as +a matter for the end-developer to address. No additional reorganization is needed for the guides. ## Drawbacks -Direct prior-cached-value-reading may come across as an exposed -internal working of the platform required to offset a design -flaw. -However this is a specialized case and therefore merits -a more specific solution. -Review of this RFC may lead to better ergonomics to address this. 
- Use of this as a crutch for an absence of better coding practices -may also arise, but this approach is not so ergonimic as to be -easier than "the right way" to handle a different use case. -In fact, it only applies if `@cached` is in use, limiting the +may arise, but this approach is not really beneficial to other cases. +As proposed, it only applies if `@cached` is in use, limiting the domain of potential misuse. ## Alternatives @@ -160,6 +154,6 @@ domain of potential misuse. - Is this the best ergonomic implementation? - Is there an issue with the possibility that one getter can access - another getter's prior value (since `this[CACHED]` can - have any key read - is this actually a beneficial + another getter's prior value (since the prior value can + be read anywhere - is this actually a beneficial side-effect?) From 5bfa818b1409d861f255063a0791756dd8b49ccc Mon Sep 17 00:00:00 2001 From: Dathan Liblik Date: Thu, 20 Aug 2020 10:10:56 -0400 Subject: [PATCH 5/6] Updated filename to reflect assigned RFC number --- ...0-read-cached-last-value.md => 0656-read-cached-last-value.md} | 0 1 file changed, 0 insertions(+), 0 deletions(-) rename text/{0000-read-cached-last-value.md => 0656-read-cached-last-value.md} (100%) diff --git a/text/0000-read-cached-last-value.md b/text/0656-read-cached-last-value.md similarity index 100% rename from text/0000-read-cached-last-value.md rename to text/0656-read-cached-last-value.md From 849e0cc306f98aa4f5a3143f7f8929e0d5783e90 Mon Sep 17 00:00:00 2001 From: Dathan Date: Thu, 20 Aug 2020 10:14:23 -0400 Subject: [PATCH 6/6] Update title to reflect recent changes --- text/0656-read-cached-last-value.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/text/0656-read-cached-last-value.md b/text/0656-read-cached-last-value.md index eb83fb18dd..a3fecc0496 100644 --- a/text/0656-read-cached-last-value.md +++ b/text/0656-read-cached-last-value.md @@ -3,7 +3,7 @@ - RFC PR: https://github.com/emberjs/rfcs/pull/656 - 
Tracking:

-# Read Previous Cached Value by Convention from within @cached Getters
+# Read Previous @cached Value with New @cacheFor Decorator

## Summary
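PATCH 4/6 leaves the sample implementation as a to-do; the sketch below illustrates one way the proposed `@cacheFor(propName)` behavior could work. It is hypothetical, not the real `@glimmer/tracking` implementation: a module-level `LAST_VALUES` WeakMap stands in for `@cached`'s internal cache, the initial value is passed as an argument rather than read from the decorated field's initializer as in the RFC example, and the legacy-decorator-style functions are applied manually so the sketch runs without a decorator transform.

```js
// Hypothetical sketch only - not the real @glimmer/tracking API.
// Maps each instance to a hash of { propName: last returned value }.
const LAST_VALUES = new WeakMap();

// Stand-in for @cached: wraps a getter and records each returned value.
// (The real @cached also memoizes via autotracking; omitted here.)
function cached(target, key, descriptor) {
  const getter = descriptor.get;
  return {
    ...descriptor,
    get() {
      const value = getter.call(this);
      let store = LAST_VALUES.get(this);
      if (store === undefined) {
        store = {};
        LAST_VALUES.set(this, store);
      }
      store[key] = value;
      return value;
    },
  };
}

// Sketch of @cacheFor(propName): a volatile getter returning the last
// value recorded for propName, or the provided initial value.
function cacheFor(propName, initialValue) {
  return (target, key) => ({
    enumerable: false,
    configurable: true,
    get() {
      const store = LAST_VALUES.get(this);
      return store && propName in store ? store[propName] : initialValue;
    },
  });
}

// Manual application, mirroring the auction example (BidSource is
// simplified to a plain { bidder } object for the sketch):
class Auction {
  constructor() {
    this.bidders = [];
  }
  get bidSources() {
    const prevs = this._bidSources; // previously returned sources
    return this.bidders.map(
      bidder => prevs.find(p => p.bidder === bidder) || { bidder }
    );
  }
}
Object.defineProperty(
  Auction.prototype,
  'bidSources',
  cached(
    Auction.prototype,
    'bidSources',
    Object.getOwnPropertyDescriptor(Auction.prototype, 'bidSources')
  )
);
Object.defineProperty(
  Auction.prototype,
  '_bidSources',
  cacheFor('bidSources', [])(Auction.prototype, '_bidSources')
);
```

With this wiring, re-reading `bidSources` after a new bidder arrives reuses the previously built sources and constructs a source only for the newcomer - the incremental behavior the RFC is after.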