Conversation
Codecov Report
❌ Patch coverage is
Additional details and impacted files

```diff
@@ Coverage Diff @@
##             main    #1096      +/-   ##
==========================================
- Coverage   87.57%   87.55%   -0.02%
==========================================
  Files          55       55
  Lines        7500     7580      +80
  Branches     7500     7580      +80
==========================================
+ Hits         6568     6637      +69
- Misses        633      640       +7
- Partials      299      303       +4
```
Pull request overview
This PR implements investment limits for processes using the process_investment_constraints.csv input file that was introduced in PR #1020. Investment limits cap the amount each agent can invest in a process based on annual addition limits, scaled by the agent's share of commodity demand and the number of years elapsed since the previous milestone year.
Changes:
- Added `calculate_investment_limits_for_candidates` function to compute per-agent investment limits
- Changed `addition_limit` field type from `f64` to `Capacity` for better type safety
- Added `get_annual_addition_limit` method to support future extension with growth and total capacity limits
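The scaling described above can be sketched as follows. This is an illustrative stand-in, not the PR's exact API: the function name and plain `f64` arguments are assumptions (the real code uses `Capacity` and `Dimensionless` wrapper types).

```rust
// Hypothetical sketch: an agent's investment limit is the annual addition
// limit scaled by the agent's share of commodity demand and the number of
// years elapsed since the previous milestone year.
fn investment_limit(annual_addition_limit: f64, agent_share: f64, years_since_prev: u32) -> f64 {
    annual_addition_limit * agent_share * years_since_prev as f64
}

fn main() {
    // e.g. a limit of 10 units/year, a 50% demand share and a 5-year milestone gap
    let limit = investment_limit(10.0, 0.5, 5);
    assert_eq!(limit, 25.0);
}
```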
Reviewed changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated no comments.
| File | Description |
|---|---|
| src/simulation/investment.rs | Implements the core logic for calculating and applying investment limits to candidate assets, integrates limits into the asset selection process, and adds comprehensive tests |
| src/process.rs | Changes addition_limit type to Capacity and adds get_annual_addition_limit method for future extensibility |
| src/input/process/investment_constraints.rs | Updates type handling throughout to use Capacity instead of f64 and updates all tests accordingly |
| examples/two_regions/process_investment_constraints.csv | Adds example investment constraint data for the two_regions example |
| examples/muse1_default/process_investment_constraints.csv | Adds example investment constraint data for the muse1_default example |
Also, forgot to mention in the description, but I've added investment limits for the And one other thing I've just realised is that these limits won't necessarily be complied with in the circularities algorithm, since we allow capacities to change after the (now constrained) investment stage within
Converting back to draft - some things I want to tidy up
@alexdewar This is getting a bit fiddly. Since investment limits depend on an agent property (namely the agent's It would be so much easier if assets stored the
Yes I do! There have been a couple of times where I started doing that for other reasons, but ended up not needing it, so never opened a PR. I haven't looked at this PR yet, but we have a couple of options. We could merge as is (possibly adding a warning) or leave this branch here and I can fix it up once we've done the other refactor. Let's talk about it in the meeting.
@alexdewar Sounds good. I've added the constraints for the circularities algorithm in b2eb10d as I wanted this to be complete, albeit a bit messy. If we go down the route of what we've suggested above, then we could just retrieve the investment limit directly for each asset in But, with that in mind, I think this is ready to review
Pull request overview
Copilot reviewed 7 out of 7 changed files in this pull request and generated 6 comments.
Pull request overview
Copilot reviewed 7 out of 7 changed files in this pull request and generated 3 comments.
```diff
 AssetCapacity::Continuous(cap) => {
     // Continuous capacity: capacity variable represents total capacity
     let lower = ((1.0 - capacity_margin) * cap.value()).max(0.0);
-    let upper = (1.0 + capacity_margin) * cap.value();
+    let mut upper = (1.0 + capacity_margin) * cap.value();
+    if let Some(limit) = capacity_limit {
+        upper = upper.min(limit.total_capacity().value());
+    }
     problem.add_column(coeff.value(), lower..=upper)
 }
 AssetCapacity::Discrete(units, unit_size) => {
     // Discrete capacity: capacity variable represents number of units
     let lower = ((1.0 - capacity_margin) * units as f64).max(0.0);
-    let upper = (1.0 + capacity_margin) * units as f64;
+    let mut upper = (1.0 + capacity_margin) * units as f64;
+    if let Some(limit) = capacity_limit {
+        upper = upper.min(limit.n_units().unwrap() as f64);
+    }
     problem.add_integer_column((coeff * unit_size).value(), lower..=upper)
```
When capacity_limit is smaller than the current capacity (or smaller than the lower bound implied by capacity_margin), upper can become < lower, which can make the optimisation problem invalid/infeasible or trigger a solver error. After applying the limit, clamp the bounds so lower <= upper (e.g. reduce lower to upper, or choose a consistent policy such as fixing the variable at the limited upper bound).
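The reviewer's suggested clamping could look like the following minimal sketch. The function name and plain `f64` types are assumptions for illustration; the real code works with `Capacity` values and solver-specific bound ranges.

```rust
// Hypothetical helper: apply an optional capacity limit to the upper
// bound, then clamp the lower bound so that lower <= upper always holds
// and the solver never receives an inverted (infeasible) range.
fn clamped_bounds(lower: f64, upper: f64, limit: Option<f64>) -> (f64, f64) {
    let upper = limit.map_or(upper, |l| upper.min(l));
    // If the limit undercuts the margin-implied lower bound, reduce lower
    // to match, effectively fixing the variable at the limited upper bound.
    (lower.min(upper), upper)
}

fn main() {
    // A limit (3.0) below the margin-implied lower bound (4.0) would
    // otherwise produce the invalid range 4.0..=3.0.
    let (lo, hi) = clamped_bounds(4.0, 6.0, Some(3.0));
    assert!(lo <= hi);
    assert_eq!((lo, hi), (3.0, 3.0));
}
```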
```rust
// Retrieve installable capacity limits for flexible capacity assets.
let key = (commodity_id.clone(), year);
let mut agent_share_cache = HashMap::new();
let capacity_limits = flexible_capacity_assets
    .iter()
    .filter_map(|asset| {
        let agent_id = asset.agent_id().unwrap();
        let agent_share = *agent_share_cache
            .entry(agent_id.clone())
            .or_insert_with(|| model.agents[agent_id].commodity_portions[&key]);
        asset
            .max_installable_capacity(agent_share)
            .map(|max_capacity| (asset.clone(), max_capacity))
    })
```
In cycle dispatch, capacity_limits uses key = (commodity_id.clone(), year) from the current market when looking up agent_share, but flexible_capacity_assets can include Selected assets from earlier markets with different primary commodities. This mis-scales max_installable_capacity for those assets (potentially clamping their capacity to near-zero or above the real limit) and can make cycle balancing infeasible. Compute the commodity portion per asset using the asset’s own primary output commodity (or equivalent process primary commodity) when indexing commodity_portions, and handle missing portions with a clear error instead of implicitly using the current market’s key.
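The "clear error instead of implicit indexing" part of this suggestion could be sketched as below. All names here are hypothetical simplifications (`String` keys instead of the model's ID types); the point is replacing `commodity_portions[&key]`, which panics on a missing key, with a fallible lookup.

```rust
use std::collections::HashMap;

// Hypothetical fallible lookup: return a descriptive error when the agent
// has no commodity portion for this (commodity, year), rather than
// panicking via direct indexing.
fn agent_share(
    portions: &HashMap<(String, u32), f64>,
    commodity_id: &str,
    year: u32,
) -> Result<f64, String> {
    portions
        .get(&(commodity_id.to_string(), year))
        .copied()
        .ok_or_else(|| format!("no commodity portion for {commodity_id} in {year}"))
}

fn main() {
    let mut portions = HashMap::new();
    portions.insert(("heat".to_string(), 2030), 0.4);
    assert_eq!(agent_share(&portions, "heat", 2030), Ok(0.4));
    assert!(agent_share(&portions, "electricity", 2030).is_err());
}
```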
alexdewar left a comment:
We should also remove the note that the process_investment_constraints.csv file is unused from the docs.
I'm going to implement some of my own suggestions here then merge this, because it looks fine.
```diff
 regions: String,
 commission_years: String,
-addition_limit: f64,
+addition_limit: Capacity,
```
I think this should actually be CapacityPerYear
```rust
let scaled_limit = record.addition_limit * Dimensionless(years_since_prev as f64);

let constraint = Rc::new(ProcessInvestmentConstraint {
    addition_limit: Some(scaled_limit),
```
@Aurashk Do you know if there was a particular motivation in making addition_limit optional? As far as I can see, all the limits that are None could just not be put into the map, with the same effect.
Pull request overview
Copilot reviewed 7 out of 7 changed files in this pull request and generated 3 comments.
```rust
// Retrieve installable capacity limits for flexible capacity assets.
let key = (commodity_id.clone(), year);
let mut agent_share_cache = HashMap::new();
let capacity_limits = flexible_capacity_assets
    .iter()
    .filter_map(|asset| {
```
flexible_capacity_assets contains Selected assets from all previously visited markets in the cycle, but key = (commodity_id.clone(), year) uses only the current loop’s commodity. That means capacity limits/agent shares may be computed against the wrong commodity for many assets (and can panic if the agent has no commodity_portions entry for this commodity/year). Consider deriving the commodity key per asset (e.g., from the asset’s primary output commodity) instead of using the current market’s commodity_id.
```rust
let agent_share = *agent_share_cache
    .entry(agent_id.clone())
    .or_insert_with(|| model.agents[agent_id].commodity_portions[&key]);
```
agent_share_cache is keyed only by agent_id, but the commodity portion is commodity-specific. In a multi-commodity cycle, this will reuse the wrong share across different commodities (and the commodity_portions[&key] indexing can panic when the key is absent). Cache by (agent_id, commodity_id) and avoid direct indexing by using get() with an error/skip path.
Suggested change:

```diff
-let agent_share = *agent_share_cache
-    .entry(agent_id.clone())
-    .or_insert_with(|| model.agents[agent_id].commodity_portions[&key]);
+let cache_key = (agent_id.clone(), commodity_id.clone());
+// Look up in cache first; if missing, safely query commodity_portions.
+let agent_share = if let Some(share) = agent_share_cache.get(&cache_key) {
+    *share
+} else {
+    let agent = &model.agents[agent_id];
+    let share = match agent.commodity_portions.get(&key) {
+        Some(s) => *s,
+        None => {
+            // No defined share for this (agent, commodity, year); skip this asset.
+            return None;
+        }
+    };
+    agent_share_cache.insert(cache_key, share);
+    share
+};
```
This isn't right. The commodity is constant over the region where the cache is used
```diff
 let lower = ((1.0 - capacity_margin) * cap.value()).max(0.0);
-let upper = (1.0 + capacity_margin) * cap.value();
+let mut upper = (1.0 + capacity_margin) * cap.value();
+if let Some(limit) = capacity_limit {
+    upper = upper.min(limit.total_capacity().value());
+}
```
Applying capacity_limit can make upper smaller than lower (e.g., if a limit is below the asset’s current capacity), which creates an invalid bounds range for the solver. Add a check to ensure lower <= upper after applying limits (e.g., clamp lower to upper or return a structured error when the limit is incompatible with the current capacity).
Hmm, that's a good point. I've got some additional concerns about this bit of code, but I'd like to merge this PR, so I've opened an issue (#1122) and we can revisit later.
I'm going to merge this now so it doesn't get out of sync with
Description
Since #1020 we have a file allowing users to specify process investment constraints, but until now these were not being used.
In this PR, I've added a function `calculate_investment_limits_for_candidates` which uses values from this input file to cap the amount of investment allowed for each process. These limits are calculated per-agent based on the agent's share of the commodity demand and the number of years elapsed since the previous milestone year (see discussion in #124).

The implementation was ultimately quite easy because we already have a `remaining_candidate_capacity` map to cap investments (currently based on demand-limiting capacity), so we just needed to modify this map based on the provided investment limits (if any). An alternative would have been to modify the `capacity` field of candidate assets, but I decided against this as it would modify the tranching behaviour, which I didn't want.

Related things left to do:
Fixes #1069
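The description's approach of tightening the existing capacity map rather than editing candidate assets can be sketched as follows. This is a simplified stand-in: `String` keys and `f64` values replace the real ID and `Capacity` types, and the function name is hypothetical.

```rust
use std::collections::HashMap;

// Hedged sketch: take the existing per-process remaining-capacity map
// (currently derived from demand limiting) and tighten each entry with
// the corresponding investment limit, where one was provided. Processes
// without a limit are left untouched.
fn apply_investment_limits(
    remaining_candidate_capacity: &mut HashMap<String, f64>,
    investment_limits: &HashMap<String, f64>,
) {
    for (process_id, remaining) in remaining_candidate_capacity.iter_mut() {
        if let Some(limit) = investment_limits.get(process_id) {
            *remaining = remaining.min(*limit);
        }
    }
}

fn main() {
    let mut remaining = HashMap::from([
        ("PROC1".to_string(), 5.0),
        ("PROC2".to_string(), 2.0),
    ]);
    let limits = HashMap::from([("PROC1".to_string(), 3.0)]);
    apply_investment_limits(&mut remaining, &limits);
    assert_eq!(remaining["PROC1"], 3.0); // capped by the limit
    assert_eq!(remaining["PROC2"], 2.0); // no limit, unchanged
}
```

A side benefit of this shape, as the description notes, is that the candidates' `capacity` field is never modified, so tranching behaviour is unaffected.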
Type of change
Key checklist
- `$ cargo test`
- `$ cargo doc`
- present in the previous release
Further checks