
Peak load management heuristic control #641

Draft
jaredthomas68 wants to merge 41 commits into NatLabRockies:develop from jaredthomas68:peakload

Conversation

@jaredthomas68
Collaborator

Peak load management heuristic control

This PR adds peak load management heuristic control to H2I. It does not perform demand dispatch; instead, it dispatches based on peaks in the provided load and rules defined by the user.
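For orientation, here is a minimal sketch of the rule-based idea — discharge storage in a window around each detected load peak, and treat the rest of the day as a charging opportunity. All names and values below are illustrative, not the PR's actual API:

```python
# Illustrative only: find the peak of a load profile and mark a discharge
# window around it; timesteps outside the window are charging opportunities.
load = [2.0, 3.0, 8.0, 9.0, 4.0, 2.0, 1.0]  # example load, one value per timestep
peak_idx = max(range(len(load)), key=load.__getitem__)  # timestep of the peak
advance = 1  # begin discharging one timestep before the peak

dispatch = [
    "discharge" if abs(i - peak_idx) <= advance else "charge"
    for i in range(len(load))
]
```

The real controller layers user-defined rules (peak ranges, event limits, charge delays) on top of this basic peak-window idea.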

Section 1: Type of Contribution

  • Feature Enhancement
    • Framework
    • New Model
    • Updated Model
    • Tools/Utilities
    • Other (please describe):
  • Bug Fix
  • Documentation Update
  • CI Changes
  • Other (please describe):

Section 2: Draft PR Checklist

  • Open draft PR
  • Describe the feature that will be added
  • Fill out TODO list steps
  • Describe requested feedback from reviewers on draft PR
  • Complete Section 7: New Model Checklist (if applicable)

TODO:

  • Step 1
  • Step 2

Type of Reviewer Feedback Requested (on Draft PR)

I am primarily looking for high-level structural and implementation feedback at this point.
Structural feedback:

Implementation feedback:

Other feedback:

Section 3: General PR Checklist

  • PR description thoroughly describes the new feature, bug fix, etc.
  • Added tests for new functionality or bug fixes
  • Tests pass (If not, and this is expected, please elaborate in the Section 6: Test Results)
  • Documentation
    • Docstrings are up-to-date
    • Related docs/ files are up-to-date, or added when necessary
    • Documentation has been rebuilt successfully
    • Examples have been updated (if applicable)
  • CHANGELOG.md
    • At least one complete sentence has been provided to describe the changes made in this PR
    • After the above, a hyperlink has been provided to the PR using the following format:
      "A complete thought. [PR XYZ](https://github.com/NatLabRockies/H2Integrate/pull/XYZ)", where
      XYZ should be replaced with the actual number.

Section 3: Related Issues

Section 4: Impacted Areas of the Software

Section 4.1: New Files

  • path/to/file.extension
    • method1: What and why something was changed in one sentence or less.

Section 4.2: Modified Files

  • path/to/file.extension
    • method1: What and why something was changed in one sentence or less.

Section 5: Additional Supporting Information

Section 6: Test Results, if applicable

Section 7 (Optional): New Model Checklist

  • Model Structure:
    • Follows established naming conventions outlined in docs/developer_guide/coding_guidelines.md
    • Used attrs class to define the Config to load in attributes for the model
      • If applicable: inherit from BaseConfig or CostModelBaseConfig
    • Added: initialize() method, setup() method, compute() method
      • If applicable: inherit from CostModelBaseClass
  • Integration: Model has been properly integrated into H2Integrate
    • Added to supported_models.py
    • If a new commodity_type is added, update create_financial_model in h2integrate_model.py
  • Tests: Unit tests have been added for the new model
    • Pytest-style unit tests
    • Unit tests are in a "test" folder within the folder a new model was added to
    • If applicable add integration tests
  • Example: If applicable, a working example demonstrating the new model has been created
    • Input file comments
    • Run file comments
    • Example has been tested and runs successfully in test_all_examples.py
  • Documentation:
    • Write docstrings using the Google style
    • Model added to the main models list in docs/user_guide/model_overview.md
      • Model documentation page added to the appropriate docs/ section
      • <model_name>.md is added to the _toc.yml

@elenya-grant elenya-grant self-requested a review March 31, 2026 19:43
Collaborator

@elenya-grant elenya-grant left a comment

just left some initial comments/questions - haven't done a deep dive yet (so some of my questions/comments may be silly or I'll be able to answer during a deep-dive) but plan to do a deeper review by Thursday morning. I only looked at the changes and additions to the control classes but will review the tests in the second review I do.

Overall looks like a great start - most of my comments were small or were questions!

# determine demand_profile peaks using defaults of daily peaks inside peak_range
# for the full simulation but respecting the peak range specified in the config
self.secondary_peaks_df = self.get_peaks(
    demand_profile=self.config.demand_profile,
Collaborator

should this be inputs[f"{self.config.commodity}_demand"] instead of the demand from the config?

Collaborator Author

Some of the reasoning for this is in my comment here: #641 (comment). I guess I can split up demand and time stamp as separate inputs so we can use the input like the other controllers.

Collaborator Author

I have split up demand and date_time

@jaredthomas68 jaredthomas68 requested a review from vijay092 April 2, 2026 16:11
Collaborator

@elenya-grant elenya-grant left a comment

Howdy! I gave this a deeper look! I think I'm a little confused about how this method works (I didn't try to understand it super hard yet), so most of my comments were nitpicks or questions. My only blocking comment is about the error being removed from load_plant_yaml - I don't think that error message should be removed at this time.

I think a visual (or two) may be nice to explain some of the inputs to the controller - a doc page with some visuals and an explanation of the inputs would be super helpful in making it easier for users to understand how to change the control input parameters based on their use-case.

dt_seconds = int(simulation_cfg["dt"])

# Optional start_time in config; default to a fixed reference timestamp.
start_time = simulation_cfg.get("start_time", "2000-01-01 00:00:00")
Collaborator

the start-time format in the plant config is defined as being: mm/dd/yyyy HH:MM:SS or mm/dd HH:MM:SS and defaults to 01/01 00:30:00 (doesn't include a year because it was initially going to be used with resource data and the year may change based on the resource year). The format here does not match - do you think we could make sure that the format is consistent mm/dd/yyyy instead of yyyy-mm-dd?

I made a similar function when I was starting on the resource models (it never made it in) but it handles whether a year was added or not:

from datetime import datetime, timezone, timedelta

def make_time_profile(
    start_time: str,
    dt: float | int,
    n_timesteps: int,
    time_zone: int | float,
    start_year: int | None = None,
):
    """Generate a time-series profile for a given start time, time step interval, and
    number of timesteps, with a timezone signature.

    Args:
        start_time (str): simulation start time formatted as 'mm/dd/yyyy HH:MM:SS' or
            'mm/dd HH:MM:SS'
        dt (float | int): time step interval in seconds.
        n_timesteps (int): number of timesteps in a simulation.
        time_zone (int | float): timezone offset from UTC in hours.
        start_year (int | None, optional): year to use for start-time. if start-time
            is formatted as 'mm/dd/yyyy HH:MM:SS' then will overwrite original year.
            If None, the year will default to 1900 if start-time is formatted as 'mm/dd HH:MM:SS'.
            Defaults to None.

    Returns:
        list[datetime]: list of datetime objects that represents the time profile
    """

    tz_utc_offset = timedelta(hours=time_zone)
    tz = timezone(offset=tz_utc_offset)
    tz_str = str(tz).replace("UTC", "").replace(":", "")
    if tz_str == "":
        tz_str = "+0000"
    # timezone formatted as ±HHMM[SS[.ffffff]]
    start_time_w_tz = f"{start_time} ({tz_str})"
    if len(start_time.split("/")) == 3:
        if start_year is not None:
            start_time_month_day_year, start_time_time = start_time.split(" ")
            start_time_month_day = "/".join(i for i in start_time_month_day_year.split("/")[:-1])
            start_time_w_tz = f"{start_time_month_day}/{start_year} {start_time_time} ({tz_str})"

        t = datetime.strptime(start_time_w_tz, "%m/%d/%Y %H:%M:%S (%z)")
    elif len(start_time.split("/")) == 2:
        if start_year is not None:
            start_time_month_day, start_time_time = start_time.split(" ")
            start_time_w_tz = f"{start_time_month_day}/{start_year} {start_time_time} ({tz_str})"
            t = datetime.strptime(start_time_w_tz, "%m/%d/%Y %H:%M:%S (%z)")
        else:
            # NOTE: year will default to 1900
            t = datetime.strptime(start_time_w_tz, "%m/%d %H:%M:%S (%z)")
    time_profile = [None] * n_timesteps
    time_step = timedelta(seconds=dt)
    for i in range(n_timesteps):
        time_profile[i] = t
        t += time_step
    return time_profile
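As a standalone illustration of the two accepted start-time formats, here is a simplified sketch that drops the timezone handling from the function above (the helper name is hypothetical, not part of the PR):

```python
from datetime import datetime, timedelta

def parse_start_time(start_time: str) -> datetime:
    """Parse 'mm/dd/yyyy HH:MM:SS' or 'mm/dd HH:MM:SS' (year defaults to 1900)."""
    n_date_fields = len(start_time.split(" ")[0].split("/"))
    fmt = "%m/%d/%Y %H:%M:%S" if n_date_fields == 3 else "%m/%d %H:%M:%S"
    return datetime.strptime(start_time, fmt)

# Build a short hourly time profile from a year-less start time
start = parse_start_time("01/01 00:30:00")
profile = [start + timedelta(seconds=3600) * i for i in range(3)]
```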

Collaborator Author

Thanks for the extended function. I personally much prefer the international format "yyyy-mm-dd", but I understand there being an existing approach. I went ahead and changed the function to handle the timezone and added a function to make the time series with direct inputs rather than a config. I do not have much use for the seconds value of a timestamp, but I did adjust it to return the Python datetime format. We can continue to discuss the exact desired format, but I would prefer to build the timeseries in a standard date-time format and allow users and developers to convert to lists of integers or whatever other format they need from there.

dt_seconds = int(simulation_cfg["dt"])

# Optional start_time in config; default to a fixed reference timestamp.
start_time = simulation_cfg.get("start_time", "2000-01-01 00:00:00")
Collaborator

if the plant config is loaded using load_plant_yaml(), then the start time should always be included. Aka - I don't think we should have default values in both the modeling schema and this function. But - I'm happy to see a function like this get in!

Collaborator Author

Good feedback. I have added a default timezone and removed the default start_time in this function

@@ -0,0 +1,11 @@
name: plant_config
description: Demonstrates multivariable streams with a gas combiner
Collaborator

update description in plant config

Collaborator Author

Done

commodity: electricity
commodity_rate_units: kW
max_charge_rate: 2500.0 # kW/time step, 1, 2.5, or 5 MW
max_capacity: 10000.0 # kWh, 80 MWh
Collaborator

comment for max_capacity is wrong, should say 10 MWh

Collaborator Author

Done

max_supervisor_events: (int | None, optional): The maximum number of discharge events
allowed for the supervisor in the period specified in max_supervisor_event_period,
or across all time steps if max_supervisor_event_period is None.

Collaborator

could you add in the other attributes to the doc string? Like peak_range, advance_discharge_period, delay_charge_period, allow_charge_in_peak_range, and min_peak_proximity?

Collaborator Author

Done

},
)

def __attrs_post_init__(self):
Collaborator

should the dictionary inputs be checked in the __attrs_post_init__ method to make sure they have the right keys?

Collaborator Author

Done


self.get_allowed_discharge()

@staticmethod
Collaborator

why is this a static method rather than just a normal method? (same with _normalize_peak_range?)

Collaborator Author

These are static methods because they need to be called with different attributes of the class each time, rather than the exact same attributes in the same order. This also means the output does not have a consistent target.

# Dispatch strategy outline:
# - Discharge: Starting when time_to_peak <= advance_discharge_period
# * Discharge at max rate (or less to reach targets)
# * Stop discharging only when SOC reaches min_soc
Collaborator

could these inline comments get moved closer to where that logic is represented in the code?

Collaborator Author

I moved them to the docstring as I think that makes more sense.


This method applies an open-loop storage control strategy to balance the
commodity demand and input flow. When input exceeds demand, excess commodity
is used to charge storage (subject to rate, efficiency, and SOC limits). When
Collaborator

The description of this compute method makes it seem really similar to the DemandOpenLoopStorageController and the HeuristicLoadFollowingControl - could you update the docstring to explain the peak-shaving novelty of this?

Collaborator Author

Done

@vijay092
Collaborator

vijay092 commented Apr 2, 2026

Very exciting work @jaredthomas68! Thanks for putting this together in such a short time! I did a full pass through h2integrate/control/control_strategies/storage/plm_openloop_storage_controller.py and left comments wherever I spotted small issues. Feel free to address them as you see fit.

dispatch_priority_demand_profile: str = field(
validator=contains(["demand_profile", "demand_profile_supervisor"]),
)
max_supervisor_events: int | None = (field(default=None),)
Collaborator

Is this supposed to be a tuple?

Collaborator Author

Nope, thanks for the catch. Fixed

charge_efficiency: float | None = field(default=None, validator=range_val_or_none(0, 1))
discharge_efficiency: float | None = field(default=None, validator=range_val_or_none(0, 1))
round_trip_efficiency: float | None = field(default=None, validator=range_val_or_none(0, 1))
demand_profile_supervisor: int | float | list | None = field()
Collaborator

default = None

Collaborator Author

@jaredthomas68 jaredthomas68 Apr 8, 2026

I intentionally left off the default because I want the user to be very aware of how they are using this controller and if they are excluding a supervisory demand profile or not.


self.max_discharge_rate = self.max_charge_rate

# make sure peak_range is in correct format because yaml
Collaborator

Same problem for advance_discharge_period, right?

Collaborator Author

No, advance_discharge_period uses a (unit, val) paradigm instead of timestamps.

)

# Store simulation parameters for later use
self.dt = self.options["plant_config"]["plant"]["simulation"]["dt"]
Collaborator

Never used.

Collaborator Author

@jaredthomas68 jaredthomas68 Apr 8, 2026

Thanks. Removed


# Store simulation parameters for later use
self.dt = self.options["plant_config"]["plant"]["simulation"]["dt"]
self.time_index = build_time_series_from_plant_config(self.options["plant_config"])
Collaborator

I think it is worth adding a length check against self.n_timesteps somewhere.

Collaborator Author

I don't think so. The time series builds on the same config that self.n_timesteps comes from, so they should be the same by default.

Collaborator Author

I went ahead and added a check just in case. Won't hurt.

day_df = supervisory_peaks_df[
supervisory_peaks_df["date_time"].dt.floor("D") == day
]
# If supervisor has peaks on the day, use supervisor's flags for all rows that day
Collaborator

Good to add check for when supervisor is None.

Collaborator Author

Supervisor should never be None in this function because the first argument is always treated as the most important peaks. I changed the naming and docstring to make this more clear.

Collaborator Author

I also added the check you suggested

next_peak_time - self.peaks_df.loc[idx, "date_time"]
)

def get_allowed_discharge(self):
Collaborator

Method name is misleading. It actually computes "allow_charge"?

Collaborator Author

Great point. I've changed the name.

soc_array[i] = deepcopy(soc)

# stay in discharge mode until the battery is fully discharged
if soc <= soc_min:
Collaborator

@vijay092 vijay092 Apr 2, 2026

Note for future:
discharging is only set to False when soc <= soc_min. If the battery doesn't fully drain during the event duration, discharging will continue to stay True

Collaborator Author

That is by design. The intended operation is to fully charge and then fully discharge, not try to meet a demand, so the battery should fully discharge. If you have suggestions for catching corner cases on this I'm all ears.

Collaborator

I think this should be documented in the docpage on the peak load management, I would have expected the battery to stop discharging once the event is over.

Collaborator Author

The event period is pretty loose. We find the exact peak, but discharge leading up to and after that peak. I will include more in a docs page.

# start discharging when we approach a peak and have some charge
if time_to_peak <= advance_discharge_period and soc > soc_min:
discharging = True

Collaborator

Suggest adding charging = False here in case charging hasn't been set to False in the previous timestep.

Collaborator Author

Done

# Note: discharge_needed is internal (storage view), max_discharge_rate is external
discharge_needed = max_discharge_rate / discharge_eff
discharge = min(
discharge_needed, available_discharge, max_discharge_rate / discharge_eff
Collaborator

The first and third terms are the same.

Collaborator Author

Yes, I was trying to lean into code reuse and hoping I could find a good way, but I went ahead and removed the duplicate.
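For reference, the de-duplicated computation can be sketched like this (variable values are made up for illustration):

```python
# Internal (storage-side) discharge is the external rate divided by the
# discharge efficiency, capped by what the storage can actually deliver.
discharge_eff = 0.95          # assumed discharge efficiency
max_discharge_rate = 2500.0   # external rate limit, kW
available_discharge = 1800.0  # deliverable this timestep (example value)

discharge_needed = max_discharge_rate / discharge_eff
discharge = min(discharge_needed, available_discharge)
```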

Collaborator

@genevievestarke genevievestarke left a comment

Really great PR, @jaredthomas68!
I agree with other comments that a docs page would be nice for users, but I know that's usually one of the last things in a PR!

The main blocking comment I have is the setting of the charging and discharging variables. Let me know if you want to discuss!

commodity (str): Name of the commodity being controlled
(e.g., "hydrogen"). Stripped of whitespace.
commodity_rate_units (str): Units of the commodity (e.g., "kg/h").
demand_profile (int | float | list): Demand values for each timestep, in
Collaborator

I believe these first three are included in the base class, so they aren't usually included in the inherited class.

Collaborator Author

Removed

discharging the storage, represented as a decimal between 0 and 1 (e.g., 0.81 for
81% efficiency). Optional if `charge_efficiency` and `discharge_efficiency` are
provided.
commodity_amount_units (str | None, optional): Units of the commodity as an amount
Collaborator

Same comment here (already in base class)

Collaborator Author

Removed

provided.
commodity_amount_units (str | None, optional): Units of the commodity as an amount
(i.e., kW*h or kg). If not provided, defaults to commodity_rate_units*h.
demand_profile_supervisor (int | float | list | None, optional): Demand values for
Collaborator

I don't think the name of the variable here matches the doc string. Is it specifically for a supervising entity?

Collaborator Author

Names have been changed and unified


"""

max_capacity: float = field()
Collaborator

Many of these parameters are the same as those in DemandOpenLoopStorageController. Could this class inherit from that class?

Collaborator Author

Possibly, but only in terms of parameters. Thoughts on this @johnjasa ?

Collaborator Author

I added these repeated parameters to the base class as optional, gated by a class flag that child classes must set to True for the additional parameters to be required.


# Detect daily peaks in secondary demand profile (always computed)
# Respects the configured peak_range time window for each day
self.secondary_peaks_df = self.get_peaks(
Collaborator

I agree! Secondary peaks being the only peaks object that is always computed is a bit confusing. I think this point is true throughout this section, where secondary_demand_profile is actually the main profile.

demand_df.loc[daily_peak_idx, "is_peak"] = True

# Optional: Limit number of peaks globally or per period
if n_max_events is not None:
Collaborator

can n_max_events be zero, or should it be None if it's zero?

Collaborator Author

n_max_events set to zero would mean the controller does nothing and the battery will never be used in the case when only one load is provided. When two loads are provided, the supervisory load would have no impact if n_max_events is zero.

def merge_peaks(supervisory_peaks_df, secondary_peaks_df):
"""Merge supervisor and secondary peak schedules with supervisor precedence.

Combines two peak schedules (primary and fallback) using day-level precedence:
Collaborator

I think the wording could be "original and override" maybe? Just trying to head off confusion about secondary being default behavior.

Collaborator Author

I changed the names in this method to peaks_1 and peaks_2 since they don't always correspond to the outer scope.
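The day-level precedence described in the docstring can be sketched roughly like this (a plain-dict stand-in for illustration, not the PR's actual implementation; names hypothetical):

```python
from datetime import date

def merge_peaks(peaks_1, peaks_2):
    """For each day, keep the peaks from peaks_1 if it has any that day;
    otherwise fall back to the peaks from peaks_2 (illustrative sketch)."""
    days = set(peaks_1) | set(peaks_2)
    return {d: peaks_1[d] if peaks_1.get(d) else peaks_2.get(d, []) for d in days}

primary = {date(2026, 4, 1): ["17:00"]}
fallback = {date(2026, 4, 1): ["12:00"], date(2026, 4, 2): ["18:00"]}
merged = merge_peaks(primary, fallback)
```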

}

@staticmethod
def _normalize_peak_range(peak_range):
Collaborator

How does this normalize the peak range?

Collaborator Author

It does not. The "normalize" was more about getting the peak range into a standard format for computations. Bad naming. I changed it to _parse_peak_range

if np.all(inputs[f"{commodity}_demand"] == 0.0):
msg = "Demand profile is zero, check that demand profile is input"
raise UserWarning(msg)
if inputs["max_charge_rate"][0] < 0:
Collaborator

Is this repeated code?

Collaborator Author

Yes, it is the same as in DemandOpenLoopStorageController. I originally wanted to inherit, but I'm not sure there is sufficient code reuse for that. I'll give it another go.

Collaborator Author

I also wanted to limit inheritance depth

Collaborator Author

I have moved these checks to StorageOpenLoopControlBase

if not discharging and soc < soc_max:
if self.peaks_df["allow_charge"].iloc[i]:
if (td - last_discharge) > delay_charge_period:
charging = True
Collaborator

I think discharging = False could also be added here

Collaborator

Can discharging and charging be False at the same time? How is that set?

Collaborator Author

Good question. The case of neither charging nor discharging is handled by the rest of the logic. For example, if the battery is fully charged and we are not at a peak, then the battery does nothing.

Collaborator Author

The flags indicate more what action could be taken, in order to make sure the battery fully discharges before charging and fully charges before discharging.
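A rough sketch of the flag hysteresis being described (condensed and hypothetical, not the controller's actual code):

```python
def update_flags(charging, discharging, soc, soc_min, soc_max,
                 time_to_peak, advance_discharge_period):
    # Commit to discharging as a peak approaches, if there is charge left
    if time_to_peak <= advance_discharge_period and soc > soc_min:
        discharging, charging = True, False
    # Only leave discharge mode once the battery is fully drained
    if soc <= soc_min:
        discharging = False
    # Refill whenever not discharging and not yet full
    if not discharging and soc < soc_max:
        charging = True
    if soc >= soc_max:
        charging = False  # fully charged: idle until the next peak window
    return charging, discharging
```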
