Note

You can download this example as a Jupyter notebook or try it out directly in Google Colab.

11. Redispatch modelling using PyPSA#

This tutorial demonstrates how to model and simulate a redispatch mechanism using PyPSA as a plug-and-play module within the ASSUME framework. The model is built mainly around grid constraints: it identifies grid bottlenecks caused by the dispatch resulting from the energy-only market (EOM) and resolves them using the redispatch algorithm.

Concept of Redispatch#

The locational mismatch between electricity demand and generation requires transmitting electricity from regions with surplus generation to regions with high demand. The transmission capacity limits the amount of electricity that can be transferred at any point in time. If there is not enough capacity to transmit the required amount, generation must be ramped down at locations with low demand and ramped up at locations with higher demand. This is what is typically called redispatch. In addition to the spot markets, the redispatch mechanism regulates grid flows to avoid congestion; it is operated and controlled by the system operators (SOs).
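To make this concrete, the following minimal sketch (with purely hypothetical bus names, capacities and costs, and requiring any locally installed LP solver) shows a 2-node situation in which the limited line capacity forces part of the demand to be covered by the more expensive local generator:

[ ]:
# Hypothetical 2-node illustration: the line can only carry 100 MW, so the cheap
# northern generator cannot serve the full 150 MW southern demand on its own and
# the expensive southern generator has to be ramped up locally.
import pandas as pd
import pypsa

n = pypsa.Network()
n.set_snapshots(pd.date_range("2019-01-01", periods=1, freq="h"))

n.add("Bus", "north")
n.add("Bus", "south")
n.add("Line", "north-south", bus0="north", bus1="south", s_nom=100, x=0.1)

n.add("Generator", "cheap_north", bus="north", p_nom=300, marginal_cost=20)
n.add("Generator", "expensive_south", bus="south", p_nom=300, marginal_cost=60)
n.add("Load", "demand_south", bus="south", p_set=150)

n.optimize(solver_name="glpk")  # replace with any LP solver available on your machine
print(n.generators_t.p)  # roughly 100 MW imported from the north, 50 MW produced in the south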

Objective#

The aim of redispatch is to minimise the overall cost of the redispatch measures (starting up, shutting down, ramping up and ramping down).
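Written schematically (the notation here is only illustrative, not the exact formulation used internally), the objective is

$$\min \sum_{t}\sum_{i}\Big(c^{\mathrm{up}}_{i}\,p^{\mathrm{up}}_{i,t} + c^{\mathrm{down}}_{i}\,p^{\mathrm{down}}_{i,t} + c^{\mathrm{start}}_{i}\,u^{\mathrm{start}}_{i,t} + c^{\mathrm{stop}}_{i}\,u^{\mathrm{stop}}_{i,t}\Big),$$

where $p^{\mathrm{up}}_{i,t}$ and $p^{\mathrm{down}}_{i,t}$ are the upward and downward redispatch volumes of unit $i$ at time $t$, $u^{\mathrm{start}}_{i,t}$ and $u^{\mathrm{stop}}_{i,t}$ indicate start-up and shut-down events, and the $c$ terms are the corresponding cost coefficients.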

Structure in Redispatch model#

  • The redispatch model has the following structure (a minimal sketch of the up/down representation follows this list):

    1. Ramping up of reserve (backup) power plants

    2. Ramping up of market power plants

    3. Ramping down of market power plants

    4. Ramping up/down of other flexibilities
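A common way to expose these ramping options to the optimiser, and the convention the clearing function later in this tutorial relies on via the "_up" and "_down" suffixes, is to represent each market unit by two additional redispatch generators. A minimal sketch with hypothetical names and values:

[ ]:
# Hypothetical sketch: the market unit "pp_north" is represented by an upward
# redispatch generator (can increase output above the market dispatch) and a
# downward redispatch generator with negative sign (can reduce the dispatch).
# The clearing function below updates their p_max_pu and marginal_cost via the
# "_up" / "_down" suffixes.
import pypsa

n = pypsa.Network()
n.add("Bus", "north")

n.add("Generator", "pp_north_up", bus="north", p_nom=100, marginal_cost=40)
n.add("Generator", "pp_north_down", bus="north", p_nom=100, sign=-1, marginal_cost=-40)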

Objective of This Tutorial#

In this tutorial, we will:

  1. Set up a 2-node redispatch example.

  2. Connect hypothetical generators, loads and transmission lines to illustrate the flow of energy.

  3. Add demand_side_units to analyse their impact on the overall redispatch.

  4. Simulate and visualize the results.

Setting up grid network with infrastructure#

The grid infrastructure includes mainly three components:

  • Generators: power plants that produce electricity and feed it in at the bus (node) they are connected to.

  • Loads: demand units that consume electricity at their respective buses.

  • Transmission grid: buses and lines that connect the regions and carry the power flows between them.

Here the components are defined with their operational constraints (such as maximum power, efficiency, ramp rates, etc.); an illustrative example of such input data follows.
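For orientation, the input data for a small 2-node example could look roughly like the following. The values are made up; the unit tables use the node and max_power columns expected by the helper functions below, while buses and lines use standard PyPSA attribute names.

[ ]:
import pandas as pd

# Illustrative 2-node input data (all names and values are hypothetical).
buses = pd.DataFrame(
    {"v_nom": [380.0, 380.0]}, index=pd.Index(["north", "south"], name="name")
)
lines = pd.DataFrame(
    {"bus0": ["north"], "bus1": ["south"], "s_nom": [100.0], "x": [0.01]},
    index=pd.Index(["Line_N_S"], name="name"),
)
powerplant_units = pd.DataFrame(
    {"node": ["north", "south"], "max_power": [300.0, 300.0]},
    index=pd.Index(["pp_north", "pp_south"], name="name"),
)
demand_units = pd.DataFrame(
    {"node": ["south"], "max_power": [250.0]},
    index=pd.Index(["demand_south"], name="name"),
)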

1. Load the CSV files from the given path and return them as dataframes#

[ ]:
import os
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import plotly.graph_objects as go
import pyomo as pyo
import seaborn as sns
import yaml
import pypsa
from assume import World
from pathlib import Path

# Simplified function to read the required CSV files
def read_grid(network_path: str | Path) -> dict[str, pd.DataFrame]:
    network_path = Path(network_path)
    buses = pd.read_csv(network_path / "buses.csv", index_col=0)
    lines = pd.read_csv(network_path / "lines.csv", index_col=0)
    generators = pd.read_csv(network_path / "powerplant_units.csv", index_col=0)
    loads = pd.read_csv(network_path / "demand_units.csv", index_col=0)

    return {
        "buses": buses,
        "lines": lines,
        "generators": generators,
        "loads": loads,
    }
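Assuming the CSV files are stored in an input folder such as inputs/example_redispatch (a hypothetical path; adjust it to your setup), the grid data can then be loaded like this:

[ ]:
# Hypothetical usage of read_grid; the path must point to a folder containing
# buses.csv, lines.csv, powerplant_units.csv and demand_units.csv.
grid_dict = read_grid("inputs/example_redispatch")
grid_dict["buses"].head()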

2. Simplified function to add generators to the grid network#

[ ]:
# Simplified function to add generators to the grid network
def add_generators(
    network: pypsa.Network,
    generators: pd.DataFrame,
) -> None:
    """
    Add generators normally to the grid

    Args:
        network (pypsa.Network): the pypsa network to which the generators are added
        generators (pandas.DataFrame): the generators dataframe
    """
    p_set = pd.DataFrame(
        np.zeros((len(network.snapshots), len(generators.index))),
        index=network.snapshots,
        columns=generators.index,
    )
    # add generators
    network.madd(
        "Generator",
        names=generators.index,
        bus=generators["node"],  # bus to which the generator is connected
        p_nom=generators["max_power"],  # Nominal capacity of the powerplant/generator
        p_min_pu=p_set,
        p_max_pu=p_set + 1,
        marginal_cost=p_set,
        **generators,
    )

3. Simplified function to add loads to the grid network#

[ ]:
# Simplified function to add loads to the grid network
def add_loads(
    network: pypsa.Network,
    loads: pd.DataFrame,
) -> None:
    """
    Add loads normally to the grid

    Args:
        network (pypsa.Network): the pypsa network to which the loads are added
        loads (pandas.DataFrame): the loads dataframe
    """

    # add loads
    network.madd(
        "Load",
        names=loads.index,
        bus=loads["node"],  # bus to which the load is connected
        **loads,
    )

    if "p_set" not in loads.columns:
        network.loads_t["p_set"] = pd.DataFrame(
            np.zeros((len(network.snapshots), len(loads.index))),
            index=network.snapshots,
            columns=loads.index,
        )

4. Simplified function to add loads to the redispatch network#

[ ]:
# Simplified function to add loads to the redispatch network
def add_redispatch_loads(
    network: pypsa.Network,
    loads: pd.DataFrame,
) -> None:
    """
    Add loads to the redispatch PyPSA network at the buses to which they are connected.
    """
    loads_c = loads.copy()
    if "sign" in loads_c.columns:
        del loads_c["sign"]

    # add loads with opposite sign (default for loads is -1). This is needed to properly model the redispatch
    network.madd(
        "Load",
        names=loads.index,
        bus=loads["node"],  # bus to which the load is connected
        sign=1,
        **loads_c,
    )

    if "p_set" not in loads.columns:
        network.loads_t["p_set"] = pd.DataFrame(
            np.zeros((len(network.snapshots), len(loads.index))),
            index=network.snapshots,
            columns=loads.index,
        )
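The sign=1 argument is the key difference to an ordinary PyPSA load: with the default sign of -1, a positive p_set is withdrawn from the bus, whereas with sign=1 it is injected. This allows the dispatch volumes coming from the spot market (positive for generation, negative for consumption) to be fed into the redispatch network with their original sign. A minimal sketch with hypothetical names:

[ ]:
# Hypothetical sketch: two loads with the same p_set but opposite sign.
# "ordinary_load" (default sign -1) withdraws 50 MW from the bus, while
# "dispatch_injection" (sign +1) injects 50 MW, mirroring how market dispatch
# results are passed into the redispatch network.
n = pypsa.Network()
n.set_snapshots(pd.date_range("2019-01-01", periods=1, freq="h"))
n.add("Bus", "north")
n.add("Load", "ordinary_load", bus="north", p_set=50.0)
n.add("Load", "dispatch_injection", bus="north", p_set=50.0, sign=1)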

5. Simplified function to add Buses and Lines to the redispatch network#

[ ]:
# Simplified function to add grid buses and lines to the redispatch network
def read_pypsa_grid(
    network: pypsa.Network,
    grid_dict: dict[str, pd.DataFrame],
):
    """
    Generates the pypsa grid from a grid dictionary.
    Does not add the generators, as they are added in different ways, depending on whether redispatch is used.

    Args:
        network (pypsa.Network): the pypsa network to which the components will be added
        grid_dict (dict[str, pd.DataFrame]): the dictionary containing dataframes for generators, loads, buses and lines
    """

    def add_buses(network: pypsa.Network, buses: pd.DataFrame) -> None:
        network.import_components_from_dataframe(buses, "Bus")

    def add_lines(network: pypsa.Network, lines: pd.DataFrame) -> None:
        network.import_components_from_dataframe(lines, "Line")

    # setup the network
    add_buses(network, grid_dict["buses"])
    add_lines(network, grid_dict["lines"])
    return network
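Putting the helper functions together, the example grid can be assembled roughly as follows (the snapshot range is hypothetical, and grid_dict is the dictionary read above). Note that the clearing function below additionally expects redispatch generators with _up/_down suffixes, as sketched earlier; these are created elsewhere in ASSUME and are not repeated here.

[ ]:
# Hypothetical assembly of the network from the CSV inputs read above.
network = pypsa.Network()
network.set_snapshots(pd.date_range("2019-01-01", periods=24, freq="h"))

# add buses and lines first, then attach the units to them
read_pypsa_grid(network, grid_dict)
add_generators(network, grid_dict["generators"])
add_redispatch_loads(network, grid_dict["loads"])

network.buses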

6. Congestion/Redispatch clearing function#

Performs redispatch to resolve congestion in the electricity market.

  • It first checks for congestion in the network and, if it finds any, performs redispatch to resolve it.

  • The returned orderbook contains the accepted orders with the redispatched volumes and prices.

  • The prices are positive for upward redispatch and negative for downward redispatch.
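The orderbook passed into clear is a list of order dictionaries. Based on the fields accessed in the clearing function below, a single (hypothetical) order could look roughly like this; all values are made up:

[ ]:
# Hypothetical example of a single order; the field names mirror the columns
# pivoted in the clearing function below.
example_order = {
    "start_time": pd.Timestamp("2019-01-01 00:00"),
    "unit_id": "pp_north",
    "volume": 100.0,  # dispatched volume from the EOM (negative for demand units)
    "min_power": 20.0,
    "max_power": 300.0,
    "price": 40.0,  # used as upward redispatch cost and, negated, as downward cost
}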

[ ]:
import logging

from assume.common.market_objects import MarketConfig, Orderbook

# logger used by the clearing function below
log = logging.getLogger(__name__)

# Note: `calculate_network_meta` and the `process_dispatch_data` method used below are
# provided by ASSUME's redispatch market clearing implementation and are not repeated here.

def clear(
    self, orderbook: Orderbook, market_products
) -> tuple[Orderbook, Orderbook, list[dict]]:

    orderbook_df = pd.DataFrame(orderbook)
    orderbook_df["accepted_volume"] = 0.0
    orderbook_df["accepted_price"] = 0.0

    # Now you can pivot the DataFrame
    volume_pivot = orderbook_df.pivot(
        index="start_time", columns="unit_id", values="volume"
    )
    max_power_pivot = orderbook_df.pivot(
        index="start_time", columns="unit_id", values="max_power"
    )
    min_power_pivot = orderbook_df.pivot(
        index="start_time", columns="unit_id", values="min_power"
    )
    price_pivot = orderbook_df.pivot(
        index="start_time", columns="unit_id", values="price"
    )

    # Calculate p_set, p_max_pu_up, and p_max_pu_down directly using DataFrame operations
    p_set = volume_pivot

    # Calculate p_max_pu_up as difference between max_power and accepted volume
    p_max_pu_up = (max_power_pivot - volume_pivot).div(
        max_power_pivot.where(max_power_pivot != 0, np.inf)
    )

    # Calculate p_max_pu_down as difference between accepted volume and min_power
    p_max_pu_down = (volume_pivot - min_power_pivot).div(
        max_power_pivot.where(max_power_pivot != 0, np.inf)
    )
    p_max_pu_down = p_max_pu_down.clip(lower=0)  # Ensure no negative values

    # Determine the costs directly from the price pivot
    costs = price_pivot

    # Drop units with only negative volumes (if necessary)
    negative_only_units = volume_pivot.lt(0).all()
    p_max_pu_up = p_max_pu_up.drop(
        columns=negative_only_units.index[negative_only_units]
    )
    p_max_pu_down = p_max_pu_down.drop(
        columns=negative_only_units.index[negative_only_units]
    )
    costs = costs.drop(columns=negative_only_units.index[negative_only_units])

    # reset indexes for all dataframes
    p_set.reset_index(inplace=True, drop=True)
    p_max_pu_up.reset_index(inplace=True, drop=True)
    p_max_pu_down.reset_index(inplace=True, drop=True)
    costs.reset_index(inplace=True, drop=True)

    # Update the network parameters
    redispatch_network = self.network.copy()
    redispatch_network.loads_t.p_set = p_set

    # Update p_max_pu for generators with _up and _down suffixes
    redispatch_network.generators_t.p_max_pu.update(p_max_pu_up.add_suffix("_up"))
    redispatch_network.generators_t.p_max_pu.update(
        p_max_pu_down.add_suffix("_down")
    )

    # Add _up and _down suffix to costs and update the network
    redispatch_network.generators_t.marginal_cost.update(costs.add_suffix("_up"))
    redispatch_network.generators_t.marginal_cost.update(
        costs.add_suffix("_down") * (-1)
    )

    # run linear powerflow
    redispatch_network.lpf()

    # check lines for congestion, i.e. where the power flow is larger than s_nom
    line_loading = (
        redispatch_network.lines_t.p0.abs() / redispatch_network.lines.s_nom
    )

    # if any line is congested, perform redispatch
    if line_loading.max().max() > 1:
        log.debug("Congestion detected")

        status, termination_condition = redispatch_network.optimize(
            solver_name=self.solver,
            env=self.env,
        )

        if status != "ok":
            log.error(f"Solver exited with {termination_condition}")
            raise Exception("Solver in redispatch market did not converge")

        # process dispatch data
        self.process_dispatch_data(
            network=redispatch_network, orderbook_df=orderbook_df
        )

    # if no congestion is detected set accepted volume and price to 0
    else:
        log.debug("No congestion detected")

    # return orderbook_df back to orderbook format as list of dicts
    accepted_orders = orderbook_df.to_dict("records")
    rejected_orders = []
    meta = []

    # calculate metadata such as total upward and downward redispatch, total backup dispatch
    # and total redispatch cost
    for i, product in enumerate(market_products):
        meta.extend(
            calculate_network_meta(network=redispatch_network, product=product, i=i)
        )

    return accepted_orders, rejected_orders, meta
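To inspect the outcome, the line loading of the solved network can be visualised, for example with matplotlib. This is a minimal sketch assuming a solved network object named redispatch_network is available:

[ ]:
# Hypothetical visualisation: relative line loading over time for a solved
# redispatch network; values above 1.0 would indicate remaining congestion.
line_loading = redispatch_network.lines_t.p0.abs() / redispatch_network.lines.s_nom

fig, ax = plt.subplots(figsize=(8, 4))
line_loading.plot(ax=ax)
ax.axhline(1.0, color="red", linestyle="--", label="thermal limit")
ax.set_xlabel("snapshot")
ax.set_ylabel("relative line loading")
ax.legend()
plt.show()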