Examples

A number of examples of how to use Nempy are provided below. Examples 1 to 5 are simple and aim to introduce various market features that can be modelled with Nempy in an easy to understand way; the dispatch and pricing outcomes are explained in inline comments where the results are printed. Examples 6 and 7 show how to use the historical data input preparation tools provided with Nempy to recreate historical dispatch intervals. Historical dispatch and pricing outcomes can be difficult to interpret because they are usually the result of complex interactions between the many features of the dispatch process, so for these examples the results are plotted in comparison to historical price outcomes. Example 8 demonstrates how the outputs of one dispatch interval can be used as the initial conditions of the next dispatch interval to create a time sequential model; the current limitations of this approach are also briefly discussed.

1. Bid stack equivalent market

This example implements a one region bid stack model of an electricity market. Under the bid stack model, generators are dispatched according to their bid prices, from cheapest to most expensive, until all demand is satisfied. No loss factors, ramping constraints or other factors are considered.

import pandas as pd
from nempy import markets

# Volume of each bid; the number of bands must equal the number of bands in price_bids.
volume_bids = pd.DataFrame({
    'unit': ['A', 'B'],
    '1': [20.0, 50.0],  # MW
    '2': [20.0, 30.0],  # MW
    '3': [5.0, 10.0]  # More bid bands could be added.
})

# Price of each bid, bids must be monotonically increasing.
price_bids = pd.DataFrame({
    'unit': ['A', 'B'],
    '1': [50.0, 50.0],  # $/MW
    '2': [60.0, 55.0],  # $/MW
    '3': [100.0, 80.0]  # . . .
})

# Other unit properties.
unit_info = pd.DataFrame({
    'unit': ['A', 'B'],
    'region': ['NSW', 'NSW'],
})

# The demand in the region(s) being dispatched.
demand = pd.DataFrame({
    'region': ['NSW'],
    'demand': [115.0]  # MW
})

# Create the market model.
market = markets.SpotMarket(unit_info=unit_info, market_regions=['NSW'])
market.set_unit_volume_bids(volume_bids)
market.set_unit_price_bids(price_bids)
market.set_demand_constraints(demand)

# Calculate dispatch and pricing.
market.dispatch()

# Return the total dispatch of each unit in MW.
print(market.get_unit_dispatch())
#   unit service  dispatch
# 0    A  energy      40.0
# 1    B  energy      75.0

# Understanding the dispatch results: Unit A's first bid is 20 MW at 50 $/MW,
# and unit B's first bid is 50 MW at 50 $/MW. As demand for electricity is
# 115 MW, both these bids are needed to meet demand and so both will be fully
# dispatched. The next cheapest bid is 30 MW at 55 $/MW from unit B; combining
# this with the first two bids we get 100 MW of generation, so all of this bid
# will be dispatched. The next cheapest bid is 20 MW at 60 $/MW from unit A; by
# dispatching 15 MW of this bid we get a total of 115 MW of generation, and supply
# meets demand so no more bids need to be dispatched. Adding up the dispatched
# bids from each generator we can see that unit A will be dispatched for 40 MW
# and unit B will be dispatched for 75 MW, as given by our bid stack market model.

# Return the price of energy in each region.
print(market.get_energy_prices())
#   region  price
# 0    NSW   60.0

# Understanding the pricing result: In this case the marginal bid, the bid
# that would be dispatched if demand increased, is the 60 $/MW bid from unit
# A, thus this bid sets the price.

# Additional detail: The above is a simplified interpretation
# of the pricing result; note that the price is actually taken from the
# underlying linear problem's shadow price of the supply equals demand constraint.
# The way the problem is formulated, if demand sits exactly on the boundary
# between two bids, for example at 120.0 MW, then the price is set by the lower
# rather than the higher bid. In practical use cases, where demand is a floating
# point number, this situation is unlikely to occur.
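The price-setting logic described above can be checked with a few lines of plain Python, independent of Nempy. This is a simplified merit-order sketch, not the linear program Nempy actually solves: stack the bids cheapest first and find the bid at which cumulative volume first covers demand.

```python
# Find the marginal (price-setting) bid by stacking bids cheapest first.
# Bids are (price $/MW, volume MW), already sorted by price.
bids = [(50.0, 20.0), (50.0, 50.0), (55.0, 30.0),
        (60.0, 20.0), (80.0, 10.0), (100.0, 5.0)]
demand = 115.0

cumulative = 0.0
for price, volume in bids:
    cumulative += volume
    if cumulative >= demand:  # this bid crosses the demand level
        break

print(price)  # 60.0 - matches the Nempy energy price above
```

Note that with demand exactly at 120.0 MW, the `>=` comparison also returns the lower bid's price of 60 $/MW, matching the shadow-price behaviour described in the additional detail above.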

2. Unit loss factors, capacities and ramp rates

A simple example with two units in a one region market. Units are given loss factors, capacity values and ramp rates, and the effects of loss factors on dispatch and market prices are explained.

import pandas as pd
from nempy import markets

# Volume of each bid; the number of bands must equal the number of bands in price_bids.
volume_bids = pd.DataFrame({
    'unit': ['A', 'B'],
    '1': [20.0, 50.0],  # MW
    '2': [25.0, 30.0],  # MW
    '3': [5.0, 10.0]  # More bid bands could be added.
})

# Price of each bid, bids must be monotonically increasing.
price_bids = pd.DataFrame({
    'unit': ['A', 'B'],
    '1': [40.0, 50.0],  # $/MW
    '2': [60.0, 55.0],  # $/MW
    '3': [100.0, 80.0]  # . . .
})

# Factors limiting unit output.
unit_limits = pd.DataFrame({
    'unit': ['A', 'B'],
    'initial_output': [0.0, 0.0],  # MW
    'capacity': [55.0, 90.0],  # MW
    'ramp_up_rate': [600.0, 720.0],  # MW/h
    'ramp_down_rate': [600.0, 720.0]  # MW/h
})

# Other unit properties including loss factors.
unit_info = pd.DataFrame({
    'unit': ['A', 'B'],
    'region': ['NSW', 'NSW'],
    'loss_factor': [0.9, 0.95]
})

# The demand in the region(s) being dispatched.
demand = pd.DataFrame({
    'region': ['NSW'],
    'demand': [100.0]  # MW
})

# Create the market model.
market = markets.SpotMarket(unit_info=unit_info,
                            market_regions=['NSW'])
market.set_unit_volume_bids(volume_bids)
market.set_unit_price_bids(price_bids)
market.set_unit_bid_capacity_constraints(
    unit_limits.loc[:, ['unit', 'capacity']])
market.set_unit_ramp_up_constraints(
    unit_limits.loc[:, ['unit', 'initial_output', 'ramp_up_rate']])
market.set_unit_ramp_down_constraints(
    unit_limits.loc[:, ['unit', 'initial_output', 'ramp_down_rate']])
market.set_demand_constraints(demand)

# Calculate dispatch and pricing.
market.dispatch()

# Return the total dispatch of each unit in MW.
print(market.get_unit_dispatch())
#   unit service  dispatch
# 0    A  energy      40.0
# 1    B  energy      60.0

# Understanding the dispatch results: In this example unit loss factors are
# provided, which means the cost of a bid in the dispatch optimisation is
# the bid price divided by the unit loss factor. However, loss factors do
# not affect the amount of generation a unit can supply, because the
# regional demand already factors in intra-regional losses. The cheapest bid is
# from unit A with 20 MW at 44.44 $/MW (after loss factor); this will be
# fully dispatched. The next cheapest bid is from unit B with 50 MW at
# 52.63 $/MW, again fully dispatched. The next cheapest is unit B with 30 MW at
# 57.89 $/MW; however, unit B starts the interval at a dispatch level of 0.0 MW
# and can ramp at 720 MW/h. The default dispatch interval of Nempy
# is 5 min, so unit B can at most produce 60 MW by the end of the
# dispatch interval, meaning only 10 MW of the second bid from unit B can be
# dispatched. Finally, the last bid that needs to be dispatched for supply to
# equal demand is from unit A with 25 MW at 66.67 $/MW, of which only 20 MW
# is needed. Adding together the bids from each unit we can see that
# unit A is dispatched for a total of 40 MW and unit B for a total of 60 MW.

# Return the price of energy in each region.
print(market.get_energy_prices())
#   region  price
# 0    NSW  66.67

# Understanding the pricing result: In this case the marginal bid, the bid
# that would be dispatched if demand increased, is the second bid from unit A;
# after adjusting for the loss factor this bid has a price of 66.67 $/MW,
# and this bid sets the price.
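The arithmetic behind the explanation above, the loss-factor-adjusted bid prices and the 5 minute ramp limit, can be checked directly:

```python
# Loss-factor-adjusted bid prices: bid price divided by the unit loss factor.
loss_factor = {'A': 0.9, 'B': 0.95}
price_bids = {'A': [40.0, 60.0, 100.0], 'B': [50.0, 55.0, 80.0]}

adjusted = {unit: [round(p / loss_factor[unit], 2) for p in prices]
            for unit, prices in price_bids.items()}
print(adjusted)
# {'A': [44.44, 66.67, 111.11], 'B': [52.63, 57.89, 84.21]}

# A unit starting at 0 MW with a 720 MW/h ramp rate can, over a 5 min
# dispatch interval, reach at most:
max_output = 0.0 + 720.0 * 5.0 / 60.0
print(max_output)  # 60.0 MW
```

This reproduces the 44.44, 52.63, 57.89 and 66.67 $/MW figures and the 60 MW ramp limit used in the comments above.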

3. Interconnector with losses

A simple example demonstrating how to implement a two region market with an interconnector. The interconnector is modelled simply, with a fixed percentage of losses. To make the interconnector flow and loss calculation easy to understand, a single unit is modelled in the NSW region, NSW demand is set to zero, and VIC region demand is set to 90 MW, so all the power to meet VIC demand must flow across the interconnector.

import pandas as pd
from nempy import markets

# The only generator is located in NSW.
unit_info = pd.DataFrame({
    'unit': ['A'],
    'region': ['NSW']
})

# Create a market instance.
market = markets.SpotMarket(unit_info=unit_info, market_regions=['NSW', 'VIC'])

# Volume of each bid.
volume_bids = pd.DataFrame({
    'unit': ['A'],
    '1': [100.0]  # MW
})

market.set_unit_volume_bids(volume_bids)

# Price of each bid.
price_bids = pd.DataFrame({
    'unit': ['A'],
    '1': [50.0]  # $/MW
})

market.set_unit_price_bids(price_bids)

# NSW has no demand but VIC has 90 MW.
demand = pd.DataFrame({
    'region': ['NSW', 'VIC'],
    'demand': [0.0, 90.0]  # MW
})

market.set_demand_constraints(demand)

# There is one interconnector between NSW and VIC. Its nominal direction is towards VIC.
interconnectors = pd.DataFrame({
    'interconnector': ['little_link'],
    'to_region': ['VIC'],
    'from_region': ['NSW'],
    'max': [100.0],
    'min': [-120.0]
})

market.set_interconnectors(interconnectors)


# The interconnector loss function. In this case losses are always 5% of line flow.
def constant_losses(flow):
    return abs(flow) * 0.05


# The loss function on a per interconnector basis. Also details how the losses
# should be proportioned to the connected regions.
loss_functions = pd.DataFrame({
    'interconnector': ['little_link'],
    'from_region_loss_share': [0.5],  # losses are shared equally.
    'loss_function': [constant_losses]
})

# The points to linearly interpolate the loss function between. In this example
# the loss function is linear so only three points are needed, but if a
# non-linear loss function was used then more points would be better.
interpolation_break_points = pd.DataFrame({
    'interconnector': ['little_link', 'little_link', 'little_link'],
    'loss_segment': [1, 2, 3],
    'break_point': [-120.0, 0.0, 100.0]
})

market.set_interconnector_losses(loss_functions, interpolation_break_points)

# Calculate dispatch.
market.dispatch()

# Return interconnector flow and losses.
print(market.get_interconnector_flows())
#   interconnector       flow    losses
# 0    little_link  92.307692  4.615385

# Understanding the interconnector flows: Losses are modelled as extra demand
# in the regions on either side of the interconnector; in this case the losses
# are split evenly between the regions. All demand in VIC must be supplied
# across the interconnector, and losses in VIC add to the interconnector
# flow required, so we can write the equation:
#
# flow = vic_demand + flow * loss_pct_vic
# flow - flow * loss_pct_vic = vic_demand
# flow * (1 - loss_pct_vic) = vic_demand
# flow = vic_demand / (1 - loss_pct_vic)
#
# Since interconnector losses are 5% and half occur in VIC,
# loss_pct_vic = 2.5%. Thus:
#
# flow = 90 / (1 - 0.025) = 92.31
#
# Knowing the interconnector flow we can work out total losses:
#
# losses = flow * loss_pct
# losses = 92.31 * 0.05 = 4.62

# Return the total dispatch of each unit in MW.
print(market.get_unit_dispatch())
#   unit service   dispatch
# 0    A  energy  94.615385

# Understanding the dispatch results: Unit A must be dispatched to
# meet demand in VIC plus all interconnector losses, therefore
# dispatch is 90.0 + 4.62 = 94.62.

# Return the price of energy in each region.
print(market.get_energy_prices())
#   region      price
# 0    NSW  50.000000
# 1    VIC  52.564103

# Understanding the pricing results: The marginal cost of supply in NSW is simply
# the cost of unit A's bid. However, the marginal cost of supply in VIC also
# includes the cost of paying for interconnector losses.
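The flow, loss and dispatch equations derived in the comments above can be evaluated as a few lines of plain Python:

```python
# Interconnector flow and loss arithmetic from the explanation above.
vic_demand = 90.0
loss_pct = 0.05
loss_pct_vic = loss_pct * 0.5  # half of the losses fall on the VIC side

flow = vic_demand / (1 - loss_pct_vic)  # flow needed to cover demand plus VIC-side losses
losses = flow * loss_pct                # total losses on the line
dispatch = vic_demand + losses          # unit A covers VIC demand plus all losses

print(round(flow, 2), round(losses, 2), round(dispatch, 2))
# 92.31 4.62 94.62
```

These match the flow, losses and unit dispatch reported by Nempy above (92.307692, 4.615385 and 94.615385 MW).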

4. Dynamic non-linear interconnector losses

This example demonstrates how to model regional demand dependent interconnector loss functions as described in sections 3 to 5 of the AEMO Marginal Loss Factors documentation. To make the interconnector flow and loss calculation easy to understand, a single unit is modelled in the NSW region, NSW demand is set to zero, and VIC region demand is set to 800 MW, so all the power to meet VIC demand must flow across the interconnector.

import pandas as pd
from nempy import markets
from nempy.historical_inputs import interconnectors as interconnector_inputs


# The only generator is located in NSW.
unit_info = pd.DataFrame({
    'unit': ['A'],
    'region': ['NSW']
})

# Create a market instance.
market = markets.SpotMarket(unit_info=unit_info,
                            market_regions=['NSW', 'VIC'])

# Volume of each bid.
volume_bids = pd.DataFrame({
    'unit': ['A'],
    '1': [1000.0]  # MW
})

market.set_unit_volume_bids(volume_bids)

# Price of each bid.
price_bids = pd.DataFrame({
    'unit': ['A'],
    '1': [50.0]  # $/MW
})

market.set_unit_price_bids(price_bids)

# NSW has no demand but VIC has 800 MW.
demand = pd.DataFrame({
    'region': ['NSW', 'VIC'],
    'demand': [0.0, 800.0],  # MW
    'loss_function_demand': [0.0, 800.0]  # MW
})

market.set_demand_constraints(demand.loc[:, ['region', 'demand']])

# There is one interconnector between NSW and VIC.
# Its nominal direction is towards VIC.
interconnectors = pd.DataFrame({
    'interconnector': ['VIC1-NSW1'],
    'to_region': ['VIC'],
    'from_region': ['NSW'],
    'max': [1000.0],
    'min': [-1200.0]
})

market.set_interconnectors(interconnectors)

# Create a demand dependent loss function.
# Specify the demand dependency.
demand_coefficients = pd.DataFrame({
    'interconnector': ['VIC1-NSW1', 'VIC1-NSW1'],
    'region': ['NSW1', 'VIC1'],
    'demand_coefficient': [0.000021734, -0.000031523]})

# Specify the loss function constant and flow coefficient.
interconnector_coefficients = pd.DataFrame({
    'interconnector': ['VIC1-NSW1'],
    'loss_constant': [1.0657],
    'flow_coefficient': [0.00017027],
    'from_region_loss_share': [0.5]})

# Create loss functions on a per interconnector basis.
loss_functions = interconnector_inputs.create_loss_functions(
    interconnector_coefficients, demand_coefficients,
    demand.loc[:, ['region', 'loss_function_demand']])

# The points to linearly interpolate the loss function between.
interpolation_break_points = pd.DataFrame({
    'interconnector': 'VIC1-NSW1',
    'loss_segment': [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12],
    'break_point': [-1200.0, -1000.0, -800.0, -600.0, -400.0, -200.0,
                    0.0, 200.0, 400.0, 600.0, 800.0, 1000.0]
})

market.set_interconnector_losses(loss_functions,
                                 interpolation_break_points)

# Calculate dispatch.
market.dispatch()

# Return interconnector flow and losses.
print(market.get_interconnector_flows())
#   interconnector        flow      losses
# 0      VIC1-NSW1  860.102737  120.205473

# Understanding the interconnector flows: In this case it is not simple to
# analytically derive and explain the interconnector flow result. The loss
# model is constructed within the underlying mixed integer linear problem
# as a set of constraints, and the interconnector flow and losses are
# determined as part of the problem solution. However, the loss model can
# be explained at a high level, and the results shown to be consistent. The
# first step in the interconnector model is to derive the loss function as a
# function of regional demand, which is a pre-market model creation step; the
# mathematics is explained in
# docs/pdfs/Marginal Loss Factors for the 2020-21 Financial year.pdf. The loss
# function is then evaluated at the given break points and linearly interpolated
# between those points in the market model. So for our model the losses are
# interpolated between the 800 MW and 1000 MW break points. We can show the
# losses are consistent with this approach:
#
# Losses at a flow of 800 MW
print(loss_functions['loss_function'].iloc[0](800))
# 107.0464
# Losses at a flow of 1000 MW
print(loss_functions['loss_function'].iloc[0](1000))
# 150.835
# Then interpolating by taking the weighted sum of the two losses based on the
# relative difference between the actual flow and the interpolation break points:
# Weighting of 800 MW break point = 1 - ((860.102737 - 800.0)/(1000 - 800)) ≈ 0.7
# Weighting of 1000 MW break point = 1 - ((1000 - 860.102737)/(1000 - 800)) ≈ 0.3
# Weighted sum of losses ≈ 107.0464 * 0.7 + 150.835 * 0.3 = 120.18
#
# We can also see that the flow and loss results are consistent with the supply
# equals demand constraint: all demand in the VIC region is supplied by the
# interconnector, so the interconnector flow minus the VIC region share of the
# interconnector losses should equal the VIC region demand. Note that the VIC
# region loss share is 50%:
# VIC region demand = interconnector flow - losses * VIC region loss share
# 800 = 860.102737 - 120.205473 * 0.5
# 800 = 800

# Return the total dispatch of each unit in MW.
print(market.get_unit_dispatch())
#   unit service    dispatch
# 0    A  energy  920.205473

# Understanding the dispatch results: Unit A is the only generator and it must
# be dispatched to meet demand plus losses:
# dispatch = VIC region demand + NSW region demand + losses
# dispatch = 920.205473

# Return the price of energy in each region.
print(market.get_energy_prices())
#   region      price
# 0    NSW  50.000000
# 1    VIC  62.292869

# Understanding the pricing results: Pricing in the NSW region is simply the
# marginal cost of supply from unit A. The marginal cost of supply in the
# VIC region is the cost of unit A meeting both marginal demand and the
# marginal losses on the interconnector.
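Carrying out the interpolation described in the comments above with unrounded weights reproduces the reported losses almost exactly (the explanation above rounds the weights to 0.7 and 0.3):

```python
# Linear interpolation of losses between the 800 MW and 1000 MW break points.
flow = 860.102737
losses_800 = 107.0464    # loss function evaluated at 800 MW
losses_1000 = 150.835    # loss function evaluated at 1000 MW

w_800 = 1 - (flow - 800.0) / (1000.0 - 800.0)
w_1000 = 1 - w_800
losses = losses_800 * w_800 + losses_1000 * w_1000
print(round(w_800, 2), round(losses, 2))  # 0.7 120.21

# Supply equals demand check: flow minus the VIC share (50%) of losses
# should equal the VIC region demand of 800 MW.
print(round(flow - 120.205473 * 0.5, 1))  # 800.0
```

With the exact weights the interpolated losses come out at about 120.21 MW, matching the 120.205473 MW reported by the model.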

5. Simple FCAS markets

This example implements a market for energy, regulation raise and contingency 6 second raise, with co-optimisation constraints as described in sections 6.2 and 6.3 of the FCAS Model in NEMDE documentation.

import pandas as pd
from nempy import markets


# Set pandas options so all DataFrame columns appear in print outs.
pd.options.display.width = 0

# Volume of each bid.
volume_bids = pd.DataFrame({
    'unit': ['A', 'A', 'B', 'B', 'B'],
    'service': ['energy', 'raise_6s', 'energy',
                'raise_6s', 'raise_reg'],
    '1': [100.0, 10.0, 110.0, 15.0, 15.0],  # MW
})

print(volume_bids)
#   unit    service      1
# 0    A     energy  100.0
# 1    A   raise_6s   10.0
# 2    B     energy  110.0
# 3    B   raise_6s   15.0
# 4    B  raise_reg   15.0

# Price of each bid.
price_bids = pd.DataFrame({
    'unit': ['A', 'A', 'B', 'B', 'B'],
    'service': ['energy', 'raise_6s', 'energy',
                'raise_6s', 'raise_reg'],
    '1': [50.0, 35.0, 60.0, 20.0, 30.0],  # $/MW
})

print(price_bids)
#   unit    service     1
# 0    A     energy  50.0
# 1    A   raise_6s  35.0
# 2    B     energy  60.0
# 3    B   raise_6s  20.0
# 4    B  raise_reg  30.0

# Participant defined operational constraints on FCAS enablement.
fcas_trapeziums = pd.DataFrame({
    'unit': ['B', 'B', 'A'],
    'service': ['raise_reg', 'raise_6s', 'raise_6s'],
    'max_availability': [15.0, 15.0, 10.0],
    'enablement_min': [50.0, 50.0, 70.0],
    'low_break_point': [65.0, 65.0, 80.0],
    'high_break_point': [95.0, 95.0, 100.0],
    'enablement_max': [110.0, 110.0, 110.0]
})

print(fcas_trapeziums)
#   unit    service  max_availability  enablement_min  low_break_point  high_break_point  enablement_max
# 0    B  raise_reg              15.0            50.0             65.0              95.0           110.0
# 1    B   raise_6s              15.0            50.0             65.0              95.0           110.0
# 2    A   raise_6s              10.0            70.0             80.0             100.0           110.0

# Unit locations.
unit_info = pd.DataFrame({
    'unit': ['A', 'B'],
    'region': ['NSW', 'NSW']
})

print(unit_info)
#   unit region
# 0    A    NSW
# 1    B    NSW

# The demand in the region(s) being dispatched.
demand = pd.DataFrame({
    'region': ['NSW'],
    'demand': [195.0]  # MW
})

print(demand)
#   region  demand
# 0    NSW   195.0

# FCAS requirement in the region(s) being dispatched.
fcas_requirements = pd.DataFrame({
    'set': ['nsw_regulation_requirement', 'nsw_raise_6s_requirement'],
    'region': ['NSW', 'NSW'],
    'service': ['raise_reg', 'raise_6s'],
    'volume': [10.0, 10.0]  # MW
})

print(fcas_requirements)
#                           set region    service  volume
# 0  nsw_regulation_requirement    NSW  raise_reg    10.0
# 1    nsw_raise_6s_requirement    NSW   raise_6s    10.0

# Create the market model with unit service bids.
market = markets.SpotMarket(unit_info=unit_info,
                            market_regions=['NSW'])
market.set_unit_volume_bids(volume_bids)
market.set_unit_price_bids(price_bids)

# Create constraints that enforce the top of the FCAS trapezium.
fcas_availability = fcas_trapeziums.loc[:, ['unit', 'service', 'max_availability']]
market.set_fcas_max_availability(fcas_availability)

# Create constraints that enforce the lower and upper slope of the FCAS regulation
# service trapeziums.
regulation_trapeziums = fcas_trapeziums[fcas_trapeziums['service'] == 'raise_reg']
market.set_energy_and_regulation_capacity_constraints(regulation_trapeziums)

# Create constraints that enforce the lower and upper slope of the FCAS contingency
# trapezium. These constraints also scale the slopes of the trapezium to ensure the
# co-dispatch of contingency and regulation services is technically feasible.
contingency_trapeziums = fcas_trapeziums[fcas_trapeziums['service'] == 'raise_6s']
market.set_joint_capacity_constraints(contingency_trapeziums)

# Set the demand for energy.
market.set_demand_constraints(demand)

# Set the required volume of FCAS services.
market.set_fcas_requirements_constraints(fcas_requirements)

# Calculate dispatch and pricing.
market.dispatch()

# Return the total dispatch of each unit in MW.
print(market.get_unit_dispatch())
#   unit    service  dispatch
# 0    A     energy     100.0
# 1    A   raise_6s       5.0
# 2    B     energy      95.0
# 3    B   raise_6s       5.0
# 4    B  raise_reg      10.0

# Understanding the dispatch results: Starting with the raise regulation
# service, we can see that only unit B has bid to provide this service, so
# 10 MW of its raise regulation bid must be dispatched. For the raise 6 s
# service, while unit B is cheaper, its provision of 10 MW of raise
# regulation means it can only provide 5 MW of raise 6 s, so 5 MW must be
# provided by unit A. For the energy service unit A is cheaper, so all
# 100 MW of its energy bid is dispatched, leaving the remaining 95 MW to be
# provided by unit B. Also note that these energy and FCAS dispatch levels are
# permitted by the FCAS trapezium constraints. Further explanation of these
# constraints is provided here: docs/pdfs/FCAS Model in NEMDE.pdf.

# Return the price of energy.
print(market.get_energy_prices())
#   region  price
# 0    NSW   75.0

# Understanding energy price results:
# A marginal unit of energy would have to come from unit B, as unit A is fully
# dispatched; this would cost 60 $/MW/h. However, to turn unit B up, you would
# need it to dispatch less raise_6s, which would cost -20 $/MW/h, and the
# extra FCAS would have to come from unit A, which would cost 35 $/MW/h.
# Therefore the marginal cost of energy is 60 - 20 + 35 = 75 $/MW/h.

# Return the price of regulation FCAS.
print(market.get_fcas_prices())
#   region    service  price
# 0    NSW   raise_6s   35.0
# 1    NSW  raise_reg   45.0

# Understanding FCAS price results:
# A marginal unit of raise_reg would have to come from unit B as it is the only
# provider; this would cost 30 $/MW/h. It would also require unit B to provide
# less raise_6s, costing -20 $/MW/h, and extra raise_6s would then be
# required from unit A, costing 35 $/MW/h. This gives a total marginal cost of
# 30 - 20 + 35 = 45 $/MW/h.
#
# A marginal unit of raise_6s would be provided by unit A at a cost of 35 $/MW/h.
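The marginal cost arithmetic behind the co-optimised prices above can be written out directly. Each price is the cost of the marginal provider, minus the cost of the service it backs off, plus the cost of replacing that service elsewhere:

```python
# Bid prices from the example ($/MW).
energy_B = 60.0      # unit B energy
raise_6s_A = 35.0    # unit A raise_6s
raise_6s_B = 20.0    # unit B raise_6s
raise_reg_B = 30.0   # unit B raise_reg

# Energy: turn B up, back off B's raise_6s, replace it with A's raise_6s.
energy_price = energy_B - raise_6s_B + raise_6s_A
print(energy_price)  # 75.0

# Raise regulation: more raise_reg from B displaces B's raise_6s to A.
raise_reg_price = raise_reg_B - raise_6s_B + raise_6s_A
print(raise_reg_price)  # 45.0

# Raise 6s: marginal provision comes directly from unit A.
print(raise_6s_A)  # 35.0
```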

6. Simple recreation of historical dispatch

Demonstrates using Nempy to recreate historical dispatch intervals by implementing a simple energy market with unit bids, unit maximum capacity constraints and interconnector models, all sourced from historical data published by AEMO.

Figure: energy_market_only_qld_prices.png

Results from example: for the QLD region a reasonable fit between modelled prices and historical prices is obtained.

Warning

This script downloads approximately 8.5 GB of data from AEMO. The download_inputs flag can be set to False to stop the script re-downloading data on subsequent runs.

Note

This example also requires plotly >= 5.3.1, < 6.0.0 and kaleido == 0.2.1. Run pip install plotly==5.3.1 and pip install kaleido==0.2.1.

  1# Notice:
  2# - This script downloads large volumes of historical market data from AEMO's nemweb
  3#   portal. The boolean on line 20 can be changed to prevent this happening repeatedly
  4#   once the data has been downloaded.
  5# - This example also requires plotly >= 5.3.1, < 6.0.0 and kaleido == 0.2.1
  6#   pip install plotly==5.3.1 and pip install kaleido==0.2.1
  7
  8import sqlite3
  9import pandas as pd
 10import plotly.graph_objects as go
 11from nempy import markets
 12from nempy.historical_inputs import loaders, mms_db, \
 13    xml_cache, units, demand, interconnectors
 14
 15con = sqlite3.connect('historical_mms.db')
 16mms_db_manager = mms_db.DBManager(connection=con)
 17
 18xml_cache_manager = xml_cache.XMLCacheManager('nemde_cache')
 19
 20# The second time this example is run on a machine this flag can
 21# be set to false to save downloading the data again.
 22download_inputs = True
 23
 24if download_inputs:
 25    # This requires approximately 5 GB of storage.
 26    mms_db_manager.populate(start_year=2019, start_month=1,
 27                            end_year=2019, end_month=1)
 28
 29    # This requires approximately 3.5 GB of storage.
 30    xml_cache_manager.populate_by_day(start_year=2019, start_month=1, start_day=1,
 31                                      end_year=2019, end_month=1, end_day=1)
 32
 33raw_inputs_loader = loaders.RawInputsLoader(
 34    nemde_xml_cache_manager=xml_cache_manager,
 35    market_management_system_database=mms_db_manager)
 36
 37# A list of intervals we want to recreate historical dispatch for.
 38dispatch_intervals = ['2019/01/01 12:00:00',
 39                      '2019/01/01 12:05:00',
 40                      '2019/01/01 12:10:00',
 41                      '2019/01/01 12:15:00',
 42                      '2019/01/01 12:20:00',
 43                      '2019/01/01 12:25:00',
 44                      '2019/01/01 12:30:00']
 45
 46# List for saving outputs to.
 47outputs = []
 48
 49# Create and dispatch the spot market for each dispatch interval.
 50for interval in dispatch_intervals:
 51    raw_inputs_loader.set_interval(interval)
 52    unit_inputs = units.UnitData(raw_inputs_loader)
 53    demand_inputs = demand.DemandData(raw_inputs_loader)
 54    interconnector_inputs = \
 55        interconnectors.InterconnectorData(raw_inputs_loader)
 56
 57    unit_info = unit_inputs.get_unit_info()
 58    market = markets.SpotMarket(market_regions=['QLD1', 'NSW1', 'VIC1',
 59                                                'SA1', 'TAS1'],
 60                                unit_info=unit_info)
 61
 62    volume_bids, price_bids = unit_inputs.get_processed_bids()
 63    market.set_unit_volume_bids(volume_bids)
 64    market.set_unit_price_bids(price_bids)
 65
 66    unit_bid_limit = unit_inputs.get_unit_bid_availability()
 67    market.set_unit_bid_capacity_constraints(unit_bid_limit)
 68
 69    unit_uigf_limit = unit_inputs.get_unit_uigf_limits()
 70    market.set_unconstrained_intermitent_generation_forecast_constraint(
 71        unit_uigf_limit)
 72
 73    regional_demand = demand_inputs.get_operational_demand()
 74    market.set_demand_constraints(regional_demand)
 75
 76    interconnectors_definitions = \
 77        interconnector_inputs.get_interconnector_definitions()
 78    loss_functions, interpolation_break_points = \
 79        interconnector_inputs.get_interconnector_loss_model()
 80    market.set_interconnectors(interconnectors_definitions)
 81    market.set_interconnector_losses(loss_functions,
 82                                     interpolation_break_points)
 83    market.dispatch()
 84
 85    # Save prices from this interval
 86    prices = market.get_energy_prices()
 87    prices['time'] = interval
 88
 89    # Getting historical prices for comparison. Note, ROP price, which is
 90    # the regional reference node price before the application of any
 91    # price scaling by AEMO, is used for comparison.
 92    historical_prices = mms_db_manager.DISPATCHPRICE.get_data(interval)
 93
 94    prices = pd.merge(prices, historical_prices,
 95                      left_on=['time', 'region'],
 96                      right_on=['SETTLEMENTDATE', 'REGIONID'])
 97
 98    outputs.append(
 99        prices.loc[:, ['time', 'region', 'price', 'ROP']])
100
101con.close()
102
103outputs = pd.concat(outputs)
104
105# Plot results for QLD market region.
106qld_prices = outputs[outputs['region'] == 'QLD1']
107
108fig = go.Figure()
109fig.add_trace(go.Scatter(x=qld_prices['time'], y=qld_prices['price'], name='Nempy price', mode='markers',
110                         marker_size=12, marker_symbol='circle'))
111fig.add_trace(go.Scatter(x=qld_prices['time'], y=qld_prices['ROP'], name='Historical price', mode='markers',
112                         marker_size=8))
113fig.update_xaxes(title="Time")
114fig.update_yaxes(title="Price ($/MWh)")
115fig.update_layout(yaxis_range=[0.0, 100.0], title="QLD Region Price")
116fig.write_image('energy_market_only_qld_prices.png')
117fig.show()
118
119print(outputs)
120#                   time region      price       ROP
121# 0  2019/01/01 12:00:00   NSW1  91.857666  91.87000
122# 1  2019/01/01 12:00:00   QLD1  76.180429  76.19066
123# 2  2019/01/01 12:00:00    SA1  85.126914  86.89938
124# 3  2019/01/01 12:00:00   TAS1  85.948523  89.70523
125# 4  2019/01/01 12:00:00   VIC1  83.250703  84.98410
126# 0  2019/01/01 12:05:00   NSW1  88.357224  91.87000
127# 1  2019/01/01 12:05:00   QLD1  72.255334  64.99000
128# 2  2019/01/01 12:05:00    SA1  82.417720  87.46213
129# 3  2019/01/01 12:05:00   TAS1  83.451561  90.08096
130# 4  2019/01/01 12:05:00   VIC1  80.621103  85.55555
131# 0  2019/01/01 12:10:00   NSW1  91.857666  91.87000
132# 1  2019/01/01 12:10:00   QLD1  75.665675  64.99000
133# 2  2019/01/01 12:10:00    SA1  85.680310  86.86809
134# 3  2019/01/01 12:10:00   TAS1  86.715499  89.87995
135# 4  2019/01/01 12:10:00   VIC1  83.774337  84.93569
136# 0  2019/01/01 12:15:00   NSW1  88.343034  91.87000
137# 1  2019/01/01 12:15:00   QLD1  71.746786  64.78003
138# 2  2019/01/01 12:15:00    SA1  82.379539  86.84407
139# 3  2019/01/01 12:15:00   TAS1  83.451561  89.48585
140# 4  2019/01/01 12:15:00   VIC1  80.621103  84.99034
141# 0  2019/01/01 12:20:00   NSW1  91.864122  91.87000
142# 1  2019/01/01 12:20:00   QLD1  75.052319  64.78003
143# 2  2019/01/01 12:20:00    SA1  85.722028  87.49564
144# 3  2019/01/01 12:20:00   TAS1  86.576848  90.28958
145# 4  2019/01/01 12:20:00   VIC1  83.859306  85.59438
146# 0  2019/01/01 12:25:00   NSW1  91.864122  91.87000
147# 1  2019/01/01 12:25:00   QLD1  75.696247  64.99000
148# 2  2019/01/01 12:25:00    SA1  85.746024  87.51983
149# 3  2019/01/01 12:25:00   TAS1  86.613642  90.38750
150# 4  2019/01/01 12:25:00   VIC1  83.894945  85.63046
151# 0  2019/01/01 12:30:00   NSW1  91.870167  91.87000
152# 1  2019/01/01 12:30:00   QLD1  75.188735  64.99000
153# 2  2019/01/01 12:30:00    SA1  85.694071  87.46153
154# 3  2019/01/01 12:30:00   TAS1  86.560602  90.09919
155# 4  2019/01/01 12:30:00   VIC1  83.843570  85.57286
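With modelled and historical prices side by side, their agreement can also be summarised numerically rather than only plotted. A sketch of one way to do this, using a small hand-copied subset of the outputs printed above (the column names match the outputs DataFrame; the subset itself is illustrative):

```python
import pandas as pd

# Illustrative subset of the outputs above: modelled price vs historical ROP.
outputs = pd.DataFrame({
    'region': ['NSW1', 'QLD1', 'NSW1', 'QLD1'],
    'price': [91.86, 76.18, 88.36, 72.26],
    'ROP': [91.87, 76.19, 91.87, 64.99]
})

# Mean absolute error of the modelled price per region.
error = (outputs['price'] - outputs['ROP']).abs()
summary = error.groupby(outputs['region']).mean().round(2)
print(summary)
```

The same pattern applied to the full outputs DataFrame gives a quick per-region accuracy measure across all dispatch intervals.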

8. Time sequential recreation of historical dispatch

This example demonstrates using Nempy to recreate historical dispatch in a dynamic or time sequential manner, meaning the outputs of one dispatch interval become the initial conditions of the next. Note, the infrastructure to include features such as generic constraints in the time sequential model is not currently in place, as the RHS values of many constraints would need to be re-calculated based on the dynamic system state. Similarly, using historical bids in this example is somewhat problematic, as participants also dynamically change their bids based on market conditions. However, for the sake of demonstrating how Nempy can be used to create time sequential models, historical bids are used in this example.
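The feed-forward structure described above can be sketched on its own. The function and argument names here are illustrative stand-ins, not part of the Nempy API; dispatch_one_interval represents the per-interval market setup and solve shown in the full example below:

```python
# A minimal sketch of the time sequential pattern: each interval's unit
# dispatch is fed forward as the initial conditions of the next interval.

def run_time_sequential(intervals, dispatch_one_interval):
    previous_dispatch = None  # First interval falls back to historical initial conditions.
    prices_by_interval = []
    for interval in intervals:
        prices, unit_dispatch = dispatch_one_interval(interval, previous_dispatch)
        prices_by_interval.append(prices)
        previous_dispatch = unit_dispatch  # Becomes the next interval's starting point.
    return prices_by_interval
```

In the full example this feed-forward step is implemented by passing the previous interval's unit dispatch to time_sequential.construct_ramp_rate_parameters.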

Warning

This script downloads approximately 8.5 GB of data from AEMO. The download_inputs flag can be set to False to stop the script re-downloading data on subsequent runs.

  1# Notice:
  2# - This script downloads large volumes of historical market data from AEMO's nemweb
  3#   portal. The boolean on line 20 can be changed to prevent this happening repeatedly
  4#   once the data has been downloaded.
  5# - This example also requires plotly >= 5.3.1, < 6.0.0 and kaleido == 0.2.1
  6#   pip install plotly==5.3.1 and pip install kaleido==0.2.1
  7
  8import sqlite3
  9import pandas as pd
 10from nempy import markets, time_sequential
 11from nempy.historical_inputs import loaders, mms_db, \
 12    xml_cache, units, demand, interconnectors, constraints
 13
 14con = sqlite3.connect('market_management_system.db')
 15mms_db_manager = mms_db.DBManager(connection=con)
 16
 17xml_cache_manager = xml_cache.XMLCacheManager('cache_directory')
 18
 19# The second time this example is run on a machine this flag can
 20# be set to False to avoid downloading the data again.
 21download_inputs = True
 22
 23if download_inputs:
 24    # This requires approximately 5 GB of storage.
 25    mms_db_manager.populate(start_year=2019, start_month=1,
 26                            end_year=2019, end_month=1)
 27
 28    # This requires approximately 3.5 GB of storage.
 29    xml_cache_manager.populate_by_day(start_year=2019, start_month=1, start_day=1,
 30                                      end_year=2019, end_month=1, end_day=1)
 31
 32raw_inputs_loader = loaders.RawInputsLoader(
 33    nemde_xml_cache_manager=xml_cache_manager,
 34    market_management_system_database=mms_db_manager)
 35
 36# A list of intervals we want to recreate historical dispatch for.
 37dispatch_intervals = ['2019/01/01 12:00:00',
 38                      '2019/01/01 12:05:00',
 39                      '2019/01/01 12:10:00',
 40                      '2019/01/01 12:15:00',
 41                      '2019/01/01 12:20:00',
 42                      '2019/01/01 12:25:00',
 43                      '2019/01/01 12:30:00']
 44
 45# List for saving outputs to.
 46outputs = []
 47
 48from time import time
 49
 50unit_dispatch = None
 51t0 = time()
 52# Create and dispatch the spot market for each dispatch interval.
 53for interval in dispatch_intervals:
 54    print(interval)
 55    raw_inputs_loader.set_interval(interval)
 56    unit_inputs = units.UnitData(raw_inputs_loader)
 57    demand_inputs = demand.DemandData(raw_inputs_loader)
 58    interconnector_inputs = \
 59        interconnectors.InterconnectorData(raw_inputs_loader)
 60    constraint_inputs = \
 61        constraints.ConstraintData(raw_inputs_loader)
 62
 63    unit_info = unit_inputs.get_unit_info()
 64    market = markets.SpotMarket(market_regions=['QLD1', 'NSW1', 'VIC1',
 65                                                'SA1', 'TAS1'],
 66                                unit_info=unit_info)
 67
 68    volume_bids, price_bids = unit_inputs.get_processed_bids()
 69    market.set_unit_volume_bids(volume_bids)
 70    market.set_unit_price_bids(price_bids)
 71
 72    violation_cost = \
 73        constraint_inputs.get_constraint_violation_prices()['unit_capacity']
 74    unit_bid_limit = unit_inputs.get_unit_bid_availability()
 75    market.set_unit_bid_capacity_constraints(unit_bid_limit)
 76    market.make_constraints_elastic('unit_bid_capacity', violation_cost)
 77
 78    unit_uigf_limit = unit_inputs.get_unit_uigf_limits()
 79    market.set_unconstrained_intermitent_generation_forecast_constraint(
 80        unit_uigf_limit)
 81
 82    ramp_rates = unit_inputs.get_as_bid_ramp_rates()
 83
 84    # This is the part that makes it time sequential.
 85    if unit_dispatch is None:
 86        # For the first dispatch interval we use historical values
 87        # as initial conditions.
 88        historical_dispatch = unit_inputs.get_initial_unit_output()
 89        ramp_rates = time_sequential.create_seed_ramp_rate_parameters(
 90            historical_dispatch, ramp_rates)
 91    else:
 92        # For subsequent dispatch intervals we use the output levels
 93        # determined by the last dispatch as the new initial conditions
 94        ramp_rates = time_sequential.construct_ramp_rate_parameters(
 95            unit_dispatch, ramp_rates)
 96
 97    violation_cost = \
 98        constraint_inputs.get_constraint_violation_prices()['ramp_rate']
 99    market.set_unit_ramp_up_constraints(
100        ramp_rates.loc[:, ['unit', 'initial_output', 'ramp_up_rate']])
101    market.make_constraints_elastic('ramp_up', violation_cost)
102    market.set_unit_ramp_down_constraints(
103        ramp_rates.loc[:, ['unit', 'initial_output', 'ramp_down_rate']])
104    market.make_constraints_elastic('ramp_down', violation_cost)
105
106    regional_demand = demand_inputs.get_operational_demand()
107    market.set_demand_constraints(regional_demand)
108
109    interconnectors_definitions = \
110        interconnector_inputs.get_interconnector_definitions()
111    loss_functions, interpolation_break_points = \
112        interconnector_inputs.get_interconnector_loss_model()
113    market.set_interconnectors(interconnectors_definitions)
114    market.set_interconnector_losses(loss_functions,
115                                     interpolation_break_points)
116    market.dispatch()
117
118    # Save prices from this interval
119    prices = market.get_energy_prices()
120    prices['time'] = interval
121    outputs.append(prices.loc[:, ['time', 'region', 'price']])
122
123    unit_dispatch = market.get_unit_dispatch()
124print("Run time per interval {}".format((time()-t0)/len(dispatch_intervals)))
125con.close()
126print(pd.concat(outputs))
127#                   time region      price
128# 0  2019/01/01 12:00:00   NSW1  91.857666
129# 1  2019/01/01 12:00:00   QLD1  76.716642
130# 2  2019/01/01 12:00:00    SA1  85.126914
131# 3  2019/01/01 12:00:00   TAS1  86.173481
132# 4  2019/01/01 12:00:00   VIC1  83.250703
133# 0  2019/01/01 12:05:00   NSW1  88.357224
134# 1  2019/01/01 12:05:00   QLD1  72.255334
135# 2  2019/01/01 12:05:00    SA1  82.417720
136# 3  2019/01/01 12:05:00   TAS1  83.451561
137# 4  2019/01/01 12:05:00   VIC1  80.621103
138# 0  2019/01/01 12:10:00   NSW1  91.857666
139# 1  2019/01/01 12:10:00   QLD1  75.665675
140# 2  2019/01/01 12:10:00    SA1  85.680310
141# 3  2019/01/01 12:10:00   TAS1  86.715499
142# 4  2019/01/01 12:10:00   VIC1  83.774337
143# 0  2019/01/01 12:15:00   NSW1  88.113012
144# 1  2019/01/01 12:15:00   QLD1  71.559977
145# 2  2019/01/01 12:15:00    SA1  82.165045
146# 3  2019/01/01 12:15:00   TAS1  83.451561
147# 4  2019/01/01 12:15:00   VIC1  80.411187
148# 0  2019/01/01 12:20:00   NSW1  91.864122
149# 1  2019/01/01 12:20:00   QLD1  75.052319
150# 2  2019/01/01 12:20:00    SA1  85.722028
151# 3  2019/01/01 12:20:00   TAS1  86.576848
152# 4  2019/01/01 12:20:00   VIC1  83.859306
153# 0  2019/01/01 12:25:00   NSW1  91.864122
154# 1  2019/01/01 12:25:00   QLD1  75.696247
155# 2  2019/01/01 12:25:00    SA1  85.746024
156# 3  2019/01/01 12:25:00   TAS1  86.613642
157# 4  2019/01/01 12:25:00   VIC1  83.894945
158# 0  2019/01/01 12:30:00   NSW1  91.870167
159# 1  2019/01/01 12:30:00   QLD1  75.188735
160# 2  2019/01/01 12:30:00    SA1  85.694071
161# 3  2019/01/01 12:30:00   TAS1  86.560602
162# 4  2019/01/01 12:30:00   VIC1  83.843570
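The ramp constraints seeded from the previous interval bound how far each unit's output can move in a single 5 minute dispatch interval. A sketch of the implied limits, assuming ramp rates are expressed in MW/h as in Nempy's ramp constraint interface; the function itself is illustrative, not part of Nempy:

```python
# Illustrative calculation of the output range a ramp-constrained unit can
# reach in one dispatch interval. The 5/60 factor converts a MW/h ramp rate
# into the MW movement possible over a 5 minute interval.
def ramp_limits(initial_output_mw, ramp_up_rate_mw_per_h,
                ramp_down_rate_mw_per_h, interval_minutes=5):
    hours = interval_minutes / 60
    upper = initial_output_mw + ramp_up_rate_mw_per_h * hours
    lower = initial_output_mw - ramp_down_rate_mw_per_h * hours
    return lower, upper

# e.g. a unit at 100 MW with 120 MW/h ramp rates can move to
# roughly 90-110 MW within one 5 minute interval.
```

In the time sequential model the initial_output value comes from the previous interval's dispatch results, so tight ramp rates are one mechanism by which one interval's outcome shapes the next.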