Power Purchase Agreements (PPAs) are a key financial instrument for opening up access to renewable electricity for large commercial users - providing certainty for both consumer and supplier on prices and volumes up to decades in advance. PPAs also provide a hedge for customers with critical processes, such as heavy industry, as well as locking in demand for new renewable power projects at the inception stage.
However, at their core they are highly complex legal agreements that can run into hundreds of pages and cost hundreds of thousands of pounds in fees to draw up. This makes them attractive only when millions of pounds of energy will change hands, effectively locking smaller consumers and suppliers out of access to renewable power. As the cost of electricity from wind and solar plants tumbles, PPAs are becoming more and more popular with commercial customers that do not trust the day-ahead market to deliver the best deals. A critical obstacle to empowering these smaller consumers to transition to a fully renewable supply of electricity via a PPA is reaching the volumes of demand needed to make the legal costs viable.
A variety of instruments have been created to address this problem, the most promising of which is the aggregated PPA. In this arrangement, a group of smaller consumers band together and agree on a fixed price and set of terms, which are presented to a supplier as a single PPA; together, their combined demand volume is equivalent to that of a single large customer. Whilst this is a logical and financially sound approach to overcoming the previous concerns, it opens up a Pandora’s box of fresh challenges - namely, how do we construct an aggregate PPA such that the risk profiles, risk tolerances, legal budgets, and a host of other company preferences can be combined and measured in a way that keeps the deal attractive to the supplier? Smaller consumers also require flexibility, and may not be able to commit to 5-year contracts, let alone 20-year contracts - how can the price and default risk of individual sites be fairly distributed among all parties in an aggregated PPA?
Tackling these challenges is the goal of Zeigo, a London-based startup aiming to democratise access to clean energy for corporates committed to decarbonisation. Through data-driven recommendation systems and novel characterisation methods, Zeigo have developed a platform that matches consumers and suppliers at either end of the PPA syndication process - providing transparency and a level playing field for smaller consumers to band together into aggregate groups. To engage the energy data community and help generate novel ideas to tackle these challenges, Zeigo launched the EcoHack 2020 hackathon on the 9th of October 2020, held virtually across the day with four teams from a wide range of backgrounds. Our mission was broad - to devise solutions to break down the barriers between smaller consumers and PPAs.
Representing Site-Level PPAs by Term Length and Credit Default Risk
In our winning solution we developed a new method for dynamically pricing power for each constituent member of an aggregated PPA, based on its credit default risk, requested annual demand, and length of contract. The price for existing members of the PPA never changes once they have joined, but dynamically pricing new entrants in this way ensures that the additional risk borne by the aggregate PPA as a whole is marginally allocated to the incoming party. It also ensures that the price and default risk seen by the supplier do not change - this is handled by calculating a dynamic exit cost for each buyer to leave the pool, a function of the equivalent day-ahead price they would have paid had they satisfied their demand from the market.
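A minimal sketch of the marginal-allocation idea, assuming a volume-weighted default probability as the pool risk measure and a simple linear mapping from risk to price - the function names, figures, and the risk-cost factor are all illustrative, not Zeigo's actual model:

```python
# Illustrative sketch of marginal risk pricing for a new PPA pool entrant.
# The risk measure and the linear risk-to-price mapping are assumptions.

def pool_default_risk(members):
    """Volume-weighted default probability of the pool (assumed risk measure)."""
    total = sum(m["volume_mwh"] for m in members)
    return sum(m["volume_mwh"] * m["default_prob"] for m in members) / total

def price_new_entrant(members, entrant, base_price, risk_cost_per_unit=50.0):
    """Price the entrant so the extra pool risk it introduces is charged to it alone."""
    risk_before = pool_default_risk(members)
    risk_after = pool_default_risk(members + [entrant])
    marginal_risk = max(risk_after - risk_before, 0.0)
    # Existing members keep their prices; the entrant pays the base price plus
    # a premium proportional to the marginal risk it adds to the pool.
    return base_price + risk_cost_per_unit * marginal_risk

pool = [
    {"volume_mwh": 10_000, "default_prob": 0.02},
    {"volume_mwh": 5_000, "default_prob": 0.05},
]
newcomer = {"volume_mwh": 2_000, "default_prob": 0.10}
print(round(price_new_entrant(pool, newcomer, base_price=45.0), 2))  # prints 45.41
```

Because the premium is charged only on the marginal risk, a low-risk entrant joining a risky pool pays no penalty, while a high-risk entrant bears the full cost of the risk it brings.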
1000 Monte Carlo Default Simulations Using Zeigo's PPA Data
For the sample of PPAs provided to us by Zeigo, we created a Monte Carlo simulation of default rates for each of the sites, based on their demand volume and credit rating. From this we took a number of quantiles and tested which would give the best indicator of individual site default risk. These were then allocated to each site in proportion to its demand volume, and used to derive its dynamic, personalised price.
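A toy version of such a simulation might look like the following - the rating-to-default-probability mapping and the site data are illustrative assumptions, not Zeigo's figures:

```python
import numpy as np

# Toy Monte Carlo of pool default losses. Each site defaults independently
# with a probability implied by its credit rating (mapping is illustrative).
rng = np.random.default_rng(42)
RATING_PD = {"AA": 0.001, "BBB": 0.02, "BB": 0.08}

sites = [
    {"rating": "AA", "demand_mwh": 12_000},
    {"rating": "BBB", "demand_mwh": 6_000},
    {"rating": "BB", "demand_mwh": 3_000},
]

n_sims = 1_000
losses = np.zeros(n_sims)
for site in sites:
    # A default costs the pool that site's full annual demand volume.
    defaults = rng.random(n_sims) < RATING_PD[site["rating"]]
    losses += defaults * site["demand_mwh"]

# Quantiles of the simulated loss distribution as candidate risk indicators.
for q in (0.5, 0.95, 0.99):
    print(f"{q:.0%} quantile loss: {np.quantile(losses, q):,.0f} MWh")
```

The chosen quantile of this loss distribution can then be apportioned across sites by demand volume, as described above.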
In addition to the default risk, we estimated the balancing costs associated with each PPA pool, using production data from existing UK wind farms and example consumption profiles for each demand site. Doing so allowed us to quantify the volume matching cost for each consumer, enabling tailored PPA prices that internalise these externalities. The final prices reflected the capture price the supplier could have achieved in the wholesale market, the default risk of the buyer, the volume matching cost of the buyer, and an uplift to account for additional risks and Zeigo platform costs. An example pool of resulting prices is illustrated below.
Dynamic Prices Allocated to Each Site, Derived From Default Risk, Balancing Exposure, Horizon, and Demand Volume.
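One simple way to turn a volume mismatch into a per-MWh price uplift is to cost the hours where the contracted wind share under-delivers against a site's load at an assumed balancing premium - all profiles and prices below are synthetic stand-ins, not the actual data used:

```python
import numpy as np

# Sketch of a volume-matching cost: hours where the wind profile falls short
# of a site's consumption must be topped up from the market at a premium.
rng = np.random.default_rng(3)
hours = 24 * 7
wind_mw = np.clip(rng.normal(5, 3, size=hours), 0, None)      # contracted wind share
load_mw = 4 + 2 * np.sin(np.arange(hours) * 2 * np.pi / 24)   # daily demand cycle

shortfall_mwh = np.clip(load_mw - wind_mw, 0, None).sum()
balancing_premium = 10.0   # assumed £/MWh cost of topping up from the market
total_demand_mwh = load_mw.sum()

# Spread the total top-up cost over every MWh the site consumes.
volume_matching_cost = balancing_premium * shortfall_mwh / total_demand_mwh
print(f"Volume matching uplift: £{volume_matching_cost:.2f}/MWh")
```

Sites whose consumption lines up poorly with the wind profile receive a larger uplift, which is how the externality gets internalised into each site's price.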
Additionally, we presented a prototype concept for a secondary market that would allow third-party financial institutions to purchase swaps that would effectively exchange one consumer’s PPA arrangements for another’s in a different aggregate PPA, taking a cut of the difference in price between them. This not only introduces liquidity to an otherwise highly illiquid marketplace, but also gives consumers the flexibility to ‘move’ from one PPA to another without bearing any legal costs.
We will be presenting these ideas alongside Zeigo in the near future to the RE100, the largest group of UK companies committed to decarbonisation, and hope to generate some interest in a very exciting but underserved area of renewable energy innovation.
More information on Zeigo and EcoHack 2020 can be found here.
If you want to find out more about how increasing renewables and PPAs are changing the energy system get in touch using firstname.lastname@example.org
At 4am on Saturday 22nd February the UK set a new record for the maximum percentage of wind on its power grid, at 57.2%. This was enabled by strong and gusty weather from the tail-end of Storm Dennis, as well as by low overnight demand; the energy mix over the record-setting period can be seen in the graph below. Whilst this weekend saw a record for the percentage of demand met by wind, the absolute power generated by wind reached its maximum of 17 GW a couple of weeks earlier, on the 10th February, when the national wind fleet reached an instantaneous load factor of 76.8% around midday.
Figure 1) Fuel mix during record setting period
There are a number of factors behind the long-term rise in record levels, shown below. The most prominent is the ever-increasing wind capacity on the system - totalling 22.1 GW at the time of writing - including 0.28 GW recently added by East Anglia One Phase 2 at the end of 2019. More recent wind farm developments have also enabled higher load factors, due to improvements in both turbine technology and the higher wind resource available where they are sited. At the same time overall demand has continued to fall, with only 285 TWh of power consumed in 2019 - down 12% from a decade earlier - making it easier for wind to make up a larger percentage of total generation.
Figure 2) Rising wind penetration record levels
Most importantly, the rising wind levels are helping to further decarbonise the UK’s electricity system. National Grid recently reported that this January had the lowest average power carbon intensity on record, at 209 g/kWh - a remarkable feat considering that a decade ago this figure stood at 519 g/kWh. One of the main drivers has been wind’s displacement of coal from the system. In 2010 coal met an average of 30% of demand, but so far in 2020 it has provided less than 5%. This trend is set to continue, as greater renewable capacity leads to increased cycling for thermal plants and further erodes their profits.
Figure 3) Falling UK power carbon intensity
As new wind penetration records continue to be set every year, so too do new records for balancing service costs and curtailment. Wind is an intermittent resource, with higher speeds found over the border in Scotland, while the largest demand sinks are clustered in the South. This disparity in location highlights the weaknesses in the UK’s existing transmission infrastructure, in particular the two lone transmission lines connecting England and Scotland. This creates problems for National Grid when considerable wind penetration - occurring far more frequently, as shown below - coincides with low demand: wind generators must then be turned off to avoid overloading system capacity.
Figure 4) Rising average daily wind penetration
From December 2018 the 2.2 GW Western Link HVDC connection was intended to provide a third pathway for wind power from Scotland to be transported down to demand in the South, as well as to alleviate congestion during high wind periods. Unfortunately, it has been plagued by schedule overruns and unexpected faults. This sporadic operation does, however, give us insight into the effects of additional transmission capacity on Scottish wind curtailment and the UK grid’s balancing costs - something that will be explored further in a future blog post.
If you want to find out more about how increased renewables are changing the energy system get in touch using email@example.com
Unlike thermal generation, such as coal or gas, renewables have no fuel costs. This means that when generators are bidding to sell their power, renewables can do so at a much lower price than almost any other fuel type. In turn this displaces more expensive thermal generation and means that the overall price at which the market clears ends up lower than it would have been without renewables. This is often coined the Merit Order Effect (MOE) and is shown visually in figure 1a.
Figure 1a) Reduction in market clearing price with increased renewable generation. b) Fitted supply curve from observed price and fuel generation.
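The mechanism can be illustrated with a toy merit-order clearing calculation - all bids, capacities, and demand figures below are invented for the sketch:

```python
# Minimal merit-order sketch: generators offer (price, capacity) pairs, cleared
# cheapest-first; the clearing price is the bid of the marginal unit.
def clearing_price(bids, demand_mw):
    """bids: list of (price_per_mwh, capacity_mw) tuples."""
    for price, capacity in sorted(bids):
        demand_mw -= capacity
        if demand_mw <= 0:
            return price
    raise ValueError("demand exceeds total offered capacity")

thermal = [(30.0, 10_000), (45.0, 8_000), (70.0, 6_000)]
wind = [(0.0, 7_000)]  # near-zero marginal cost

demand = 20_000
print(clearing_price(thermal, demand))          # without renewables
print(clearing_price(thermal + wind, demand))   # wind displaces expensive plant
```

In this toy stack the wind capacity pushes the marginal unit down from the £70/MWh plant to a £45/MWh one - a £25/MWh merit order effect for that settlement period.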
FEA has carried out analysis in which the supply stack of non-renewable generation is approximated for every settlement period within the day-ahead market (an example fit is shown in figure 1b). We have then used this to calculate the price reduction caused by renewables for each half-hour over the last decade. The results, shown in figure 2, indicate that the MOE has grown with renewable capacity, with the long-term trend averaging £9/MWh at the start of 2020.
Figure 2) Rising MOE over the 2010s.
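The essence of the supply-curve approach can be sketched on synthetic data: fit price as a function of non-renewable (residual) demand, then read off the price that renewables avoided. The curve shape, data, and volumes below are all made up for illustration and are not FEA's fitted model:

```python
import numpy as np

# Synthetic price observations driven by residual (non-renewable) demand.
rng = np.random.default_rng(0)
residual_demand = rng.uniform(15_000, 40_000, size=200)               # MW
true_price = 5 + 0.002 * residual_demand + 1e-8 * residual_demand**2  # £/MWh
observed_price = true_price + rng.normal(0, 2, size=200)              # noisy market data

# Quadratic fit as a simple stand-in for the approximated supply stack.
supply_curve = np.poly1d(np.polyfit(residual_demand, observed_price, deg=2))

# MOE for one period: price at full demand minus price at residual demand.
total_demand, renewable_gen = 35_000, 8_000
moe = supply_curve(total_demand) - supply_curve(total_demand - renewable_gen)
print(f"Estimated merit order effect: £{moe:.1f}/MWh")
```

Repeating this for every half-hourly settlement period yields the MOE time series plotted in figure 2.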
For consumers on variable tariffs this is great news - they can maximise their consumption when electricity is both green and cheap - but for the majority the benefits are more nuanced. Because thermal generators are being displaced from the supply stack, they try to raise revenues through other methods, primarily by increasing the price at which they will supply when there is no choice but to source power from them (high-demand/low-renewable periods). For these reasons it is not possible to directly translate the MOE into medium- or long-term savings; instead it should be viewed only as an effect within individual electricity trading settlement periods.
Renewable generators, though, should be aware of the MOE for less positive reasons, as it represents a reduction in their profits: the higher the penetration they reach on the grid, the lower their revenues become. This is often referred to as price cannibalisation, and is worst for renewable generators whose generation is highly correlated with that of the national fleet.
Figure 3) Falling clearing prices for CfD auctions.
It is in part for this reason that the new Dogger Bank wind farm developments (3.6 GW total capacity) are sited almost 300 km from the coast; their large distance from other farms helps mitigate price cannibalisation. The threat of lower revenues relative to the average market price also helps explain why the latest Contracts for Difference (CfD) auctions cleared so low (figure 3), as a CfD guarantees the price generators receive, shielding them from cannibalisation risk. For the government, though, this may lead to higher subsidy pay-outs than were previously expected.
If you want to find out more about price cannibalisation get in touch using firstname.lastname@example.org
With the ever-increasing share of wind generation on the GB system, National Grid ESO and power generators face an increasingly difficult challenge: how to integrate wind capacity to its full potential. Knowing when wind assets can be dispatched after curtailment, or when they can provide frequency response, is challenging without an accurate picture of a site’s instantaneous potential. In October 2019 the Offshore Renewables Catapult, in collaboration with Scottish Power and National Grid, ran an open competition to address precisely this challenge.
The aim was to bring together teams from the private sector and academia with experience in data science and artificial intelligence, to create higher-accuracy predictions of power available using SCADA data and state-of-the-art machine learning approaches. Teams were provided with six months of data for a Scottish Power onshore wind farm of 16 turbines (37 MW) and given 36 hours to build and test models before receiving the evaluation data. Achieving the highest accuracy of all the teams (a mean absolute error of 0.55), Future Energy Associates (FEA) were delighted to win the competition and take home the £10,000 prize.
Early research identified the location of the site and subsequently the technical specifications of the turbines (including the manufacturer, generation capacities and power curves), as well as geodata for the site. With limited SCADA data available (wind speed, generator speed, blade angle), additional measures of apparent power and wind direction were generated to add to the models. From the irregularly sampled SCADA data we sought to keep the highest resolution possible - analysis of the sampling frequency suggested that a 10-second resampling rate would retain the most information whilst avoiding the need for too much interpolation.
Figure 1: Modelled wind direction for sample period.
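Resampling irregular SCADA readings onto a regular 10-second grid can be done straightforwardly with pandas; the timestamps and values below are synthetic stand-ins for the competition data:

```python
import numpy as np
import pandas as pd

# Synthetic irregularly-timestamped SCADA readings over a 10-minute window.
rng = np.random.default_rng(1)
ts = pd.to_datetime("2019-06-01") + pd.to_timedelta(
    np.sort(rng.uniform(0, 600, size=50)), unit="s")
scada = pd.DataFrame({"wind_speed": rng.uniform(4, 12, size=50)}, index=ts)

# Bin onto a regular 10-second grid, interpolating only short gaps so that
# long outages are not papered over with invented values.
regular = scada.resample("10s").mean().interpolate(limit=3)
print(regular.head())
```

Averaging within each bin keeps as much of the raw signal as possible, while the interpolation limit bounds how much the pipeline has to invent.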
After exploratory analysis of the data, a combined bottom-up (turbine level) and top-down (site level) approach was tested. The site-level approach explored autoregressive integrated moving average models with exogenous inputs (ARIMAX) and a random forest based on site-level SCADA data and apparent power, whilst the bottom-up approach tested neural networks, Gaussian processes and gradient boosted random forests. In order to speed up the data processing, an additional curtailment classifier was created using kernel density estimation; this was then added as a feature into the site-level model.
Figure 2: Theoretical power versus apparent power for predicted full (yellow), partial (red) and non-curtailment (blue) periods.
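The idea behind a density-based curtailment flag can be sketched as follows: fit a kernel density estimate to the wind-speed/power cloud under normal operation, then flag points that fall in low-density regions. The data, threshold, and power curve below are synthetic and the approach is only a sketch of the general technique, not FEA's exact classifier:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)
# Synthetic "normal operation": power follows a cubic curve of wind speed.
wind_ok = rng.uniform(3, 15, size=500)
power_ok = np.clip((wind_ok / 12) ** 3, 0, 1) + rng.normal(0, 0.02, size=500)
# Synthetic curtailment: high wind but output capped near zero.
wind_cu = rng.uniform(10, 15, size=60)
power_cu = rng.uniform(0, 0.15, size=60)

# KDE of the normal-operation cloud; low density => off the power curve.
kde = gaussian_kde(np.vstack([wind_ok, power_ok]))
threshold = np.quantile(kde(np.vstack([wind_ok, power_ok])), 0.05)

flags_cu = kde(np.vstack([wind_cu, power_cu])) < threshold
print(f"curtailed points flagged: {flags_cu.mean():.0%}")
```

Because curtailed points sit far below the power curve, their density under the normal-operation KDE is very low, so a simple quantile threshold separates them cheaply before the main model runs.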
With many of the entrants employing similar classes of machine learning models, particularly neural networks and random forests, some more subtle differences in the FEA approach gave us our improved accuracy. Firstly, with only 30 minutes allowed to evaluate the final models, a key element of the approach was a robust pipeline to clean and process the data and produce final predictions. Using a higher sampling resolution in this process was likely a strong contributor to the success of the models, as it gave them more training data. The additional features are also likely to have improved the FEA model, particularly apparent power and the curtailment classifier, which acted as a check on the other model inputs. We also developed a novel proxy for wind direction that attracted the attention of the judging panel.
For more information about power available nowcasting get in touch at email@example.com