EMFC Addresses Head-on the Tectonic Industry Changes

With record attendance in one of the most iconic tourist destinations in the world, the 20th Annual Electric Market Forecasting Conference (EMFC) took place September 6-8 in Las Vegas, NV. This industry-leading conference assembled top-notch speakers and offered attendees an exclusive networking experience from start to finish.

The pre-conference day featured in-depth sessions designed to maximize the value of the Aurora software for its users. Advanced sessions included discussions on resource modeling and improving model productivity, recent database enhancements including the disaggregation of U.S. resources, an update on the nodal capability and data, and other model enhancements.


Michael Soni, Economist, Support | EPIS

Before the afternoon Users' Group meeting started, EPIS announced that it was dropping "xmp" from the name of its flagship product, now simply Aurora, and unveiled a fresh logo. Ben Thompson, CEO of EPIS, said, "The new logo reflects our core principles of being solid and dependable, of continuously improving speed and performance, and of our commitment to helping our customers be successful well into this more complex future."

That evening, attendees kicked off the main conference with a night under the stars at Eldorado Canyon for drinks, a BBQ dinner and a tour of the Techatticup Mine, the oldest, richest and most famous gold mine in Southern Nevada.


Eldorado Canyon, Techatticup Mine

On Thursday, thought leaders from across the industry presented various perspectives on the complex implications that recent industry changes will have on grid operations, future planning and investments. The forum session opened with Arne Olson, a partner with E3 Consulting in San Francisco, discussing California's proposed legislation SB-100, which would mandate that 100% of California's energy be met by renewable sources by 2045, along with the bill's implications for Western power markets and systems. He pointed out that SB-32, last year's expansion of earlier legislation, which mandates a 40% reduction in GHG emissions below 1990 levels by 2030, is actually more binding than SB-100. He explained the economics of negative prices and why solar output will be increasingly curtailed, and posited that CAISO's famous "duck curve" is becoming more of an economic issue than the reliability issue it was originally intended to illustrate.

Other Thursday morning presentations included "The Rise of Utility-Scale Storage: Past, Present, and Future" by Cody Hill, energy storage manager for IPP LS Power, who outlined the advances in utility-scale lithium-ion batteries and their expected contributions to reserves as well as energy; Masood Parvania, Ph.D., professor of electrical and computer engineering at the University of Utah, who described recent advances in continuous-time operation and pricing models that more accurately capture and compensate for the fast-ramping capability of demand response (DR) and energy storage devices; and Mahesh Morjaria, Ph.D., vice president of PV systems for First Solar, who discussed innovations in PV solar module technology, plant capabilities and integration with storage.


Masood Parvania, Ph.D., Director – Utah Smart Energy Lab | The University of Utah

The afternoon proceeded with Mark Cook, general manager of Hoover Dam, who gave a fascinating glimpse into the operations and improvements of one of the most iconic sources of hydro power in the country; and concluded with Lee Alter, senior resource planning analyst and policy expert for Tucson Electric Power, who shared some of the challenges and lessons learned in integrating renewables at a mid-sized utility.

Networking continued Thursday afternoon with a few of the unique opportunities Las Vegas offers. In smaller groups, attendees were able to better connect with each other while enjoying one of three options: a foodie tour, swinging clubs at TopGolf, or solving a mystery at the Mob Museum.

The final day of the conference was devoted to giving Aurora clients the opportunity to see how their peers are using the software to solve complex power market issues. It featured practical discussions on how to model battery storage, ancillary services and the integration of renewables, and an analysis of the impact of clean energy policies, all using Aurora.

The conference adjourned and attendees headed out for a special tour of the Hoover Dam, which included a comprehensive view of the massive dam and its operations and highlighted many of the unique features around the site.


Hoover Dam, Power Plant Tour

The EMFC is a once-a-year opportunity for industry professionals. The 20th Annual EMFC addressed head-on the tectonic industry changes (occurring and expected) from deep renewable penetration, advances in storage technologies, and greater uncertainty. Join EPIS next year for the 21st Annual EMFC!

For more information on the 2017 speakers, please visit http://epis.com/events/2017-emfc/speakers.html
To obtain a copy of any or all of the presentations from this year's EMFC, Aurora clients can go to EPIS's Knowledge Base website using their login credentials. If you do not have login credentials, please email info@epis.com to request copies.


How Good is the EIA at Predicting Henry Hub?

Natural gas power plants are a key component of bulk electrical systems in North America. In the U.S., natural gas power plants made up the largest portion of installed capacity, 42%, as of December 2016 and contributed more to generation than any other source. In Mexico, natural gas power plants supplied 54% of the required electricity in 2015 and are a key component of the capacity additions under development in the national electrical system. Natural gas is also likely to remain the primary energy source for power generation in the U.S. due to increased regulation on coal units, uncertainty around the future of nuclear generation, and low natural gas prices.

Natural gas prices are a critical driver of electricity prices and a key input variable in electric power models. Due to the large amount of natural gas power plants in North America, and because fuel costs are the largest cost component of a thermal power plant, wholesale electricity prices are tightly coupled with natural gas prices. There is also an important feedback loop, in that natural gas demand, and price, is tightly coupled to the operation of natural gas power plants. Understanding the interplay between gas and power markets, and uncertainties in forecasts, is critical for forecasting either.

The U.S. Energy Information Administration (EIA) provides short-term natural gas price forecasts through the Short-Term Energy Outlook (STEO) and long-term forecasts through the Annual Energy Outlook (AEO). For the purposes of this article, we will focus on the STEO. The STEO is a monthly report with, among other items, a natural gas consumption and price forecast extending 13 to 24 months into the future, depending on the month published. The model predicts consumption and prices for three sectors (commercial, industrial, and residential) in the nine U.S. census divisions. To do this, the model calculates natural gas consumption and supply levels to build an inventory. Prices are derived from a regression equation using the inventory and heating and cooling degree days, and analysts then make adjustments for final prices. Detailed information on each equation and method is provided in the EIA's Natural Gas Consumption and Prices documentation.
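As a rough illustration of that approach (and not the EIA's actual model), a price regression against inventory and degree days can be sketched in a few lines of Python; all of the input numbers below are hypothetical placeholders:

```python
# Minimal sketch of a price regression on storage inventory and degree days,
# in the spirit of the STEO documentation. Data are invented placeholders.
import numpy as np

inventory = np.array([3200., 3400., 3100., 2600., 2200., 2500.])  # Bcf, hypothetical
hdd       = np.array([400.,  300.,  150.,   50.,  500.,  700.])   # heating degree days
cdd       = np.array([  0.,   20.,  120.,  300.,   10.,    0.])   # cooling degree days
price     = np.array([2.9,   2.7,   2.8,   3.1,   3.6,   3.4])    # $/mmbtu, hypothetical

# Design matrix with an intercept term, fit by least squares
X = np.column_stack([np.ones_like(price), inventory, hdd, cdd])
coef, *_ = np.linalg.lstsq(X, price, rcond=None)

# Predict the next month from assumed drivers, then apply an analyst adjustment,
# as the EIA notes its analysts do before publishing final prices.
next_month = np.array([1.0, 2400.0, 650.0, 0.0])
model_price = next_month @ coef
analyst_adjustment = 0.10
forecast = model_price + analyst_adjustment
print(f"regression price: {model_price:.2f}, adjusted forecast: {forecast:.2f} $/mmbtu")
```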

How good is the EIA at forecasting natural gas prices from a month to a year out?

To evaluate the STEO forecasts of natural gas prices, we downloaded each monthly STEO report from January 2012 to December 2016 to allow for at least a full year of analysis with historical prices. This period was selected because it is representative of the current trend of low natural gas prices (relative to historical). The mean absolute error (MAE) and mean absolute percent error (MAPE) were calculated for each forecasted value. Prices were then evaluated for the first forecast in each year and a subset of forecasts from consecutive months during a price spike. The mean absolute percent error was also evaluated for each report year and across all reports.
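For reference, both error metrics are straightforward to compute; the sketch below uses made-up price vectors rather than the actual STEO data:

```python
# Sketch of the two error metrics used in this analysis: mean absolute error
# (MAE) and mean absolute percent error (MAPE) for one forecast vs. history.
import numpy as np

actual   = np.array([3.73, 4.71, 6.19, 4.90, 4.58])   # $/mmbtu, illustrative
forecast = np.array([3.80, 4.10, 4.40, 4.30, 4.20])   # $/mmbtu, illustrative

abs_err = np.abs(forecast - actual)
mae  = abs_err.mean()                       # $/mmbtu
mape = (abs_err / actual).mean() * 100.0    # percent

print(f"MAE = ${mae:.2f}/mmbtu, MAPE = {mape:.1f}%")
```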

For the period analyzed (2012 to 2016, highlighted in the figure below), the wholesale Henry Hub gas price averaged $3.30/mmbtu, with a high of $6.19/mmbtu in early 2014 due to extreme Northeast weather (i.e., the polar vortex) and a low of $1.78/mmbtu in late 2016 due to warm weather and a large amount of gas in storage. This period is representative of relatively low natural gas prices compared to the previous five-year period, when high oil prices pushed prices above $10/mmbtu and the average was $5.63/mmbtu despite the sharp decline during the 2008-2009 financial crisis.


Figure 1. Historical Henry Hub natural gas prices. The yellow period denotes the study period used for this analysis. Source: EIA.

We started by looking at the longest-term forecasts (24 months), which are delivered in January of each year, and found that they were unable to capture the rapid price fluctuations in the study period:


Figure 2. Historical Henry Hub gas prices with 24-month forecasts from the January STEO of each year, 2012 through 2015, using the base case data. Source: EIA STEO.

The January 2012 forecast missed the sharp reduction in prices from the winter to the summer that was driven by high storage volumes. Less volatility occurred over the first part of the January 2013 forecast period; however, this forecast missed the large increase in prices to over $6/mmbtu that was driven by extreme weather conditions. The January 2014 forecast also missed the weather-driven high prices for this period and then was high-biased in the later months of the forecast. The January 2015 forecast was high-biased for the entire forecast period and missed the lower prices, which were driven by a combination of mild weather and high storage volumes.

The STEO forecast is very sensitive to the initial conditions, or starting month's price. For example, plotting each month's forecast during the increase from $3.74/mmbtu in November 2013 to $6.19/mmbtu in February 2014 shows the impact of the rapid change in initial condition (last known price) on the first forecasted month's value:


Figure 3. Historical Henry Hub gas prices with forecasted values from the months leading up to the rapid price spike in February 2014.

Presumably the long-term fundamental drivers of the STEO do not change as much as the initial conditions, and thus the longer-term forecast is much less sensitive to initial conditions.

Despite missing these fluctuation events, on average across the years analyzed the STEO is within 8% of the actual price in the first month of the forecast, within 25% out to eight months, and within 33% out to 13 months:


Figure 4. Mean absolute percent error calculated for each forecasted month of STEO reports. Data are averaged over a report year, as well as over all of the report years. Maximum and minimum percent error is calculated over all STEO reports.

On average, error increases with forecast length; however, this does not occur in the 13-month forecasts of the 2012 or 2013 STEOs. The expected error growth with time does appear in the 2014 and 2015 STEOs, reaching nearly 60% in the 2014 STEO. The maximum percent error in any given forecast grows rapidly from 26% in the first forecasted month to 75% in the fourth forecasted month, and reaches a high of over 100% at 12 and 13 months out.

In absolute terms, the error ranges on average from $0.25/mmbtu in the first forecasted month to $0.88/mmbtu 13 months out. Maximum and minimum errors range from less than a penny up to $2.45/mmbtu.


Figure 5. Mean absolute error calculated for each forecasted month of STEO reports. Data are averaged over a report year, as well as over all of the report years. Maximum and minimum absolute error is calculated over all STEO reports.

Is the STEO forecast good enough? Unfortunately, as with many answers, it depends. More important, however, is understanding the limitations and uncertainties in its gas forecasts. If you rely on EIA forecasts, you must account for the sensitivity to initial conditions and the typical error growth over the first months to a year of the forecast. With this information, sensitivity studies can be formulated to capture possible fluctuations in gas prices. Taken together with other uncertainties such as demand, transmission outages, and plant outages, you can begin to form an ensemble of forecasts.
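One simple, hedged way to build such sensitivities is to widen a base gas price forecast by the average error observed at each forecast horizon; the sketch below is illustrative only, with approximate MAPE values in the spirit of Figure 4:

```python
# Sketch: turn horizon-dependent error statistics into high/low gas price
# sensitivities around a base forecast. All numbers are illustrative.
import numpy as np

base_forecast = np.array([3.00, 3.05, 3.10, 3.20, 3.30])   # $/mmbtu, months 1-5
mape_by_month = np.array([0.08, 0.12, 0.17, 0.21, 0.25])   # approximate, per Figure 4

high_case = base_forecast * (1.0 + mape_by_month)
low_case  = base_forecast * (1.0 - mape_by_month)

for m, (lo, base, hi) in enumerate(zip(low_case, base_forecast, high_case), start=1):
    print(f"month {m}: low {lo:.2f}  base {base:.2f}  high {hi:.2f}  $/mmbtu")
```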


EPIS Releases New Version of AURORA

Version 12.3 introduces significant enhancements

Las Vegas, Nevada – April 25, 2017 — EPIS (www.epis.com), the market leader in power market
simulation, forecasting and analysis, announced the release of version 12.3 of its AURORA
software at the Platts Global Power Markets™ Conference. The latest version boasts a number
of enhancements to storage logic, ancillary services, long-term logic, RPS modeling,
and nodal capabilities.

AURORA 12.3 further solidifies its position as the most valuable power market forecasting
and analysis software on the market today. It is fast, easy to use, and transparent. Upgrades in
the new version include:

  • Enhanced Storage Logic—improved ability to model the intricacies of renewable and
    storage integration, electric vehicles, and other technologies.
  • Ancillary Services Enhancements—significant enhancements, including sub-hourly
    dispatch and use in nodal studies, and improved MW reporting for simultaneous
    contributions to multiple products.
  • Improved RPS Modeling—offers a new option to identify resources not eligible to set
    capacity prices, especially useful when modeling RPS policies where renewable
    resources must be built but cannot participate in capacity markets. Also, RPS constraints
    can now be input as a percentage of demand or MWh value, giving more flexibility to
    specifying RPS targets over time.
  • Long-Term (LT) Capacity Expansion Logic Enhancements—users now have the option to
    change dispatch-hour sampling dynamically, accelerating studies while still providing
    detail on the final production run.
  • New LT Constraint Types—including capacity and energy max limits, which provide
    more flexibility for build decisions to targets in LT studies.
  • New LT Reporting Option—a new build report output table makes it easy to quickly
    see which constraints were binding (min, max by technology/fuel/area).
  • Nodal SCUC—version 12.3 also includes an exciting new option to run a full
    security-constrained unit commitment (SCUC). The mixed-integer program that performs
    the commitment decisions now accounts for nodal constraints, including branch,
    corridor, and contingency constraints. The new SCUC ability is in addition to a new,
    proprietary solving method that significantly speeds nodal analysis.

AURORA v.12.3 is further enhanced by the proven and calibrated databases that either
come with the license or are available as add-ons, including U.S.-Canada, Europe and Mexico. The calibrated
datasets simplify meaningful forecasting. All AURORA databases include a base-case 25-year
power price forecast and generator capacity expansion and retirement plan. The sources and
procedures used to update the data are thoroughly documented. Updates to the databases are
provided under the annual AURORA license.

For the past 20 years, AURORA has had a reputation for being best-in-class, with unmatched
support. Version 12.3 further establishes its position as the leader in power market forecasting
and analysis.


Top 10 Pieces of Advice from AURORAxmp Support Experts

Recently, Power Market Insights asked EPIS’s support team to share their top tips for making the most of AURORAxmp. EPIS is known for having best-in-class support and the experts on the team had some very useful advice to share.

Using Both AURORAxmp’s Help Feature and the Website’s Knowledge Base

  1. Take advantage of context-sensitive Help. One very useful feature, especially when first learning the model, is the context-sensitive Help. You can always learn more about a specific form or column/table in the model by selecting it and pressing F1 on your keyboard. The Help document contains a wealth of information about all aspects of the model and how it works, making it a valuable reference for users from beginners to experts.
  2. Utilize the online Knowledge Base. Although Help is an excellent way to familiarize yourself with the nuts and bolts of AURORAxmp, the online Knowledge Base on the Support website contains a catalog of presentations on topics that can help you learn the model faster. You can find a compilation of presentations that were given at past conferences, like our annual Electric Market Forecasting Conference or our Spring Group Trainings, not only from EPIS employees but other AURORAxmp users as well. Many of the presentations give step-by-step examples on how to set up different inputs in the model. Using the Knowledge Base alongside the Help document is a great way for a user to thoroughly understand specific areas of AURORAxmp.

Working with AURORAxmp Inputs and Outputs

  1. Be judicious with output reporting. Output databases can grow quickly, which can also increase runtimes. Be sure to limit reporting to just the data you need by using the Report column, available in most input tables. By setting the Report column to TRUE and de-selecting the All Items box in the Run Setup > Reporting form for that output table, you can limit output to just the items you are most interested in. Couple this with the Custom Columns feature, which reports only the columns you need, and you'll have a perfectly tailored output database.
  2. Take advantage of the dbCompare tool. You can compare either Input or Output databases and then save the results to Excel. In both cases, you can keep a permanent record of the differences, without having to review multiple change sets or manually compare outputs.
  3. Avoid errors due to improper permissions. Check to make sure that your folder permissions are correct, which will save time in the long run. In some IT environments with enhanced user access security, it may be necessary for your IT team to give you additional rights to certain folders on the system. Contact our Support Team to find out which folders need read and write access.
  4. Test changes in small batches. Take a look at the Data Management article in the Help’s Knowledge Base. When planning to make large sets of changes, it is wise to test them in small batches.  Specifically, perform a short AURORAxmp run after each batch to ensure data was entered properly and is flowing through the model as intended. It is simple to set your period/hours to something very short and fast and direct output to a temporary database. This practice alone can save significant time and effort in tracing troublesome input data.

Managing, Saving, Authenticating

  1. Use Tab My Forms. Many people are unaware of the Tab My Forms option, which can help organize multiple AURORAxmp windows on their screens. It can be found under Tools > Options > User Preferences. Along the same lines, if you right-click on a tab, you can select Close All But This to help clean up your screen when you have too many tabs open.
  2. Create an archive of your project. If you think you will need to replicate results in the future, create an archive. Archives are great for packaging all the file components of your project into a single .zip file and can easily be transferred to colleagues or used to store a project that you may need to revisit. Once an archive is opened, the project contains everything you need to replicate the output—the same database, change sets, and project settings. Once unarchived, you simply hit Run to replicate the output. This can come in handy if you are asked to replicate output or verify input parameters and run settings.
  3. Know your SQL Server authentication options. AURORAxmp supports two methods to authenticate with your SQL Server: Windows Active Directory-based and SQL Server-based. Windows Active Directory is typically used when individual users are writing output that doesn't need to be accessed or modified by other users in the organization. SQL Server authentication is best used when output files are going to be shared by multiple users; in this case, some organizations prefer a single, shared SQL Server login (see the sketch below).
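For reference, the two authentication modes typically differ only in the connection string. The sketch below uses the generic pyodbc package with placeholder server, database, and login names; it is not AURORAxmp-specific configuration.

```python
# Illustration of the two SQL Server authentication styles as ODBC connection
# strings. Server, database, and login names are placeholders.
import pyodbc

# Windows (Active Directory) authentication: the connecting user's Windows
# identity is used, so no credentials appear in the string.
win_auth = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlhost01;DATABASE=AuroraOutput;Trusted_Connection=yes;"
)

# SQL Server authentication: a shared SQL login, handy when several analysts
# read and write common output databases.
sql_auth = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlhost01;DATABASE=AuroraOutput;UID=aurora_team;PWD=example_password;"
)
```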

Hardware

  1. Understand which computer hardware is best. Considering new hardware? AURORAxmp runs best on physical hardware with fast RAM, a fast CPU and speedy disks. Low-latency RAM and a good memory controller seem to have the greatest impact on runtime, followed closely by a fast CPU. While AURORAxmp will take advantage of threading in a variety of places, for a single case, CPUs with strong single-threaded performance and a high clock speed seem to perform best. The fastest AURORAxmp runtimes have been observed on overclocked physical hardware with low-latency RAM.

Of course, the support experts at EPIS can help with any questions or issues you may have. Next time you talk to one of them, be sure to ask about tips and tricks for maximizing the power of AURORAxmp.

What’s your favorite trick or tip? Share it in the comments section.


Reserve Margins

Discussing reserve margins is often convoluted because of the various definitions and intricacies. The basic principle is that reserve capacity is used to ensure adequate power supply. Different types of reserves are defined in terms of various time scales. In the short term, operating reserves are used to provide adequate supply in the case of sudden plant or transmission outages. In the long term, planning reserves are used to ensure adequate power supply given a forecasted load in the years ahead. Both types of reserves are often expressed as a ratio of excess capacity (i.e., available capacity less demand) to demand. In this blog post, we will discuss planning reserves: typical values, historical trends, market-to-market differences, and modeling within AURORAxmp.
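Written out (for the planning case, with demand taken as the forecast peak), that ratio is simply:

```latex
\text{planning reserve margin (\%)} =
  \frac{\text{available capacity} - \text{peak demand}}{\text{peak demand}} \times 100
```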

Planning Reserves

Without adequate planning reserves, new generation may not be built in time, ultimately causing power disruptions. But what is adequate? In 2005, Congress passed the Energy Policy Act of 2005, which requires the North American Electric Reliability Corporation (NERC) to assess the reliability of the bulk power system in North America. Part of NERC's responsibility is to periodically publish Long-Term Reliability Assessments (LTRAs), which include planning reserve targets, or reference margins. Usually these are based on information provided by each governing body (e.g., ISO, RTO, etc.) in the assessment area. If no such information is available, NERC sets the reference margin to 15% for thermal-dominated systems and 10% for hydro-dominated systems. For the 2015 LTRA, the NERC reference margins range from 11% to 20% across the assessment areas, as shown in Figure 1. The highest reference margin, 20% for NPCC Maritimes, is due to a disproportionate amount of load being served by large generating units.


Figure 1. 2016 Planning reserve margins by NERC assessment area from the 2015 LTRA.
The gold bars represent assessment areas with capacity markets.

In addition to providing reference margins, or published targets from other entities, NERC publishes yearly anticipated planning reserve margins, out 10 years, for 21 assessment areas in North America.  To do this, NERC collects data on peak demand and energy, capacity, transmission and demand response from NERC regional entities.  Data submission is usually due in the first quarter of the report year.  This strategy represents a bottom-up approach to understanding reliability.

Forecasting Anticipated Planning Reserve Margins

Forecasted anticipated planning reserve margins can vary substantially from assessment year to assessment year, area to area, and as a function of markets. To illustrate this, one-, five-, and 10-year forecasted anticipated planning reserve margins for PJM and ERCOT are shown in Figure 2. The variability in anticipated planning reserve margin is similar between the assessment areas and increases with the length of the forecast, presumably due to increasing uncertainty in forecasts as a function of time. Interestingly, the number of years with shortfalls (fewer reserves than the target) is much larger in ERCOT than in PJM. PJM has a three-year forward capacity market and ERCOT is an energy-only market; therefore, there is more incentive for long-term excess capacity in PJM.


Figure 2. Planning reserve margins targets (dashed line) and one-, five-, and 10-year anticipated planning reserve margin from the 2011 to 2015 NERC LTRAs.

As shown above, in both ERCOT and PJM the year-ahead anticipated planning reserve margins are adequate, suggesting long-term planning approaches are working in both markets; however, regional complexities can pose problems. For example, MISO recently published the 2016 Organization of MISO States (OMS) Survey to assess planning reserve margins. In 2017, shortfalls are predicted in three zones – IL, MO, and Lower MI. Excess capacity from other zones will be transferred to make up for the shortfall in the short term. Similar to the NERC forecasts, uncertainty in the regional forecasted load is key to this issue and may increase or decrease the shortfall.

In addition to regional issues, the rapidly changing generation mix also poses challenges for quantifying adequate planning reserves. NERC has recognized this and has called for new approaches for assessing reliability in both the 2014 and 2015 LTRAs. One specific issue is the disruption of traditional load shapes by added solar resources. A typical summer-peaking system may face reliability issues in the winter or other expected off-peak months when demand is still high but solar output is low. Considering secondary demand peaks, and thus planning reserve margins, may be prudent in these situations.

AURORAxmp and Planning Reserve Margins

In AURORAxmp, planning reserve margins are used in the long-term capacity expansion logic to guide new resource builds. Our Market Research and Analysis team updates planning reserve margins annually based on the latest NERC LTRA. Planning reserve margins can be specified at the pool or zone level, easily facilitating studies at varying spatial scales. Risk studies can be conducted to quantify the impacts of uncertainty in each aspect of planning reserve margins on long-term resource builds. Together these features support cutting-edge analysis surrounding the complexities of reserves.


19th Annual Electric Market Forecasting Conference to Focus on the Future of Energy Markets

The 2016 Electric Market Forecasting Conference (EMFC), a leading gathering of industry strategists and executives, will feature in-depth discussions on the driving forces of today’s energy markets. The 19th annual conference, organized by EPIS, LLC, will bring together a stellar lineup of speakers as well as senior executives in the industry.  The EMFC will be held at the Atlanta Evergreen Marriott Conference Resort in Atlanta, Georgia, September 14-16, 2016.


The EMFC features an optional one-day pre-conference training for both new and advanced power market modelers, as well as an AURORAxmp Users' Group Meeting. Both clients and non-clients are welcome to attend. The two-day conference will include presentations and case studies from industry experts, as well as special events and networking opportunities. Speakers include: Larry Kellerman, managing partner of Twenty First Century Utilities; Morris Greenberg, managing director of gas and power modeling at PIRA Energy Group; and Jeff Burleson, VP of system planning at Southern Company. A full list of speakers is available at http://epis.com/events/2016-emfc/speakers.html.

“Over the past 19 years, the Electric Market Forecasting Conference has become established as a valuable, strategic gathering for clients and non-clients alike,” said Ben Thompson, CEO of EPIS. “It is an event where executives and peers in the industry gather to share market intelligence and discuss the future of the industry.”

EMFC has developed a reputation for being an event that delivers real, actionable intelligence, not just abstract concepts. The organizers focus on an agenda filled with speakers who can share experience and takeaways that can be used to have a positive impact on attendees’ organizations. The conference’s intimate environment allows participants to create lasting relationships with peers and luminaries alike.

Now in its 19th year, EMFC is an essential conference for power industry professionals to come together to share best practices and market intelligence. The one-day pre-conference allows AURORAxmp users to learn techniques to master the AURORAxmp application and maximize ROI. More information can be found at: http://epis.com/events/2016-emfc/index.html.


The Algorithms at the Core of Power Market Modeling

In 2007, the U.S. government formed the Advanced Research Projects Agency-Energy (ARPA-E), which encourages research on emerging energy technologies. Last year this agency awarded about $3.1 million to the Pacific Northwest National Laboratory (PNNL) to work on a computational tool called High-Performance Power-Grid Operation (HIPPO) over the next few years. The research team will be led by an applied mathematician at PNNL and be partnered with GE's Grid Solutions, MISO, and Gurobi Optimization. The group will seek improved ways to solve the unit commitment problem, "one of the most challenging computational problems in the power industry." The work highlights the general trend over the past twenty years in this and other industries to turn to mathematical optimization for answers to some of the most difficult scheduling and planning problems. What's astounding is the rate at which commercial mathematical solvers have been able to respond to these needs with enormous leaps in algorithmic efficiency over a relatively short period of time.

At the core of most of the mathematical optimization used in power modeling is linear programming (LP). Linear programs are problems in which some linear function is maximized or minimized subject to a set of linear constraints. The mathematician George Dantzig invented the simplex algorithm in 1947, in advance of the day when computers could really take advantage of it. For example, in 1953 one implementation of the algorithm on a Card Programmable Calculator (CPC) could solve a certain 26-constraint, 71-variable instance of the classic Stigler Diet Problem in about eight hours. As computer technology advanced, though, the usefulness and power of the simplex algorithm specifically, and linear programming in general, became apparent. Advances in the algorithm combined with exponential computer speed improvements made linear programming a staple in problem solving by the early 2000s. In fact, algorithmic progress in linear programming (i.e., independent of computer speed improvements) gave a 3,300x improvement factor from 1988 to 2004. Coupled with actual computer machine improvements of 1,600x over the same time horizon, this produced a 5,280,000x average improvement for solving linear programs!
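For readers who have not seen an LP outside a textbook, the sketch below sets up a tiny diet-style problem (made-up numbers, in the spirit of the Stigler problem) and solves it with SciPy's linprog:

```python
# A toy diet-style linear program: minimize the cost of two foods subject to
# nutrient minimums. scipy.optimize.linprog minimizes c @ x subject to
# A_ub @ x <= b_ub and variable bounds. All numbers are invented.
from scipy.optimize import linprog

c = [0.60, 0.35]                 # cost per unit of food 1 and food 2

# Require at least 60 units of protein and 40 of iron; linprog uses <=,
# so the "at least" rows are multiplied by -1.
A_ub = [[-8.0, -4.0],            # protein per unit of each food
        [-3.0, -6.0]]            # iron per unit of each food
b_ub = [-60.0, -40.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)], method="highs")
print(res.x, res.fun)            # optimal quantities and minimum cost
```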

While progress on linear programs has somewhat plateaued in recent years, improvements in mixed-integer programming (MIP) have continued at impressive rates. In its simplest form, a mixed-integer program is a linear program in which some of the variables are restricted to integer values. This integer-value restriction makes the problem so difficult that it is NP-hard, meaning that a guaranteed polynomial-time algorithm for all MIPs will most likely never be found. And yet the MIP is at the center of an ever-increasing number of practical problems, like the unit commitment problem that the HIPPO tool mentioned above is meant to address, and it is only relatively recently that it became a practical problem-solving tool. According to one expert and active participant in the field, Robert Bixby,

“In 1998 there was a fundamental change in our ability to solve real-world MIPs. With these developments it was possible, arguably for the first time, to use an out-of-the box solver together with default settings to solve a significant fraction of non-trivial, real-world MIP instances.”

He provided this chart showing the improvements in one MIP solver, CPLEX, from 1991 to 2007:

Figure 1. CPLEX Version-to-Version Pairs. Source

This chart shows that over approximately 16 years, the machine-independent speed improvement was roughly 29,000x! The progress on developing fast algorithms to solve (or at least find good solutions to) mixed-integer programs has been simply explosive.

The importance of this development is highlighted by extensive use of MIPs by regional reliability organizations in the United States. An independent review published by the National Academies Press states that:

In the day-ahead time frame, the CAISO, ERCOT, ISO-NE, MISO, PJM, and SPP markets employ a day-ahead reliability unit commitment process… The optimization for the day-ahead market uses a dc power flow and a mixed integer program for optimization.

In other words, the MIP is at the core of day-ahead market modeling for these major reliability organizations. A presentation given a few years back by PJM shows their increasing need to solve very difficult MIPs in a shorter time frame. The presentation highlights the fact that PJM has a "major computational need" for "better, faster MIP algorithms and software." The short slide deck states three times, in different contexts, the need in PJM for "even faster dynamic MIP algorithms." The entity must solve its day-ahead model for the security-constrained unit commitment (SCUC) problem in a four-hour window towards the end of each day, and the presentation explains that they "have a hard time solving deterministic SCUC models in the time allotted." So the need for ever-improving mixed-integer programs in the energy industry doesn't seem to be going away any time soon. And with the increasing complexity of problems such as renewable integration, sub-hourly modeling, and the handling of stochastics, the push for "better, faster MIP algorithms" will only continue.
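To make the object of all that effort concrete, the sketch below is a toy unit-commitment MIP: two units, three hours, binary on/off decisions with minimum and maximum output and a load balance. It is not PJM's (or any ISO's) formulation, and the numbers are invented; it simply shows why integer variables enter the problem. It uses the PuLP modeling package, but any MIP layer would look similar:

```python
# Toy unit-commitment MIP: commit/dispatch two units over three hours.
import pulp

units = {"coal": {"pmin": 50, "pmax": 150, "cost": 25.0, "fixed": 500.0},
         "gas":  {"pmin": 20, "pmax": 100, "cost": 40.0, "fixed": 100.0}}
load = [80, 160, 120]                      # MW per hour, hypothetical

prob = pulp.LpProblem("toy_uc", pulp.LpMinimize)
on  = {(u, t): pulp.LpVariable(f"on_{u}_{t}", cat="Binary")
       for u in units for t in range(len(load))}
gen = {(u, t): pulp.LpVariable(f"gen_{u}_{t}", lowBound=0)
       for u in units for t in range(len(load))}

# Objective: energy cost plus a crude per-hour commitment (fixed) cost
prob += pulp.lpSum(units[u]["cost"] * gen[u, t] + units[u]["fixed"] * on[u, t]
                   for u in units for t in range(len(load)))

for t, d in enumerate(load):
    prob += pulp.lpSum(gen[u, t] for u in units) == d          # load balance
    for u in units:
        prob += gen[u, t] <= units[u]["pmax"] * on[u, t]        # max output if committed
        prob += gen[u, t] >= units[u]["pmin"] * on[u, t]        # min output if committed

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for t in range(len(load)):
    print(t, {u: (int(on[u, t].value()), gen[u, t].value()) for u in units})
```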

So what does all of this mean for power modelers? Professional solvers' ability to continue to improve LP/MIP algorithms' performance will determine whether the most difficult questions can still be addressed and modeled. But, in addition to that, it is crucial that the simulation models that seek to mimic real-world operations with those solvers are able to intelligently implement the fastest possible optimization codes. As EPIS continues to enhance AURORAxmp, we understand that need and spend an enormous amount of time fine-tuning the LP/MIP implementations and seeking new ways to use the solvers to the greatest advantage. Users of AURORAxmp don't need to understand those implementation details—everything from how to keep the LP constraint matrix numerically stable to how to pick between the interior point and dual simplex LP algorithms—but they can have confidence that we are committed to keeping pace with the incredible performance improvements of professional solvers. It is in large part due to that commitment that AURORAxmp has also consistently improved its own simulation run time in significant ways in all major releases of the past three years. With development currently in the works to cut run times in half for the most difficult DC SCOPF simulations, we are confident that this trend will only continue in the coming years with future releases of AURORAxmp. As was said about the projected future development of mixed-integer programs, the performance improvement "shows no signs of stopping."

 


Living in the Past?

Living in the past is not healthy. Is your database up-to-date? EPIS just launched the latest update to the North American Database, version 2016_v4, marking the fourth North American data update this year! Recent changes in the power industry present challenges to database management, which will be discussed in this post.

In general, the transformation in power generation sources in the U.S. coupled with evolving electricity demand and grid management represents a paradigm shift in the power sector. In order to accurately model power prices in the midst of such change, one must have a model built on fundamentals and a database that is up-to-date, has reasonable assumptions, is transparent and is flexible. A recent post described the technical side of working with databases in power modeling. This entry outlines important changes in the East Interconnect, the impacts those changes have on data assumptions and configuration and the steps we are taking to provide excellent databases to our clients.

Recent shifts in power generation sources challenge database assumptions and management. New plant construction and generation in the U.S. are heavily weighted towards renewables, mostly wind and solar, and as a result record generation from renewables has been reported across the East Interconnect. Specifically, on April 6, 2016, the Southwest Power Pool (SPP) set the record for wind penetration:


Figure 1. Record wind penetration levels in Eastern ISOs compared with average penetration in 2015. SPP holds the record which was reported on April 6, 2016. Record sources: NYISO, SPP, MISO, ISO-NE, PJM. 2015 Averages compiled from ISO reports, for example: NYISO, SPP, MISO, ISO-NE, PJM. *Average 2015 generation used to calculate penetration.

Similarly, the New York City area reached a milestone of over 100 MW in installed solar distributed resources. Accompanying the increase in renewables are increases in natural gas generation and reductions in coal generation. In ISO-NE, natural gas production has increased 34 percent and coal has decreased 14 percent since 2000, as highlighted in their 2016 Regional Electricity Outlook. These rapid changes in power generation sources require frequent and rigorous database updates.

Continued electric grid management changes in the East Interconnect also require flexibility in databases. One recent change in grid management was the Integrated System joining the Southwest Power Pool, resulting in Heartland Consumers Power District, Basin Electric Power Cooperative and the Western Area Power Administration's Upper Great Plains Region joining the RTO. The full operational control changed on October 1, 2015, expanding SPP's footprint to 14 states, increasing load by approximately 10 percent and tripling hydro capacity. Grid management change is not new, with the integration of MISO South in 2013 as an example. Changes such as these require flexibility in data configuration that allows for easy restructuring of areas, systems and transmission connections.

Variability in parameters such as fuel prices and demand introduces further difficulty in modeling power markets. The so-called "polar vortex" weather phenomenon shocked Northeastern power markets in the winter of 2013/2014 with cold temperatures and high natural gas prices, resulting in average January 2014 energy prices exceeding $180/MWh in ISO-NE. It seemed like the polar opposite situation occurred this last winter. December 2015 was the mildest since 1960, and together with low natural gas prices, the average wholesale power price hit a 13-year low at $21/MWh. The trend continued into Q1 of 2016:


Figure 2. Monthly average power price in ISO-NE in Q1 2014 and 2016. Variability between years is a result of high natural gas prices and cold weather in 2014 versus low natural gas prices and mild weather in 2016.

Whether due to extreme events, evolving demand or volatile markets, capturing uncertainty in power modeling databases is challenging. In AURORAxmp, users can go one step further by performing risk simulations, specifying parameters such as fuel prices and demand to vary across a range of simulations. This is a very powerful approach to understanding the implications of uncertainty within the input data.

The aforementioned changes in generation, grid management and demand offer exciting new challenges to test power market models and data assumptions. To test our platform, EPIS performs a historical analysis as part of each database release. Inputs of historical demand and fuel prices are used to ensure basic drivers are captured, and model output is evaluated not only in terms of capacity but also monthly generation, fuel usage and power prices. The result of this process is a default database that is accurate, current, contains reasonable assumptions, is transparent and is flexible, ensuring you have the proper starting point for analysis and a springboard for success.

With the release of North_American_DB_2016_v4, EPIS continues to provide clients with superb data for rigorous power modeling. The 2016_v4 update focuses on the East Interconnect and includes updates to demand, fuels, resources, DSM and other miscellaneous items. Clients can log in to our support site now to download the database and full release notes. Other interested parties can contact us for more information.


The Fundamentals of Energy Efficiency and Demand Response

What are Energy Efficiency & Demand Response Programs?

Though the Energy Information Administration states, “there does not seem to be a single commonly-accepted definition of energy efficiency,” efficient energy use, sometimes simply called energy efficiency, refers to the reduction in the amount of energy required to provide the equivalent quality of products and services. Examples include improvements to home insulation, installation of fluorescent lighting & efficient appliances, or improving building design to minimize energy waste.

Demand response, according to the Department of Energy, is defined as, “a tariff or program established to motivate changes in electric use by end-use customers in response to changes in the price of electricity over time, or to give incentive payments designed to induce lower electricity use at times of high market prices or when grid reliability is jeopardized.” Utilities can signal demand reduction to consumers, either through price-based incentives or through explicit requests. Unlike energy efficiency, which reduces energy consumption at all times, demand response programs aim to shift load away from peak hours towards hours where demand is lower.
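The distinction is easy to see on a toy hourly load profile: energy efficiency scales every hour down, while demand response moves energy from the peak hours to off-peak hours without changing the total. The profile and percentages below are invented purely for illustration:

```python
# Illustration of energy efficiency (uniform reduction) vs. demand response
# (shifting energy out of peak hours). The 24-hour profile is made up.
import numpy as np

load = np.array([60, 58, 57, 57, 58, 62, 70, 80, 88, 92, 95, 97,
                 98, 99, 100, 99, 97, 95, 92, 88, 82, 75, 68, 63], dtype=float)  # MW

efficiency = load * 0.95                      # 5% reduction in every hour

dr = load.copy()
peak_hours = np.argsort(load)[-4:]            # four highest-load hours
shifted = dr[peak_hours] * 0.10               # shed 10% of load in those hours
dr[peak_hours] -= shifted
off_peak = np.argsort(load)[:4]               # move that energy to the lowest hours
dr[off_peak] += shifted.sum() / len(off_peak)

print(f"peak:   base {load.max():.0f} MW, EE {efficiency.max():.0f} MW, DR {dr.max():.0f} MW")
print(f"energy: base {load.sum():.0f} MWh, EE {efficiency.sum():.0f} MWh, DR {dr.sum():.0f} MWh")
```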

What are the Benefits of Energy Efficiency & Demand Response Programs?

Decreasing and 'flattening' the demand curve can directly contribute to improved system and grid reliability. This ultimately translates to lower energy costs, resulting in financial savings to consumers, assuming the energy savings are greater than the cost of implementing these programs and policies. In 2010, Dan Delurey, then president of the Demand Response and Smart Grid Coalition, pointed out that the top 100 hours (or just over 1% of the hours in a year) account for 10-20% of total electricity costs in the United States. Slashing energy consumption during these peak hours, or at least shifting demand to off-peak hours, relieves stress on the grid and should make electricity cheaper.

Additionally, decreasing energy consumption directly contributes to the reduction of greenhouse gas emissions. According to the International Energy Agency, improved energy efficiency in buildings, industrial processes and transportation prevented the emission of 10.2 gigatonnes of CO2, helping to minimize global emissions of greenhouse gases.

Lastly, reductions in energy consumption can provide domestic benefits in the forms of avoided energy capital expenditure and increased energy security. The chart below displays the value of avoided imports by country in 2014 due to the investments in energy efficiency since 1990:

Figure 1: Avoided volume and value of imports in 2014 from efficiency investments in IEA countries since 1990. Source

Based on these estimated savings, energy efficiency not only benefits a country’s trade balance, but also reduces their reliance on foreign countries to meet energy needs.

Modeling the Impacts of Energy Efficiency and Demand Response

Using AURORAxmp, we are able to quantify the impact of energy efficiency and demand response programs. In this simple exercise, we compare a California case with 2 GW of energy efficiency and 2 GW of demand response against a case with neither, from 2016 to 2030. The charts below show the average wholesale electricity prices and system production costs:

Figure 2: Average wholesale electricity price ($/MWh) and average system production cost ($000s). Note these are 2014 real dollars.

Holding all else equal, adding demand response and energy efficiency programs to the system decreased average wholesale electricity prices by about $2.88/MWh (5.4%), and the average system production cost fell by $496,000,000 (5.1%). This is a simple example in one part of the country, but one can easily include additional assumptions about the grid, resource characteristics, and load shape as desired.
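As a back-of-the-envelope check on those percentages, the implied base-case values can be recovered directly from the quoted deltas (these are inferred, not taken from the study output):

```python
# Recover the implied base-case values from the reported reductions.
price_drop = 2.88                     # $/MWh decrease with EE + DR
price_pct  = 0.054
base_price = price_drop / price_pct   # ~ $53/MWh implied base-case average price

cost_drop = 496_000_000               # $ decrease in average system production cost
cost_pct  = 0.051
base_cost = cost_drop / cost_pct      # ~ $9.7B implied base-case production cost

print(f"implied base price: ${base_price:.2f}/MWh, implied base cost: ${base_cost/1e9:.1f}B")
```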

Both demand response and energy efficiency programs are intended to be more cost-effective and efficient mechanisms for meeting power needs than adding generation. Emphasis on the demand side can lead to lower system production costs, increased grid reliability, and cheaper electric bills, all of which lie in the best interest of governments, utilities, and consumers.


Working With Data in Power Modeling

How Much Data Are We Talking About?

When planning the deployment of a power modeling and forecasting tool in a corporate environment, one of the most important considerations prior to implementation is the size of the data that will be used. IT personnel want to know how much data they are going to be storing, maintaining, backing up, and archiving so they can plan for the hardware and software resources to handle it. The answer varies widely depending on the types of analysis to be performed. Input databases may be relatively small (e.g. 100 megabytes), or they can be several gigabytes if many assumptions require information to be defined on the hourly or even sub-hourly level. Output databases can be anywhere from a few megabytes to several hundred gigabytes or even terabytes depending on what information needs to be reported and the required granularity of the reports. The data managed and stored by the IT department can quickly add up and become a challenge to maintain.

Here are a couple example scenarios:

A single planning analyst does a one-year hourly run (8,760 hours) with modest reporting, which produces an output database of 40 MB. On average, the analyst runs about six studies per day over 50 weeks, and the total space generated by this analyst is a modest 75 GB. This is totally manageable for an IT department using inexpensive disk space.

Now, let's say there are five analysts, they need more detailed reporting, they are looking at multiple years, and a regulatory agency states that they have to retain all of their data for 10 years. In this scenario, the total data size jumps to 500 MB for a single study. Given the same six studies per day, those analysts would accumulate 3.75 TB of output data in a year, all needing to be backed up and archived for the auditors, which will take a considerable amount of hardware and IT resources.
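The arithmetic behind those scenarios is simple enough to keep in a small helper; adjust the assumptions to match your own team size, study cadence, and retention policy (decimal GB are used here, which is why the single-analyst figure comes out somewhat lower than the rounded estimate above):

```python
# Quick sketch of the storage arithmetic above (decimal GB).
def output_gb_per_year(analysts, studies_per_day, db_size_mb, workdays=250):
    """Rough output-data footprint generated per year, in gigabytes."""
    return analysts * studies_per_day * db_size_mb * workdays / 1000

print(output_gb_per_year(1, 6, 40))     # single analyst, modest reporting   -> ~60 GB
print(output_gb_per_year(5, 6, 500))    # five analysts, detailed reporting  -> ~3,750 GB (3.75 TB)
# A 10-year retention requirement multiplies whatever accumulates each year.
```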

What Are My Database Options?

There are dozens of database management systems available. Many power modeling tools support just one database system natively, so it’s important to know the data limitations of the different modeling tools when selecting one.

Some database systems are file-based. One popular example is SQLite, which is fast, free, and flexible. It is very efficient and fairly easy to work with, but like many other file-based systems it is best suited for individual users. These systems are great options for a single analyst working on a single machine.
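As an illustration of why file-based systems are convenient for a single analyst, results can be pulled straight out of a SQLite file with Python's standard library; the file, table, and column names below are hypothetical, not any tool's actual output schema:

```python
# Minimal sketch of querying a SQLite output file with the standard library.
import sqlite3

with sqlite3.connect("study_output.db") as conn:          # hypothetical file name
    rows = conn.execute(
        "SELECT zone, AVG(price) FROM zone_hourly_prices GROUP BY zone"
    ).fetchall()

for zone, avg_price in rows:
    print(f"{zone}: {avg_price:.2f} $/MWh")
```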

As mentioned earlier, groups of analysts might decide to share a common input database and write simultaneously to many output databases. Typically, this requires a dedicated server to handle all of the interaction between the forecasting systems and the source or destination databases. Microsoft SQL Server is one of the most popular database systems available in corporate environments, and the technical resources for it are usually available in most companies. Once you have your modeling database saved in SQL Server, assuming your modeling tool supports it, you can read from and write to databases simultaneously and share the data with other departments using tools they are already familiar with.

Here is a quick comparison of some of the more popular database systems used in power modeling:

Database System         DB Size Limit (GB)    Supported Hardware          Client/Server   Cost
MySQL                   Unlimited             64-bit or 32-bit            Yes             Free
Oracle                  Unlimited             64-bit or 32-bit            Yes             High
MS SQL Server           536,854,528           64-bit only (as of 2016)    Yes             High
SQLite                  131,072               64-bit or 32-bit            No              Free
XML / Text File         OS file size limit    64-bit or 32-bit            No              Free
MS SQL Server Express   10                    64-bit or 32-bit            Yes             Free
MS Access (JET)*        2                     32-bit only                 No              Low

A Word About MS Access (JET)*

In the past, many Windows desktop applications requiring an inexpensive desktop database system used the MS Access database (more formally known as the Microsoft JET Database Engine). As hardware and operating systems have transitioned to 64-bit architectures, the use of MS Access databases has become less popular due to some of their limitations (2 GB maximum database size, 32,768 objects, etc.), as well as to increasing alternatives. Microsoft has not produced a 64-bit version of JET and does not have plans to do so. There are several other free desktop database engines available that serve the same needs as JET but run natively on 64-bit systems, including Microsoft SQL Server Express, SQLite, and MySQL, which offer many more features.

Which Databases Does AURORAxmp Support?

There are several input and output database options when using AURORAxmp for power modeling. Those options, coupled with some department workflow policies, will go a long way in making sure your data is manageable and organized.

EPIS delivers its native AURORAxmp databases in a SQLite format which we call xmpSQL. No external management tools are required to work with these database files – everything you need is built into AURORAxmp. You can read, write, view, change, query, etc., all within the application. Other users with AURORAxmp can also utilize these database files, but xmpSQL doesn’t really lend itself to a team of users all writing to it at the same time. Additionally, some of our customers have connected departments that would like to use the forecast data outside of the model, and that usually leads them to Microsoft SQL Server.

For groups of analysts collaborating on larger studies, AURORAxmp supports SQL Server databases, although their use isn't required. Rather than use SQL Server as the database standard for AURORAxmp (which might be expensive for some customers), the input databases are delivered in a low-cost format (xmpSQL), but AURORAxmp offers the tools to easily change the format. Once the database is saved in SQL Server, you are using one of the most powerful, scalable, accessible database formats on the planet with AURORAxmp. Some of our customers also use the free version of SQL Server – called SQL Server Express Edition – which works the same way as the full version, but has a database size limit of 10 GB.

Some additional options for output databases within AURORAxmp are:

MySQL: An open-source, free, server-based, multi-user database platform that is only slightly less popular than SQL Server.
XML/Zipped XML: A simple file-based system that makes it easy to import and export data. Many customers like using this database type because the data is easily accessed and is human readable without additional expensive software.
MS Access (JET): The 32-bit version of AURORAxmp will read from and write to MS Access databases. EPIS, however, does not recommend using it given the other database options available and its 2 GB size limitation. MS Access was largely designed to be an inexpensive desktop database system and, given its limitations as previously discussed, we recommend choosing another option such as xmpSQL, SQL Server Express or MySQL, which offer far more features.

Where Do We Go From Here?

AURORAxmp is a fantastic tool for power system modeling and forecasting wholesale power market prices. It has been in the marketplace for over twenty years, and is relied upon by many customers to provide accurate and timely information about the markets they model. However, it really can’t do anything without an input database.

EPIS has a team of market analysts that are dedicated to researching, building, testing, and delivering databases for many national and international power markets. We provide these databases as part of the license for AURORAxmp. We have many customers that use our delivered databases and others who choose to model their own data. Either way, AURORAxmp has the power and the flexibility to utilize input data from many different database types.

If you are just finding AURORAxmp and want to see how all of this works, we have a team here that would love to show you the interface, speed and flexibility of our product. If you are already using our model but would like guidance on which database system is best for your situation, contact our EPIS Support Team and we’ll be glad to discuss it with you.
