In-Home Storage: The Virtual Power Plant


Rapid Growth

Solar and wind are the most popular renewable resources in the world, but because of their intermittent and unpredictable nature, utilities still rely on natural gas and coal. When renewable technologies are combined with energy storage, however, they can smooth out load fluctuations and significantly alter the generation mix.
Total energy storage deployment has increased dramatically in the past few years because of low-carbon, clean energy policies, and is anticipated to grow even more in the near-term. By 2022, GTM Research expects the U.S. energy storage market to reach 2.5 GW annually, with residential opportunities contributing around 800 MW.


Source: GTM Research

How Does It Work?

Energy storage is a three-step process: extracting power from the grid, solar panels, or wind turbines; storing it (the charging phase) during off-peak periods, when power prices are lower; and returning it (the discharging phase) during on-peak periods, when prices are much higher.
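The economics of this charge/discharge cycle can be sketched with a simple price-arbitrage calculation. The hourly prices and battery parameters below are illustrative assumptions, not data from the article:

```python
# Sketch of the charge/discharge arbitrage described above.
# Hypothetical hourly day-ahead prices in $/MWh (24 hours).
prices = [22, 20, 19, 18, 18, 21, 30, 45, 55, 50, 40, 35,
          33, 32, 34, 38, 48, 60, 58, 52, 44, 36, 28, 24]

capacity_mwh = 10    # usable storage capacity (assumed)
efficiency = 0.90    # round-trip efficiency (assumed)

# Charge in the cheapest hour, discharge in the most expensive hour.
charge_price = min(prices)
discharge_price = max(prices)

# Charging must draw extra energy from the grid to cover round-trip losses.
cost_to_charge = capacity_mwh / efficiency * charge_price
revenue = capacity_mwh * discharge_price
profit = revenue - cost_to_charge
print(f"Arbitrage value for one cycle: ${profit:,.0f}")
```

A real dispatch would spread charging and discharging across several hours and respect power (MW) limits as well as energy (MWh) limits, but the off-peak/on-peak spread shown here is the core of the value proposition.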


For electric vehicles (EVs), most charging happens at night and on weekends, when prices are comparatively low and vehicles are largely idle. As EVs continue to enter the mainstream market, they will increase off-peak demand and contribute to load shifting.
Energy storage devices and EVs can complement each other or compete with each other, but energy storage is the key element for EV charging during on-peak hours.

Different Market Players

Residential energy storage has been a holy grail for companies like Tesla, Panasonic, LG, Sunverge Energy, and Orison, with lithium-ion (Li-ion) batteries as the leading technology. Now, with plug-in electric and hybrid vehicles on the rise, automakers Tesla, Nissan, Mercedes-Benz, BMW, Renault, and Audi have also joined the residential market, integrating EV charging stations, battery storage, and rooftop solar so that a residence can, in essence, operate as a virtual power plant.
Beginning in December of last year, Arizona Public Service Company deployed Sunverge Energy’s storage hardware coupled with advanced, intelligent energy management systems that predict future load requirements and solar generation. Tesla is also enjoying significant market share, shown recently by Vermont-based Green Mountain Power’s launch of a comprehensive solution to reduce customer electricity bills using Tesla’s cutting-edge Powerwall 2 and GridLogic software.
A few other utility companies, especially in Florida and California, are also exploring residential energy storage programs, as shown in the figure below.


Source: Hawaii PUC; General Assembly of Maryland

So, what are some other current thoughts about the pros and cons of in-home energy storage?

  • Energy storage reduces load fluctuations by providing localized ramping services for PV and ensuring constant, combined output (PV plus storage).
  • Improves demand response and reduces peak demand.
  • Extra savings for customers through net metering and end-user bill management.
  • Reduces reliance on the grid; customers can also generate and store energy during severe outages.
  • Disposal of Li-ion batteries is not easy, and they are difficult to recycle.
  • Automakers like Nissan and BMW are repurposing second-life EV batteries for home storage, which reduces the durability and reliability of the product.


Concluding Thoughts

Clearly, wider acceptance of energy storage resources would be a game changer in the U.S. power sector. Utilities, consumers, and automakers all stand to profit from the exponential growth of energy storage. With an increasing number of companies applying artificial intelligence and machine learning algorithms to energy management systems, the synergy with energy storage creates a smart, personal power plant with tremendous potential to change the landscape of the energy industry.

Filed under: Clean Power Plan, Hydro Power, Power Grid, Power Market Insights, Power Storage, Renewable Portfolio Standards, Renewable Power, Solar Power, Uncategorized

Artificial Intelligence and the Future of the Power Grid


Artificial intelligence (AI) has become one of the fastest growing tech sectors, with over five billion dollars invested in AI startups.  Despite Elon Musk’s warnings about its dangers, AI is rapidly advancing and is expected to play a major role in our lives in transportation, healthcare, security, and other sectors.  Artificial intelligence—the ability of machines to perform cognitive functions normally associated with the human mind—has seen enormous advances in the past few years due to a type of AI called deep learning.  And the prevalence of artificial intelligence can already be seen in many everyday experiences; for example, when Facebook automatically recognizes faces in uploaded photos or when Apple’s Siri answers your question, AI is at work.

One of the industries where artificial intelligence is making important inroads is the electricity sector.  On the supply side, numerous companies are using AI to improve power production efficiency.  For example, earlier this year, GE announced AI-related technology for wind turbines in Japan expected to increase power output by 5% and lower maintenance costs by 20%.  On the solar front, NEXTracker uses machine learning in its solar trackers, which can increase production by up to 6%.  And AI is not just for renewable resources: Siemens uses artificial intelligence algorithms to improve combustion efficiency, reduce emissions, and lower wear on gas turbines.  UK-based EDF Energy is testing machine learning to predict next-day demand more accurately than human operators, with energy savings of up to 15% in cogeneration plants.  Finally, some coal-fired plants have used AI to increase efficiency and reduce emissions.  For example, Xcel Energy has implemented sophisticated artificial neural networks to make recommendations on how to adjust operations in order to reduce emissions at its Texas coal plants.  Clearly, AI is set to have a significant impact on how power plants operate in the future.

Artificial intelligence also has the potential to make a substantial difference in balancing demand and supply in the electricity sector.  The recent rise of renewable energy, from both power plants and distributed generation, has brought its share of challenges to the power grid for producers, utilities, and consumers.  On the consumer side, forty homes in the town of Reidholz, Switzerland, are piloting a technology called Gridsense that uses AI to improve how power is used within homes, helping ensure that “the power grid is always operating at optimal load” by adjusting customer energy consumption and coordinating with photovoltaic generation in the neighborhood.  On a larger scale, Google’s DeepMind is in discussions with the UK’s National Grid to use artificial intelligence on the power grid to help balance supply and demand.  DeepMind has already used its program at Google’s data centers to cut electricity usage by 15%.

Another area where AI has the potential to make a big difference on the grid is in the control and operation of demand response.  This is where large consumers of electricity are rewarded for decreasing their energy requirements on short notice to help balance the grid, which can be cheaper for operators than turning on very expensive power plants.  Demand response programs have existed for some time now, and improving AI technology may provide significant benefits to consumers hoping to optimize their participation.  As one source states, “Demand management is also seeing an explosion of AI activity with use cases covering areas such as demand response, building energy management systems, overall energy efficiency and DR game theory.”  One company, Upside, is using AI to manage a portfolio of storage assets that provide real-time energy reserves to relieve stress on the grid; it has developed an Advanced Algorithmic Platform that manages demand response across different devices run in parallel.  Another company, Open Energi, uses AI to optimize companies’ assets, saving energy and cutting costs by choosing when to run them based on supply and demand fluctuations in the power grid.

The use of Artificial Intelligence is already at work improving efficiency in the electricity sector for power plants, grid operators, and both large and small consumers.  Whatever lies in the future for the power industry, signs are promising that artificial intelligence will play an essential role in improving the overall efficiency on the grid.

Filed under: Artificial Intelligence, Energy Efficiency, Power Grid

Evolution of the Mexican Energy Market


Mexico’s energy reform began in 2013 with the goals of creating an open, transparent, and competitive market to reduce electricity prices, increase reliability, and meet clean energy goals. To meet these goals, the state-run generation owner, CFE, was split into six companies to ensure fair competition; an independent system operator, CENACE, was formed; and auctions for reliability and clean energy have been conducted. As part of the broader reform, the Energy Transition Law established a requirement for long-term planning (the Program of Development of the National Electric System, or PRODESEN) and directly mandated clean energy goals. The latest PRODESEN, published in June 2017, contains further evidence of the rapid evolution of the Mexican energy market.

Forecasted demand growth in Mexico over the next 15 years has decreased with each of the last three PRODESENs. In the latest PRODESEN (2017), demand growth averages 2.9%, down from 3.4% in the 2016 PRODESEN and 3.6% in the 2015 PRODESEN. Historically, demand growth averaged 2.8% from 2005 to 2015. Despite the reduction in the forecast, Mexico still represents significant demand growth compared to other markets.


Figure 1. Historical and forecasted annual demand growth rates in each PRODESEN. Source

Historical generation is dominated by natural gas and oil, which together account for nearly 68% of all generation and will remain significant sources in the future. According to a recent Mexico Energy Outlook by the International Energy Agency (IEA), Mexico has plenty of domestic fuel production capability, but competitive gas prices in the U.S. have led to rapid growth in gas imports from the U.S. (a 26% average annual increase in recent years), which now account for 40% of demand. Future import growth estimates are supported by the numerous pipeline projects between the U.S. and Mexico outlined in the PRODESEN. This increasing linkage between the U.S. and Mexican markets suggests Henry Hub will be a good predictor of Mexican natural gas prices. That said, Mexico is taking steps to boost domestic production and compete with imports by removing price caps on domestic supply. Forecasting these prices amid market changes such as the removal of price caps will be a major challenge and will have a significant impact on electric market forecasting.


Figure 2. Historical generation by fuel type from 2014 to 2016. Source

Large changes in the fifteen-year forecasted capacity expansions have also occurred since the 2015 PRODESEN. The forecasted amount of combined-cycle additions has decreased substantially, while wind and solar have increased. In particular, solar capacity has increased dramatically, most likely driven by record-low solar prices. Expectations for hydro power have also fallen in each subsequent planning scenario as recent hydro forecasts have softened.


Figure 3. Total projected capacity additions over the fifteen-year PRODESEN forecast. Source

The Mexican energy market has made tremendous strides in a very short time frame, from the enactment of laws in 2013 to long-term auctions spurring substantial investment in clean energy. Capturing the changing natural gas market and renewable energy growth will be important for power market forecasting. This can be accomplished through a combination of scenarios and stochastic simulations using a well-vetted simulation tool and database. If you would like to schedule a demonstration of our Mexico database, please contact

Filed under: Hydro Power, Mexico Power Market, Renewable Power, Solar Power

EMFC Addresses Head-on the Tectonic Industry Changes


With record attendance in one of the most iconic tourist destinations in the world, the 20th Annual Electric Market Forecasting Conference (EMFC) took place September 6-8 in Las Vegas, NV. This industry-leading conference assembled top-notch speakers and gave an exclusive networking experience to attendees from start to finish.

The pre-conference day featured in-depth sessions designed to maximize the value of the Aurora software for its users. Advanced sessions included discussions on resource modeling and improving model productivity, recent database enhancements including the disaggregation of U.S. resources, an update on the nodal capability and data, and other model enhancements.


Michael Soni, Economist, Support | EPIS

Before the afternoon Users’ Group meeting started, EPIS announced that it was dropping “xmp” from the name of its flagship product, now simply Aurora, and unveiled a fresh logo. Ben Thompson, CEO of EPIS, said, “The new logo reflects our core principles of being solid and dependable, of continuously improving speed and performance, and of our commitment to helping our customers be successful well into this more complex future.”

That evening, attendees kicked off the main conference with a night under the stars at Eldorado Canyon for drinks, a BBQ dinner, and a tour of the Techatticup Mine, the oldest, richest, and most famous gold mine in Southern Nevada.


Eldorado Canyon, Techatticup Mine

On Thursday, thought leaders from across the industry presented various perspectives on the complex implications that recent industry changes will have on grid operations, future planning, and investments. The forum session opened with Arne Olson, a partner with E3 Consulting in San Francisco, discussing California’s proposed legislation SB-100, which would mandate that 100% of California’s energy be supplied by renewable sources by 2045, along with the bill’s implications for Western power markets and systems. He pointed out that SB-32, last year’s expansion of earlier legislation, which mandates a 40% reduction in GHG emissions below 1990 levels by 2030, is actually more binding than SB-100. He explained the economics of negative prices and why solar output will be increasingly curtailed, and posited that CAISO’s famous “duck curve” is becoming more of an economic issue than the reliability issue it was originally intended to illustrate.

Other Thursday morning presentations included “The Rise of Utility-Scale Storage: past, present, and future” by Cody Hill, energy storage manager for IPP LS Power, who outlined the advances in utility-scale lithium-ion batteries and their expected contributions to reserves as well as energy; Masood Parvania, Ph.D., professor of electrical and computer engineering at the University of Utah, who described recent advances in continuous-time operation and pricing models that more accurately capture and compensate for the fast-ramping capability of demand response (DR) and energy storage devices; and Mahesh Morjaria, Ph.D., vice president of PV systems for First Solar, who discussed innovations in PV solar module technology, plant capabilities, and integration with storage.


Masood Parvania, Ph.D., Director – Utah Smart Energy Lab | The University of Utah

The afternoon proceeded with Mark Cook, general manager of Hoover Dam, who gave a fascinating glimpse into the operations and improvements of one of the most iconic sources of hydro power in the country; and concluded with Lee Alter, senior resource planning analyst and policy expert for Tucson Electric Power, who shared some of the challenges and lessons learned in integrating renewables at a mid-sized utility.

Networking continued Thursday afternoon with a few of the unique opportunities Las Vegas offers. In smaller groups attendees were able to better connect with each other while enjoying one of three options which included a delicious foodie tour, swinging clubs at TopGolf, or solving a mystery at the Mob Museum.

The final day of the conference was devoted to giving Aurora clients the opportunity to see how their peers are using the software to solve complex power market issues. It featured practical discussions on how to model battery storage, ancillary services, the integration of renewables and an analysis of the impact of clean energy policies all while using Aurora.

The conference adjourned and attendees headed out for a special tour of the Hoover Dam which included a comprehensive view of the massive dam and its operations, and highlighted many of the unique features around the site.


Hoover Dam, Power Plant Tour

The EMFC is a once-a-year opportunity for industry professionals. The 20th Annual EMFC addressed head-on the tectonic industry changes (occurring and expected) from deep renewable penetration, advances in storage technologies, and greater uncertainty. Join EPIS next year for the 21st Annual EMFC!

For more information on the 2017 speakers, please visit
To obtain a copy of any or all of the presentations from this year’s EMFC, Aurora clients can go to EPIS’s Knowledge Base website using their login credentials here. If you do not have login credentials, please email to request copies.

Filed under: Events, Uncategorized

Total Solar Eclipse and Its Impact on Solar Power


On August 21, 2017 a rare total solar eclipse will sweep across the United States, starting in western Oregon and passing southeast across the country to South Carolina. During this time, the sun will appear either partially or completely blocked by the moon, depending on your location. The “Great American Total Solar Eclipse” will be the first total solar eclipse to span across the United States since 1918. This event also marks the first time where the U.S. electric grid will be significantly impacted by a solar eclipse.


Figure 1. The path of August’s total solar eclipse. Source.

The eclipse is expected to cause a major dip in solar production for a period of hours, especially on the west coast. California, for example, expects to lose about 6,000 MW of solar output due to the lack of sunlight, which the California ISO (CAISO) plans to make up with natural gas and hydro generation. A Washington Post article discusses another challenge for CAISO: ensuring the substitute generators can ramp up and down quickly enough to handle the changes in solar generation. For instance, as the moon begins to block the sun, solar energy production is expected to decrease at a rate of 70 MW/minute. Similarly, ramp-up rates of around 90 MW/minute are expected once the sunlight begins to return.
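A quick back-of-the-envelope check ties these numbers together: dividing the cited 6,000 MW loss by the cited ramp rates gives the approximate duration of each transition (this is only a rough sanity check on the article’s figures, not CAISO’s actual analysis):

```python
# Rough timing implied by the CAISO figures cited above.
solar_loss_mw = 6000   # expected California solar output lost
ramp_down = 70         # MW per minute as the moon covers the sun
ramp_up = 90           # MW per minute as sunlight returns

minutes_down = solar_loss_mw / ramp_down
minutes_up = solar_loss_mw / ramp_up
print(f"~{minutes_down:.0f} minutes to shed, ~{minutes_up:.0f} minutes to recover")
```

The result, roughly an hour to an hour and a half in each direction, matches the multi-hour window of grid impact described above.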

This will be the first total solar eclipse visible from any part of the contiguous United States since 1979, long before solar power held any share of market generation. It will also be the first solar eclipse of any kind in the United States since May 2012, and solar has grown at record rates since then. Luckily, Europe witnessed a similar total solar eclipse in March 2015, giving us context for what to expect. Germany, which alone accounts for roughly 40% of European solar capacity, saw solar output drop from 21.7 GW to 6.2 GW during the eclipse. Reuters reported that gas, coal, nuclear, and hydroelectric pumped storage made up the lost generation, and that overall, Europe successfully weathered a 17 GW reduction in solar power through proper planning ahead of time.

Back in the U.S., solar power accounted for 9% of California’s generation in 2016, and the state is home to nearly half of the nation’s total solar capacity.  On August 21, California is expected to lose 50 to 75% of its solar production over roughly five hours. We will then see, for the first time, how the United States electric grid as a whole adapts to a significant dip in solar energy caused by a natural phenomenon.

Filed under: Clean Power Plan, Hydro Power, Renewable Power, Solar Power

How Good is the EIA at Predicting Henry Hub?


Natural gas power plants are a key component of bulk electrical systems in North America. In the U.S., natural gas plants made up the largest portion of installed capacity (42%) as of December 2016 and contributed more to generation than any other source. In Mexico, natural gas plants supplied 54% of the required electricity in 2015 and are a key component of the capacity additions planned for the national electrical system. Natural gas is also likely to remain the primary energy source in the U.S. due to increased regulation of coal units, uncertainty around the future of nuclear generation, and low natural gas prices.

Natural gas prices are a critical driver of electricity prices and a key input variable in electric power models. Because of the large number of natural gas power plants in North America, and because fuel costs are the largest cost component of a thermal power plant, wholesale electricity prices are tightly coupled with natural gas prices. There is also an important feedback loop: natural gas demand, and hence price, is tightly coupled to the operation of natural gas power plants. Understanding the interplay between gas and power markets, and the uncertainties in forecasts, is critical for forecasting either.

The U.S. Energy Information Administration (EIA) provides short-term natural gas price forecasts through the Short-Term Energy Outlook (STEO) and long-term forecasts through the Annual Energy Outlook (AEO). For the purposes of this article, we will focus on the STEO. The STEO is a monthly report that includes, among other items, a natural gas consumption and price forecast extending 13 to 24 months into the future, depending on the month published. The model predicts consumption and prices for three sectors (commercial, industrial, and residential) in the nine U.S. census divisions. To do this, the model calculates natural gas consumption and supply levels to build an inventory. Prices are derived from a regression equation using the inventory and heating and cooling degree days, and analysts then make adjustments for final prices. Detailed information on each equation and method is provided in the EIA’s Natural Gas Consumption and Prices documentation.
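The general shape of such a regression can be sketched as follows. The data, coefficients, and functional form below are purely illustrative, not the EIA’s actual equations, which differ by sector and census division:

```python
import numpy as np

# Illustrative monthly data (NOT from the EIA model): storage inventory
# deviation from the 5-year average (Bcf), heating-degree days, and the
# resulting Henry Hub price ($/mmbtu).
inventory = np.array([-300.0, -150.0, 0.0, 150.0, 300.0, 450.0])
hdd       = np.array([900.0, 700.0, 400.0, 200.0, 100.0, 50.0])
price     = np.array([4.8, 4.1, 3.3, 2.9, 2.4, 2.1])

# Regress price on inventory and degree days, in the spirit of the
# STEO description above: price ~ b0 + b1*inventory + b2*hdd.
X = np.column_stack([np.ones_like(inventory), inventory, hdd])
coef, *_ = np.linalg.lstsq(X, price, rcond=None)

fitted = X @ coef
print("fitted prices:", np.round(fitted, 2))
```

In the actual STEO process, a fitted equation like this produces a first-pass price, which analysts then adjust by hand, so the regression output is a starting point rather than the published forecast.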

How good is the EIA at forecasting natural gas prices from a month to a year out?

To evaluate the STEO’s natural gas price forecasts, we downloaded each monthly STEO report from January 2012 to December 2016, allowing at least a full year of comparison against historical prices. This period was selected because it is representative of the current trend of low natural gas prices relative to history. The mean absolute error (MAE) and mean absolute percent error (MAPE) were calculated for each forecasted value. Prices were then evaluated for the first forecast in each year and for a subset of forecasts from consecutive months during a price spike. The mean absolute percent error was also evaluated for each report year and across all reports.
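The two error metrics are straightforward to compute. The forecast and actual prices below are hypothetical, chosen only to show the calculation:

```python
# Hypothetical actual vs. forecast Henry Hub prices ($/mmbtu).
actual   = [3.10, 3.25, 3.40, 4.10, 6.19, 4.50]
forecast = [3.00, 3.30, 3.60, 3.80, 4.20, 4.40]

n = len(actual)
# MAE: average magnitude of the error, in price units.
mae = sum(abs(a - f) for a, f in zip(actual, forecast)) / n
# MAPE: average error as a fraction of the actual price, in percent.
mape = sum(abs(a - f) / a for a, f in zip(actual, forecast)) / n * 100

print(f"MAE = ${mae:.2f}/mmbtu, MAPE = {mape:.1f}%")
```

Note how a single badly missed spike (the fifth month here) dominates both metrics, which is exactly the behavior discussed below for the weather-driven price spikes the STEO failed to anticipate.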

For the period analyzed (2012 to 2016, shown in the shaded region below), the wholesale Henry Hub gas price averaged $3.30/mmbtu, with a high of $6.19/mmbtu in early 2014 due to extreme Northeast weather (i.e., the polar vortex) and a low of $1.78/mmbtu in late 2016 due to warm weather and a large amount of storage. This period is representative of relatively low natural gas prices compared to the previous five-year period, when prices exceeded $10/mmbtu, driven by high oil prices, and averaged $5.63/mmbtu despite the sharp decline during the 2008-2009 financial crisis.


Figure 1. Historical Henry Hub natural gas prices. The yellow period denotes the study period used for this analysis. Source: EIA.

We started by looking at the longest-term forecasts (24 months) that are delivered in January of each year, and saw an inability to capture rapid fluctuations in prices in the study period:


Figure 2. Historical Henry Hub gas prices with 24 month forecasts from the January STEO of each year starting in 2012 and ending in 2015 using the base case data. Source: EIA STEO.

The January 2012 forecast missed the sharp reduction in prices from winter to summer driven by high storage volumes. Less volatility occurred over the first part of the January 2013 forecast; however, this forecast missed the large increase in prices to over $6/mmbtu driven by extreme weather conditions. The January 2014 forecast also missed the weather-driven price spike and was then high-biased in the later months. The January 2015 forecast was high-biased for the entire forecast period and missed the lower prices driven by a combination of mild weather and high storage volumes.
The STEO forecast is very sensitive to the initial conditions, or starting month’s price. For example, plotting each month’s forecast during the increase from $3.74/mmbtu in November 2013 to $6.19/mmbtu in February 2014 shows the impact of the rapidly changing initial condition (the last known price) on the first forecasted month’s value:


Figure 3. Historical Henry Hub gas prices with forecasted values from the months leading up to the rapid price spike in February 2014.

Presumably the long-term fundamental drivers of the STEO do not change as much as the initial conditions, and thus the longer-term forecast is much less sensitive to initial conditions.
Despite missing these fluctuation events, on average across the years analyzed the STEO is within 8% of the actual price in the first forecasted month, within 25% out to eight months, and within 33% out to 13 months:


Figure 4. Mean absolute percent error calculated for each forecasted month of STEO reports. Data are averaged over a report year, as well as over all of the report years. Maximum and minimum percent error is calculated over all STEO reports.

On average, error increases with forecast length; however, this does not occur in the 13-month 2012 or 2013 STEOs. The expected error growth with time does appear in the 2014 and 2015 STEOs, reaching nearly 60% in the 2014 STEO. The maximum percent error in any given forecast grows rapidly from 26% in the first forecasted month to 75% in the fourth, and exceeds 100% at 12 and 13 months out.
In absolute terms, the error ranges on average from $0.25/mmbtu in the first forecasted month to $0.88/mmbtu 13 months out. Maximum and minimum errors range from less than a penny up to $2.45/mmbtu.


Figure 5. Mean absolute error calculated for each forecasted month of STEO reports. Data are averaged over a report year, as well as over all of the report years. Maximum and minimum absolute error is calculated over all STEO reports.

Is the STEO forecast good enough? As with many questions, it depends. What matters most is understanding the limitations and uncertainties in the gas forecasts. If relying on EIA forecasts, you must account for the sensitivity to initial conditions and the typical error growth over the first months to a year of the forecast. With this information, sensitivity studies can be formulated to capture possible fluctuations in gas prices. Taken together with other uncertainties, such as demand, transmission outages, and plant outages, you can begin to form an ensemble of forecasts.

Filed under: Natural Gas, Uncategorized

EPIS Releases New Version of AURORA


Version 12.3 introduces significant enhancements

Las Vegas, Nevada – April 25, 2017 — EPIS (, the market leader in power market
simulation, forecasting and analysis, announced the release of version 12.3 of its AURORA
software at the Platts Global Power Markets™ Conference. The latest version boasts a number
of enhancements to storage logic, ancillary services, long-term logic, improved RPS modeling
and nodal capabilities.

AURORA 12.3 further solidifies its position as the most valuable power market forecasting
and analysis software on the market today. It is fast, easy to use, and transparent. Upgrades in
the new version include:

  • Enhanced Storage Logic—improved ability to model the intricacies of renewable and
    storage integration, electric vehicles, and other technologies.
  • Ancillary Services Enhancements—significant enhancements, including sub-hourly
    dispatch and use in nodal studies, and improved MW reporting for simultaneous
    contributions to multiple products
  • Improved RPS Modeling—offers a new option to identify resources not eligible to set
    capacity prices, especially useful when modeling RPS policies where renewable
    resources must be built but cannot participate in capacity markets. Also, RPS constraints
    can now be input as a percentage of demand or a MWh value, giving more flexibility in
    specifying RPS targets over time.
  • Long-Term (LT) Capacity Expansion Logic Enhancements—now have the option to
    change dispatch-hour sampling dynamically—accelerating studies, but still providing
    detail on final production run.
  • New LT Constraint Types—including capacity and energy max limits, which provide
    more flexibility for build decisions to targets in LT studies.
  • New LT Reporting Option—a new build report output table makes it easy to quickly
    see which constraints were binding (min, max by technology/fuel/area).
  • Nodal SCUC—version 12.3 also includes an exciting new option to run a full
    security-constrained unit commitment (SCUC). The mixed-integer program that performs
    the commitment decisions now accounts for nodal constraints, including branch, corridor,
    and contingency constraints. The new SCUC capability comes in addition to a new,
    proprietary solving method that significantly speeds nodal analysis.

AURORA v.12.3 is further enhanced by the proven and calibrated databases that either
come with the license or as an add-on, including: U.S.-Canada, Europe or Mexico. The calibrated
datasets simplify meaningful forecasting. All AURORA databases include a base-case 25-year
power price forecast and generator capacity expansion and retirement plan. The sources and
procedures used to update the data are thoroughly documented. Updates to the databases are
provided under the annual AURORA license.

For the past 20 years, AURORA has had a reputation for being best-in-class, with unmatched
support. Version 12.3 further establishes its position as the leader in power market forecasting
and analysis.

Filed under: Events

Data: Timing is Everything


Staying ahead of the curve by staying on top of industry data

Keeping data current and applicable to your modeling needs is not a simple task. It is a familiar refrain in the power industry: as soon as you finish inputting data, another update is needed. Much of this is because today’s markets are more transparent than in previous years, with more data available than ever before. Deregulation has played a large role in this transformation; its need for open markets and transparent pricing brought the introduction of a slew of new market products.

When deregulation began decades ago, what are now fundamental market drivers (e.g., sub-hourly markets, capacity auctions, demand response, and energy efficiency) were unheard of.  The rise in available market data can be attributed partly to deregulation and partly to the evolution of technology and the markets. Couple this with increases in computing speed, advances in server technology, and society’s current “instant gratification” attitude, and you have an industry that demands the right data right now. The growth of available data inputs has created a need for checks and balances and for transparency into the underlying fundamentals. There are many more moving parts in today’s power industry, culminating in where we are today: professionals with an enormous amount of data to keep up with and incorporate into simulation models.

To help you integrate posted data in a timely fashion, EPIS has summarized some of the major data release dates across the U.S. which, taken together, can inform your annual planning. The data releases below are grouped by subject type and then color coded by region. Depending on your modeling needs (large region, day-ahead, capacity expansion, nodal, etc.), you will care about different data releases. In every case, however, knowing when the data becomes available is a significant part of the process.



Figure 1: Some of the key market data releases and the time frame they are typically available

An Excel version of this information is also available for download from our website. Filtering by region gives a clearer picture of data availability and lets you form regional timelines for your own updates based on the available data.
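The same region-by-region filtering can be done programmatically. Here is a minimal sketch in Python with pandas; the column names and rows are illustrative placeholders, not the contents of the actual EPIS spreadsheet:

```python
import pandas as pd

# Hypothetical rows mirroring the structure of a data-release schedule;
# regions, subjects, and months below are illustrative only.
schedule = pd.DataFrame([
    {"Region": "WECC",  "Subject": "Load Forecast",    "TypicalRelease": "January"},
    {"Region": "WECC",  "Subject": "Resource Plan",    "TypicalRelease": "March"},
    {"Region": "PJM",   "Subject": "Capacity Auction", "TypicalRelease": "May"},
    {"Region": "ERCOT", "Subject": "CDR Report",       "TypicalRelease": "May"},
])

def regional_timeline(df, region):
    """Return the data releases for one region, in the order listed."""
    return df[df["Region"] == region][["Subject", "TypicalRelease"]]

print(regional_timeline(schedule, "WECC"))
```

In practice you would replace the inline rows with `pd.read_excel(...)` pointed at the downloaded file, then build one timeline per region you model.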

In today’s transparent power markets, staying current can be a difficult task. Knowing when the data is available is an important first step to planning your update schedules in order to most effectively forecast power markets.

Filed under: Data Management

Top 10 Pieces of Advice from AURORAxmp Support Experts


Recently, Power Market Insights asked EPIS’s support team to share their top tips for making the most of AURORAxmp. EPIS is known for having best-in-class support and the experts on the team had some very useful advice to share.

Using Both AURORAxmp’s Help Feature and the Website’s Knowledge Base

  1. Take advantage of context-sensitive Help. One very useful feature, especially when first learning the model, is the context-sensitive Help. You can always learn more about a specific form, column, or table in the model by selecting it and pressing F1. The Help document contains a wealth of information about all aspects of the model and how it works, making it a valuable reference for everyone from beginners to experts.
  2. Utilize the online Knowledge Base. Although Help is an excellent way to familiarize yourself with the nuts and bolts of AURORAxmp, the online Knowledge Base on the Support website contains a catalog of presentations that can help you learn the model faster. You can find presentations given at past conferences, like our annual Electric Market Forecasting Conference or our Spring Group Trainings, not only from EPIS employees but also from other AURORAxmp users. Many of the presentations give step-by-step examples of how to set up different inputs in the model. Using the Knowledge Base alongside the Help document is a great way to thoroughly understand specific areas of AURORAxmp.

Working with AURORAxmp Inputs and Outputs

  1. Be judicious with output reporting. Output databases can grow quickly, which can also increase runtimes. Limit reporting to just the data you need by using the Report column, available in most input tables. By setting the Report column to TRUE and de-selecting the All Items box on the Run Setup > Reporting form for that output table, you can limit output to just the items you are most interested in. Couple this with the Custom Columns feature, which reports only the columns you need, and you'll have a perfectly tailored output database.
  2. Take advantage of the dbCompare tool. You can compare either input or output databases and then save the results to Excel. In both cases, you keep a permanent record of the differences without having to review multiple change sets or manually compare outputs.
  3. Avoid errors due to improper permissions. Checking that your folder permissions are correct will save time in the long run. In some IT environments with enhanced user-access security, your IT team may need to grant you additional rights to certain folders on the system. Contact our Support Team to find out which folders need read and write access.
  4. Test changes in small batches. Take a look at the Data Management article in the Knowledge Base. When planning a large set of changes, it is wise to test them in small batches. Specifically, perform a short AURORAxmp run after each batch to ensure the data was entered properly and flows through the model as intended. It is simple to set your period/hours to something very short and direct output to a temporary database. This practice alone can save significant time and effort in tracing troublesome input data.

Managing, Saving, Authenticating

  1. Use Tab My Forms. Many people are unaware of the Tab My Forms option, which can help organize multiple AURORAxmp windows on screen. It can be found under Tools > Options > User Preferences. Along the same lines, if you right-click on a tab, you can select Close All But This to clean up your screen when you have too many tabs open.
  2. Create an archive of your project. If you think you will need to replicate results in the future, create an archive. Archives package all the file components of your project into a single .zip file that can easily be transferred to colleagues or stored for a project you may need to revisit. An opened archive contains everything needed to replicate the output: the same database, change sets, and project settings. Once unarchived, you simply hit Run to replicate the output. This comes in handy if you are asked to replicate output or verify input parameters and run settings.
  3. Know your SQL Server authentication options. AURORAxmp supports two methods of authenticating with your SQL Server: Windows Active Directory-based and SQL Server-based. Windows Active Directory is typically used when individual users are writing output that doesn't need to be accessed or modified by other users in the organization. SQL Server authentication is best when output files will be shared by multiple users; in this case, some organizations prefer a single, common SQL Server username for multiple users to share.
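In general, the two SQL Server authentication modes differ only in the connection string the client supplies. A hedged sketch in Python (the driver name, server, database, and credentials below are generic ODBC placeholders, not AURORAxmp-specific settings):

```python
# Build connection strings for the two standard SQL Server authentication
# modes. All server, database, and credential values are placeholders.

def windows_auth_conn(server, database):
    """Windows Active Directory authentication: the OS identity is used,
    so no username or password is stored in the string."""
    return (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        f"SERVER={server};DATABASE={database};"
        "Trusted_Connection=yes;"
    )

def sql_auth_conn(server, database, user, password):
    """SQL Server authentication: an explicit (often shared) login,
    suited to output databases accessed by multiple users."""
    return (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        f"SERVER={server};DATABASE={database};"
        f"UID={user};PWD={password};"
    )

# Either string could then be passed to a library such as pyodbc, e.g.
# pyodbc.connect(windows_auth_conn("myserver", "AuroraOutput"))
print(windows_auth_conn("myserver", "AuroraOutput"))
```

The trade-off mirrors the advice above: the Windows-auth string carries no credentials, while the SQL-auth string embeds a login that several users can share.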


Choosing Hardware

  1. Understand which computer hardware is best. Considering new hardware? AURORAxmp runs best on physical hardware with fast RAM, a fast CPU, and speedy disks. Low-latency RAM and a good memory controller seem to have the greatest impact on runtime, followed closely by a fast CPU. While AURORAxmp takes advantage of threading in a variety of places, for a single case, single-threaded CPUs with a high clock speed seem to perform best. The fastest AURORAxmp runtimes have been observed on overclocked physical hardware with low-latency RAM.

Of course, the support experts at EPIS can help with any questions or issues you may have. Next time you talk to one of them, be sure to ask about tips and tricks for maximizing the power of AURORAxmp.

What’s your favorite trick or tip? Share it in the comments section.

Filed under: Support

Power Market Insights Finishes Strong in 2016


2017 promises to be an even better year of delivering valuable market insight and expertise

The EPIS blog, Power Market Insights, is nearly one year old, and in that time it has published a great deal of practical information. The articles, authored by EPIS domain experts, were carefully researched and delivered valuable intelligence to the industry.

For example, an article on large-scale battery storage discussed technology issues and advances that affect the rapidly growing wind and solar market. The article quotes analyst predictions that battery storage costs will drop to $230/kWh by 2020, with an eventual drop to $150/kWh, and goes on to state that worldwide battery storage may grow to almost 14 GW by 2023.

Power Market Insights delivered a perspective on the new electric market in Mexico, weeks after that country’s most recent industry reforms were launched. The article reported the fundamental shift in the market and outlined how these reforms would “modernize a constrained and aging system, improve reliability, increase development of renewable generation and drive new investment.” The author discussed the role of zonal resource planning analysis and the importance of data availability. Months later, EPIS announced its Mexico Database for use with AURORAxmp.

Data played a large role in articles on European power market reporting changes and on the EIA's easing of data accessibility. Both articles rely on the expertise of EPIS's Market Research team. The EIA article discussed how improvements to the management and delivery of the agency's datasets expand the list of tasks for which EIA data may be useful. For the many power modelers who were unaware of these changes, this information provides insight that can make their jobs easier. Likewise, the discussion of European power market reporting changes informed readers of ways the available data, while improved, may differ among sources, and offered an example of the importance of cross-checking sources.

Two articles lifted the hood to give readers a peek into the workings of algorithms and computing speed. The article on the algorithms at the core of power market modeling offered a foundational overview of the mathematical optimizations used in forecasting and analyzing power markets. The computing-speed article explained Moore's Law, discussed how maxed-out processors are shifting the focus to more cores, and noted that software architecture will soon lose its "free ride." All of this was put into the perspective of computations like hourly dispatch and commitment decisions. Both articles equip readers to intelligently discuss the computing parameters that affect their daily work.

Articles delved into industry issues such as the water-energy nexus, nuclear retirements, the hydropower comeback in the California market, uncertainty in ERCOT markets, and, in several pieces, the CPP. The writers lent their considerable expertise to these articles; for example, the author of the CPP articles read the entire 304-page filing in the Federal Register before distilling it down for readers to quickly digest.

A number of articles discussed issues faced by modelers as they work to forecast and analyze the market. Pieces on integrated modeling of natural gas and power, working with data in power modeling, the fundamentals of energy efficiency and demand response, and reserve margins offered real-world discussions designed to help AURORAxmp users and other industry professionals do their jobs better.

The blog's 2017 editorial calendar is being finalized right now, and Power Market Insights will continue to publish high-quality articles of interest to energy and power market professionals. Look for feature editorials next year written by leading analysts and experts in the industry at large. Put Power Market Insights on your must-read list.

Filed under: Power Market Insights