EMFC Delivers Practical and Strategic Insight

The conference was held at the Atlanta Evergreen Marriott Conference Resort, September 14-16, 2016

(Photos courtesy of Sahabia Ahmed, Entergy; Cameron Porter, Robin Hood Studios; and EPIS employees.)

 

EPIS headed south to Georgia's beautiful Stone Mountain to host the premier Electric Market Forecasting Conference (EMFC) for the 19th consecutive year, Sept. 14-16. The EMFC featured a stellar lineup of speakers and activities built around the theme of Expanding Perspectives on the Future of Energy Markets, and it provided a unique networking opportunity for the more than 75 industry experts and professionals in attendance.

The conference kicked off with a fun and relaxing evening at Stone Mountain Park's Memorial Hall, with an impressive view of the mountain and a one-of-a-kind Mountainvision® laser show complete with fireworks and musical scores.

Thursday's speakers focused on industry-wide issues, opening with Jeff Burleson, vice president of system planning at Southern Company, who said that utilities can no longer ignore what happens on the customer side of the meter. Burleson went on to state that past planning focused on wholesale generation and transmission, but going forward, utilities will need to consider how customers are shaping and changing their load with new technologies.

Other Thursday morning presentations included “Outlook on Opportunities in Renewable Development” from Mark Herrmann, vice president, structuring of NRG Energy; “Market Evolution for Renewable Integration” from Todd Levin, Ph.D., energy systems engineer of Argonne National Laboratory; and “Advances and Opportunities for Internal Combustion Engine Power Plants” from Joe Ferrari, market development analyst of Wärtsilä North America.

The afternoon proceeded with Lakshmi Alagappan, director, and Jack Moore, director, market analysis, of Energy and Environmental Economics (E3), who presented "California Clean Energy Policy: Implications for Western Markets." In the session, Alagappan stated that as California's aggressive RPS comes to fruition, the Energy Imbalance Market (EIM) may help alleviate some over-generation by reducing thermal dispatch across the Western Interconnection to make room for cheap exports flowing out of California. Alagappan went on to say that over-generation is not an abstract concern: roughly 10 percent of dispatch hours in CAISO this year have already resulted in zero or negative prices.

Following Alagappan and Moore, Larry Kellerman, managing partner of Twenty First Century Utilities, wrapped up Thursday's sessions by proposing a new paradigm for utilities, one that would allow these organizations to take advantage of their low cost of capital and play a role in developments on the customer side of the meter. Strategies included personalized rate structures and curated services and technologies. Kellerman said, "We talk about energy efficiency as a resource, but energy efficiency is only a resource when you can deploy the capital and make the investment."

Networking continued outside the conference room during Thursday's afternoon activities. Some attendees took in the scenic views of Stone Mountain on a championship golf course, while others enjoyed a breezy cruise on beautiful Stone Mountain Lake in a 1940s-era Army DUKW, followed by a guided tour of Stone Mountain's Historic Square highlighting early Georgia life.

During Friday's Electric Market Forum, speakers and expert users of AURORAxmp showcased effective examples of how to enhance modeling endeavors. The morning began with Morris Greenberg, managing director of gas and power modeling at PIRA Energy Group. Focusing on "Integrating Natural Gas and Power Modeling," Greenberg said that the electric sector is one of the most price-elastic categories of natural gas demand, and that combining gas and power models can capture the feedback loops between the two markets. Greenberg continued by saying that as the electric sector's share of total gas consumption increases, its behavior will have a larger and larger impact on gas prices.

Switching gears, Eina Ooka, senior structure and pricing analyst at The Energy Authority, gave a well-received presentation on "Discovering Insights from Outputs – Exploratory Visualization and Reporting Through R." Ooka said, "Interfacing AURORAxmp with other tools, such as R, allows users to quickly and effectively perform detailed analysis by automating almost all stages of the process." Ooka concluded with a detailed discussion and demonstration of visualizing data to make complex information easily digestible.

Additional Friday presentations included "Investing in Mexico Gas and Power" from Brett Blankenship, research director, Americas primary fuel fundamentals, at Wood Mackenzie, and "Challenges of Forecasting Reliability Prices – Capacity Price in PJM & ORDC in ERCOT" from Joo Hyun Jin, commercial analysis, E.ON Climate & Renewables North America.

The EMFC is a once-a-year opportunity for industry professionals. Attendees of the 19th Annual EMFC gained new connections and an enriched market perspective.  As one attendee put it, “I really enjoyed the [presentations]… it was great to have exposure to such a wide range of topics from such qualified speakers. Congrats for doing such a great job with conference planning and execution.” Join EPIS next year for the 20th anniversary in Las Vegas!

For more information on this year’s speakers, please visit http://epis.com/events/2016-emfc/speakers.html

To obtain a copy of any or all of the presentations from this year’s EMFC, please go to EPIS’s Knowledge Base using your login credentials here. If you do not have login credentials, please email info@epis.com to request copies.

Filed under: Events

New Developments in Computing Speed

Moore’s Law Explained

Since the beginning of modern computing in the middle of the last century, processing speed and power have grown at an amazing rate.  Intel co-founder Gordon Moore predicted that the number of transistors in computer processors would double every two years.  Over the last half-century this hypothesis, known as Moore's Law, has proven remarkably accurate.  Due to continuous innovation in the industry, clock speeds in computer chips have improved at a dramatic rate since the early 1970s. If airplane travel times had improved at the same rate over the same period, we would be able to get anywhere in the world in a matter of seconds (and it would cost pennies).
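To put rough numbers on that analogy (a back-of-the-envelope illustration only, assuming one doubling every two years from 1971 to 2016; both dates are chosen just for the sketch), the cumulative improvement factor works out to about 2^22.5, or several million-fold:

```python
# Back-of-the-envelope Moore's Law scaling (assumed doubling every two
# years between 1971 and 2016; both dates are illustrative).
years = 2016 - 1971
doublings = years / 2            # about 22.5 doublings
factor = 2 ** doublings          # roughly a 5.9 million-fold improvement

flight_hours = 10                # a hypothetical long-haul flight
scaled_time_s = flight_hours * 3600 / factor
print(f"Improvement factor: {factor:,.0f}x")
print(f"A {flight_hours}-hour flight at that rate: {scaled_time_s:.4f} seconds")
```

At that rate a ten-hour flight would take well under a second, which is the spirit of the comparison above.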

One of the great advantages of this improvement in CPU performance over the years has been the fact that every piece of software benefits automatically from a higher processor clock speed.  A computer with a faster clock speed can run the exact same program more quickly with no code changes to the software required.

Maxed Out Processors Shift Focus to More Cores

But the story of computing power has started to change.  Over the last decade, the clock speed of computer processors has begun to top out.  Starting in 2003, processor makers like Intel and AMD moved away from pushing clock speeds ever higher and shifted their efforts toward increasing the number of processor cores in their chips.

The following graph shows the relationship between transistor density (in red), processor clock speed (or frequency, in green) and the number of processor cores (in black) over time.  The slowdown in clock speed gains is clearly visible, as is the shift toward adding more cores in the processors produced over the last ten years.


Figure 1: Computing speed developments.  Source

Software Architecture’s Free Ride Ending

These additional cores allow modern processors to perform more tasks simultaneously.  Today's consumer PCs generally have processors that contain between two and eight cores, while some server processors have as many as 22 cores on a single chip.  However, unlike a clock speed boost, the performance improvements that come with multiple processor cores don't come for free.  Software has to be significantly re-architected to take advantage of all those cores.  In order for software to run on more than one core at a time, it must be broken down into tasks that can be run simultaneously on the various cores that are available.  This can only be done when a particular task doesn't require the result of a previous task as input.  Additionally, software must be designed so that shared resources, such as databases and hardware devices, can be properly accessed by multiple tasks that are running at the same time.

This has specific application to power market modeling software, such as AURORAxmp, that simulates the commitment and dispatch of power plants.  Suppose, for example, that we want to model one full year of 8,760 dispatch hours using multiple processors, and assume that we know the hourly load, generator availability, fuel prices, transmission capability, etc. for every hour.  If we had at least 12 available cores to work with, we might break the run into 12 simultaneous simulations that each cover one month of the year.  We could even write all output data to one database that allows concurrent access, such as SQL Server, and the total time to run the 12 months would approach 1/12th the time required to run the full year on one core (though in reality it would not be quite that good because of the overhead of managing all the cores).
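A minimal sketch of that decomposition in Python (run_month is a hypothetical placeholder for a single-month simulation, not AURORAxmp's actual interface):

```python
from concurrent.futures import ProcessPoolExecutor

def run_month(month):
    """Placeholder worker: simulate commitment and dispatch for one month
    and write the results to a database that supports concurrent access."""
    # ... load hourly inputs for `month`, solve, write outputs ...
    return f"month {month} complete"

if __name__ == "__main__":
    # One simulation per month, spread across up to 12 worker processes.
    with ProcessPoolExecutor(max_workers=12) as pool:
        print(list(pool.map(run_month, range(1, 13))))
```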

So what’s the problem?  The hourly dispatch and commitment decisions in the different months are not independent.  Because of constraints that tie one hour’s solution to the next—such as generator minimum up and minimum down times, ramp rates, storage contents, hydro reservoir levels, annual emission limits, etc.—the simulation needs to know what happened in the previous hours to properly model the current hour.  The simplifying assumption that the operations of the power plants in each month are independent might be acceptable in some types of scenarios, but for a precise solution we simply can’t solve one hour until we know the solution from the previous hours.
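A toy example of that coupling (hypothetical numbers): a storage resource's state of charge at the end of one hour is the starting point for the next, so the first hour of February cannot be dispatched correctly without knowing the last hour of January.

```python
# Toy illustration of hour-to-hour coupling through storage state of charge.
capacity_mwh = 100.0
soc = 40.0   # MWh carried over from the previous month's final hour

for hour, net_charge_mw in enumerate([10, -25, 5, -15]):   # charge(+), discharge(-)
    # What the unit can do this hour depends on the previous hour's state.
    soc = min(max(soc + net_charge_mw, 0.0), capacity_mwh)
    print(f"hour {hour}: state of charge = {soc:.1f} MWh")
```

Dropping that carried-over value at every month boundary is exactly the kind of simplifying assumption described above.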

Utilizing Multicore Advancement

But that doesn't mean that there aren't still great gains to be had in power market modeling software with multicore processors.  Certainly there is much processing of input and output data for this type of model that, if built properly, can take advantage of multiple processors.

For example, the standard DC power flow approximation using shift factors (multipliers used to calculate and constrain power flows) can require an enormous amount of computation.  A large system such as the Eastern Interconnection may well have over one billion non-zero factors that must be used in each hour's simulation to calculate power flow between buses.  Intelligently using multiple processors to calculate those flows can drastically reduce the run time of these types of simulations.
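As an illustration of the computation being parallelized (the dimensions and chunking below are invented for the sketch and are not AURORAxmp's implementation), the hourly branch flows are essentially a large sparse matrix-vector product that can be split across cores:

```python
import numpy as np
from scipy import sparse
from concurrent.futures import ThreadPoolExecutor

n_branches, n_buses = 20_000, 25_000          # illustrative system size
shift_factors = sparse.random(n_branches, n_buses, density=0.001, format="csr")
injections = np.random.randn(n_buses)         # hypothetical hourly net injections (MW)

def flows_for(rows):
    """Branch flows for a slice of branches: F = SF * P."""
    return shift_factors[rows] @ injections

# Split the branch list into chunks and compute the flows concurrently.
chunks = np.array_split(np.arange(n_branches), 8)
with ThreadPoolExecutor(max_workers=8) as pool:
    flows = np.concatenate(list(pool.map(flows_for, chunks)))
print(flows.shape)
```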

Another place where utilizing multiple cores will help in this kind of software is in the mathematical solvers that perform the core optimizations.  Those solvers (such as Gurobi, CPLEX, and MOSEK) continue to improve their internal use of threading in their LP (linear programming) and MIP (mixed-integer programming) algorithms.  As they continue to get better at exploiting multiple processors, the power market models that use them will be significant beneficiaries.
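For instance, most of these solvers expose an explicit thread-count setting. A minimal sketch using Gurobi's Python interface and its documented Threads parameter (the tiny model here is just a placeholder, not a real dispatch problem):

```python
import gurobipy as gp

model = gp.Model("dispatch_sketch")
model.Params.Threads = 8                           # let the LP/MIP algorithms use 8 threads

gen_mw = model.addVar(lb=0, ub=100, name="gen_mw")  # one generator, 0-100 MW
model.setObjective(25 * gen_mw, gp.GRB.MINIMIZE)    # $25/MWh generation cost
model.addConstr(gen_mw >= 50, name="load")          # serve 50 MW of load
model.optimize()
print(gen_mw.X)
```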

We don’t know for sure what the next decade of computer processor improvements will bring.  We can undoubtedly expect some single processor speed improvements, but to keep the 2x trend of Moore’s Law going, it will almost certainly take a major effort on the part of software developers to utilize the new threading paradigm.  The capability of power market models to continue to tackle the most complex optimization problems with reasonable solution times may very well depend on their ability to embrace our new environment of multiprocessor architectures.

Filed under: Computing Speed

European Power Market Reporting Changes

Data Transparency Doesn’t Always Mean Ease of Use

The ENTSO-E Transparency Platform has increased the amount of European power market data publicly available in recent years.  While not completely comprehensive, it does help consolidate a vast amount of information in a single location.  ENTSO-E (European Network of Transmission System Operators for Electricity) was established in 2009 for the purpose of "further liberalising the gas and electricity markets in the EU."  ENTSO-E represents 42 TSOs from 35 countries, including EU countries and non-EU countries like Iceland, Norway and Turkey, among others.

Diverse Levels of Compliance

Unfortunately, the various TSOs show diverse levels of compliance in reporting data completely, or in some cases on a regular basis, as each follows its own schedule and level of detail.  Some appear to report only units with installed capacity above 10 MW, while others also report smaller units.  ENTSO-E provides data at two different levels of detail: by unit and by country.  The by-country values are totals for the entire country for units above 1 MW.  By unit, ENTSO-E only asks that its members report details on units above 100 MW, but the actual minimum size for reported unit detail varies by country, as does the fuel-type detail.  Some countries identify the fuel explicitly, while others simply identify units as thermal, which might be coal, natural gas, fuel oil, or a combination of fuels.  When comparing older data sources to each TSO's publicly released data, a complete and exact unit-by-unit match with the ENTSO-E reported data is nearly impossible.

Reviewing ENTSO-E Data by Country

For example, EPIS recently performed an update to resources in Italy.  While gathering data from ENTSO-E at the country level, we found the following year-over-year comparison provided by ENTSO-E.


Figure 1: ENTSO-E: installed capacity by fuel type, by country Source

Note that the 2014 total of 102,547 MW is only about a five percent variance from the 2015 total of 97,794 MW. But the interesting values in this report are the variances reported in the different fuel categories.  For instance, a number of Production Types are relatively close year-over-year, but notice that the "Other" category in 2014 was ~37k MW, while 2015 was ~14k MW, a 63% decrease for that fuel type.  Another set of values should also jump out at the casual observer: "Fossil Hard coal" increased from 1,360 MW to 6,386 MW.  Was Italy introducing new coal units?  No. The TSO was simply modifying its reported fuel types to be more in line with ENTSO-E reporting policies.

Differences in ENTSO-E Data by Unit

Next we reviewed the ENTSO-E data by unit, which is required for units above 100 MW.


Figure 2: ENTSO-E 2015: installed capacity by fuel type, by unit Source

The most striking item in this analysis is that, even though the data is now at a finer level of detail (i.e., by unit), the "Other" category has grown to ~43k MW, larger than the by-country values of ~14k MW in 2015 and ~37k MW in 2014.

In other words, the by-unit data does not match the reported country-level totals. What is going on here?  Primarily, when we researched further, we found that a large number of units that can rely on multiple fuels are categorized as "Other" in the by-unit report.  When we then condensed the Production Type detail a little further and compared the 2014 and 2015 by-country data to the 2015 by-unit data, we found this:


Figure 3: ENTSO-E: capacity differences reported by country or by unit

After reviewing these summaries, we saw that the renewable fuels are fairly close when comparing by-unit to by-country totals: wind is comparable and GST is also very close, but solar does not compare well since many units are under 1 MW and are not included in the by-unit report.  This comparison also showed that the totals of thermal and "Other" fuels together are fairly similar and make up over 60% of the total installed capacity in each report.
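This kind of cross-check is straightforward to reproduce from downloaded reports. A minimal pandas sketch (the file names, column labels, and grouping below are hypothetical, since the actual ENTSO-E export layout varies by report):

```python
import pandas as pd

# Hypothetical exports from the ENTSO-E Transparency Platform:
# one file of per-unit installed capacity, one of per-country totals.
by_unit = pd.read_csv("italy_capacity_per_unit_2015.csv")
by_country = pd.read_csv("italy_capacity_per_type_2015.csv")

# Condense the detailed production types into broader groups before comparing.
groups = {
    "Fossil Gas": "Thermal", "Fossil Hard coal": "Thermal", "Fossil Oil": "Thermal",
    "Other": "Other", "Hydro Water Reservoir": "Hydro",
    "Hydro Run-of-river and poundage": "Hydro", "Wind Onshore": "Wind", "Solar": "Solar",
}

def totals_by_group(df):
    """Sum installed capacity within each condensed fuel group."""
    return (df.assign(group=df["production_type"].map(groups))
              .groupby("group")["installed_capacity_mw"].sum())

comparison = pd.DataFrame({
    "by_unit_mw": totals_by_group(by_unit),
    "by_country_mw": totals_by_group(by_country),
})
comparison["difference_mw"] = comparison["by_country_mw"] - comparison["by_unit_mw"]
print(comparison)
```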

Moving Forward & Cross-checking

So where do we go from here in making sense of reporting variability?  ENTSO-E is currently compiling data submitted by TERNA, the TSO in Italy, and we took a look at what data is available in that report.


Figure 4: TERNA 2015: installed capacity by fuel Source

Two things to note here are that the TERNA resource database only reports units of 100 MW and larger, and it uses an even smaller set of Production Type groups.  Again, the total capacity reported by unit is very different, at ~73k MW versus the ~93k MW from the previous report, but this is explainable because renewable sources generally have smaller installed capacities and are therefore not included in this report.  Of note, no solar is reported here, only two wind units totaling 243 MW are included, and the reported hydro is approximately 60% of the total MW reported to ENTSO-E.  However, the thermoelectric total matches fairly well with the ENTSO-E data at ~60k MW.

So, what have we seen in reviewing these three sets of data from these two sources?  ENTSO-E and TERNA have come a long way in providing transparency with their data, but as the details here show, there is still a long way to go before the data can be easily adopted without a lot of scrubbing.

Filed under: European Power Market, Power Market Insights