Data: Timing is Everything

Staying ahead of the curve by staying on top of industry data

Keeping data current and applicable to your modeling needs is no simple task. It is a familiar refrain in the power industry that as soon as you finish one data update, another is already needed. Much of this stems from today's markets being far more transparent than in years past, with more data available than ever before. Deregulation has played a large role in this transformation; its push for open markets and transparent pricing brought with it a slew of new market products.

When deregulation began decades ago, what are now fundamental market drivers (e.g., sub-hourly markets, capacity auctions, demand response, energy efficiency) were unheard of. The rise of available market data can be attributed partly to deregulation and partly to the evolution of technology and the markets themselves. Couple this with increases in computing speed, advances in server technology, and society's current "instant gratification" attitude, and you have an industry that demands the right data right now. The growth of available data inputs has created a need for checks and balances and for transparency into the underlying fundamentals. There are many more moving parts in today's power industry, which brings us to where we are today: professionals with an enormous amount of data to keep up with and incorporate into simulation models.

In an effort to help integrate posted data in a timely fashion, EPIS has summarized some of the major release dates for data across the U.S. that, when considered as a whole, can help your annual planning. The data releases below are grouped by subject type and then color coded by region. Depending on your modeling needs (large region, day-ahead, capacity expansion, nodal, etc.), you will care about different data releases. However, making sure the data is available when you need it is a significant part of the process that applies across all modeling endeavors.

 


Figure 1: Some of the key market data releases and the time frame they are typically available

An Excel version of this information is also available for download from our website. Filtering by region gives a clearer picture of data availability and lets you form regional timelines for your own updates based on the available data.

In today’s transparent power markets, staying current can be a difficult task. Knowing when the data is available is an important first step to planning your update schedules in order to most effectively forecast power markets.

Filed under: Data Management

EPIS Releases Mexico Database for Use with AURORAxmp

Database will provide power market simulation, forecasting and analysis for Mexico and borders

Salt Lake City, Utah – October 26, 2016

http://www.globenewswire.com/news-release/2016/10/26/883166/0/en/EPIS-Releases-Mexico-Database-for-Use-with-AURORAxmp.html

EPIS, the market leader in power market simulation, forecasting and analysis, has released the Mexico Wholesale Market (Mercado Eléctrico Mayorista – MEM) database.  The database will be offered as an upgrade or add-in to its industry-leading AURORAxmp software.

Users of the AURORAxmp software, which is known for delivering unparalleled forecasting and analytical productivity, ease of use and support, will now have access to high-quality MEM data pulled from trusted sources. The AURORAxmp MEM database will be regularly updated to reflect the most recent PRODESEN assumptions from SENER and other key sources, including CENACE data and analyst experience with CFE and other IPPs in Mexico.

“Recent and ongoing energy market reforms in Mexico, coupled with growth expectations, are creating significant investment opportunities in electric power generation and transmission infrastructure. The most recent PRODESEN (2016-2030) report estimates approximately $90B (USD) in generation investment opportunities and $25B (USD) in transmission and distribution investment opportunities,” said Ben Thompson, CEO of EPIS. “Our MEM database allows users of AURORAxmp to forecast and do market simulations, taking into account this important market.”

It is critical that data sources represent the current state of the National Electricity System and its expected evolution over the next 15 to 20 years. These sources need to be updated regularly, scrubbed to fill gaps and reflect operational realities, and tested and calibrated in models so they are trustworthy and commercially reliable. The MEM database offers this needed level of quality.

The AURORAxmp MEM database is formatted, tested, and immediately ready to use for high-quality valuations, market analysis (including energy and capacity), as well as congestion and risk analysis of Mexican power markets. It offers cross-border analysis with boundary zones, including Belize, Guatemala, ERCOT (TX), WECC (AZ) and WECC (CAISO).

The AURORAxmp MEM Database includes primary Mexican power grids, including:

  • Sistema Interconectado Nacional (SIN)
  • Baja California (BCA)
  • Baja California Sur (BCS)

The systems are fully represented by 53 zones that align with PRODESEN and include “proxies” for transmission with boundary zones like Belize, Guatemala, ERCOT (TX), WECC (AZ) and WECC (CAISO).

Our product contains the best available data, refined to represent the current system's operational realities and market conditions, including:

  • Gas constraints
  • Hydro conditions
  • Policy initiatives, including clean energy goals
  • Well-documented sources

Highlights include:

  • Generation: Approximately 800 operational generators, with another 150 in advanced development (construction or LT auction winners), including supporting hourly wind and solar profiles for each zone
  • Fuel prices: Mexico natural gas hubs and Mexico diesel prices (driven to an extent by U.S. imports); Houston Ship Channel, Henry Hub, South Texas, Waha and SoCal Border natural gas; and distillate/residual fuel oil (FO2/FO6), coal and diesel from the U.S. EIA, adjusted for Mexican transport costs
  • Transmission: inter-zonal transfer limits (links) and underlying physical lines, with resistance values, from which loss assumptions can be derived

As with any AURORAxmp database, users can expect the highest level of software integration, model control and easy data exchange. Users can easily import and overlay their own assumptions and other data sources for more powerful, customized insights.

About EPIS

EPIS, LLC (www.epis.com) is the developer of AURORAxmp, the leading-edge software for forecasting wholesale power market prices. The company also provides ready-to-use data for North America and Europe, and unrivaled customer support to its growing body of customers worldwide. A variety of organizations, including utilities (large and small), independent power producers (IPPs), developers, traders, energy consultants, regulatory agencies and universities, use AURORAxmp to model power system dispatch and the formation of both nodal and zonal wholesale power prices, and to perform a wide range of associated analytics over the short and long term. AURORAxmp is a comprehensive solution to power market modeling needs. Offices are located in Salt Lake City, UT, Tigard, OR and Sandpoint, ID.

Filed under: Data Management, Mexico Power Market

EIA Eases Data Accessibility for Power Modelers

The U.S. Energy Information Administration (EIA) has long been a key source for electrical market data. In the past, much of the EIA’s data have been useful for long-term planning, but have suffered from long lag times and cumbersome manual downloads. Some data have not been published until months or even years after the time period they describe. For example, a generator which began operating in May of 2012 might not have appeared in the EIA’s primary resource list (the EIA-860) until October or November of 2013. Historically, these issues have limited the usefulness of EIA data for many modeling purposes.

However, over the last 2 years, the EIA has made several improvements to the management and delivery of their datasets which some longtime modelers may not be aware of. These enhancements include the EIA-860M, the new Excel Add-in, and the U.S. Electric System Operating Data application. Together, these enhancements greatly expand the list of tasks for which EIA data may be useful.

Form 860M

The EIA-860 is a comprehensive list of grid-connected generators in the U.S. with capacity greater than 1 MW. No data set is perfect, but the EIA-860 has characteristics which are attractive to anyone concerned with data quality: the data are collected directly from plant owners, who are legally required to respond; they are expressed in consistent terms nationwide; and they are vetted by EIA staff prior to release. While thorough and generally accurate, this process is slow and has only been conducted once each year, leading to lag times of 10-22 months.

In July of 2015, the EIA quietly started publishing data from a new monthly survey, the EIA-860M. This survey is sent to plant owners who, in the most recent EIA-860, reported capacity coming online or retiring in the near future. The EIA-860M keeps track of these expected changes and gives plant owners a chance to update the EIA on their progress mid-year. Much of this information has previously been available through the Electric Power Monthly reports, but the EIA-860M combines these data with similar information from the full EIA-860 to create a comprehensive list of active generators. Here are a few things to keep in mind when working with the EIA-860M:

  • It includes a smaller set of unit characteristics than the full EIA-860
  • It has a lag of 2-3 months, so responses for May are posted in late July
  • Like the EIA-860, the Retired list for the EIA-860M is not comprehensive. Only entities with operating plants are required to file with the EIA. So, if a company shuts down its last plant, it no longer responds to the EIA-860 or EIA-860M surveys, and its retired plants will not show up in the Retired list
  • Unlike the EIA-860, the EIA-860M is not vetted prior to release. In order to maintain a timely publishing schedule, the EIA-860M is posted “as-is” and is subject to update without notification

Despite these limitations, the EIA-860M is a relatively thorough and current census of existing and planned generating capacity in the US. It is a welcome addition to the EIA’s current offerings.
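As a sketch of how the two surveys fit together, the overlay logic can be expressed in a few lines of Python. The unit IDs, fields and records below are hypothetical; the real survey files are Excel workbooks with far more columns.

```python
# Sketch of overlaying monthly EIA-860M responses on the annual EIA-860 census.
# Unit IDs, fields and records are hypothetical.

annual_860 = {   # generator ID -> attributes from the last full EIA-860
    "1001_GEN1": {"capacity_mw": 500.0, "status": "OP"},
    "1002_GEN1": {"capacity_mw": 120.0, "status": "OP"},
}
monthly_860m = { # mid-year changes reported on the EIA-860M
    "1002_GEN1": {"capacity_mw": 120.0, "status": "RE"},  # retired mid-year
    "1003_GEN1": {"capacity_mw": 75.0, "status": "OP"},   # newly online
}

def current_fleet(annual, monthly):
    """Monthly responses supersede annual ones; keep only operating units."""
    merged = {**annual, **monthly}
    return {uid: rec for uid, rec in merged.items() if rec["status"] == "OP"}

fleet = current_fleet(annual_860, monthly_860m)
print(sorted(fleet))  # ['1001_GEN1', '1003_GEN1']
```

The same pattern scales to the full files: the annual census provides the baseline, and the monthly survey supplies the corrections.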

Electric System Operating Data

The EIA has taken their first step into the world of intra-day reporting with the new U.S. Electric System Operating Data viewer. While the tool is still in Open Beta, and comes with a fair number of known issues, it promises to be an excellent source for very near-term information about the bulk electrical grid of the U.S.


Figure 1: EIA Operating Data – Status Map

Since July of 2015, the EIA has been collecting hourly data from all 66 Balancing Authorities operating in the U.S., including:

  • Day-ahead demand forecasts
  • Actual demand
  • Net generation
  • Interchange with surrounding Balancing Authorities

When everything is working smoothly, the EIA posts these data with a lag of only 80 minutes! These same data are available for download in table form and include API codes for pulling them directly into an Excel workbook using the add-in described below. The EIA also includes a series of pre-made charts and reports on daily supply-demand balance, discrepancies between forecast and actual demand, and much more.
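Those API codes can be used outside of Excel, too. The sketch below builds a request URL for the EIA's open-data API; the series ID shown (hourly NYISO demand) is one example, and you would substitute your own API key.

```python
# Build a request URL for the EIA open-data API. The series ID below (hourly
# NYISO demand) is an example; substitute your own api_key for a real request.
from urllib.parse import urlencode, urlparse, parse_qs

def eia_series_url(series_id, api_key):
    return "https://api.eia.gov/series/?" + urlencode(
        {"api_key": api_key, "series_id": series_id})

url = eia_series_url("EBA.NYIS-ALL.D.H", "YOUR_API_KEY")
# A real request would then be: json.load(urllib.request.urlopen(url))
params = parse_qs(urlparse(url).query)
print(params["series_id"])  # ['EBA.NYIS-ALL.D.H']
```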

Even for long-term planners, the new datasets collected by the EIA will likely be useful. Never before has the EIA published such granular demand and interchange data. The interchange data in particular has historically been very difficult to find from a publicly available source. Also, Balancing Authorities are much more useful footprints for modeling purposes than states, which is how the EIA partitions much of their information currently. Although it is still in its infancy, the Electric System Operating Data tool promises to open many avenues of analysis which were previously infeasible.

Excel Add-in

Released in February of 2015, the EIA Excel Add-in is useful for importing frequently updated data series into an existing process. While the EIA Interactive Table Viewer is handy for browsing and pulling individual data series, the data almost always need some sort of manipulation or conversion before being input into production cost models such as AURORAxmp. Whether you are converting between nominal and real dollars, changing units, extrapolating growth rates, or combining EIA data with other sources, a series of computations are usually required between raw data and useful inputs. The new Excel add-in allows a user to construct an Excel workbook with all the necessary conversions which can be updated to the latest EIA data with a single click.
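For instance, a nominal-to-real dollar conversion, one of the computations the add-in makes repeatable, might look like the following. The index values here are illustrative, not actual CPI data.

```python
# Deflate a nominal gas-price series to constant dollars with a price index.
# Index and price values are illustrative, not actual published data.

cpi = {2013: 233.0, 2014: 236.7, 2015: 237.0}       # hypothetical index levels
nominal_gas = {2013: 3.73, 2014: 4.37, 2015: 2.62}  # $/MMBtu, nominal

def to_real(nominal, index, base_year):
    """Restate a nominal series in constant base_year dollars."""
    return {yr: val * index[base_year] / index[yr] for yr, val in nominal.items()}

real_gas = to_real(nominal_gas, cpi, base_year=2015)
print(round(real_gas[2013], 2))  # 3.79: the 2013 price in 2015 dollars
```

With the add-in, the raw nominal series refreshes on demand while a formula like this keeps the real-dollar inputs current.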


Figure 2: EIA Excel Add-in Ribbon

Economic data series from the St. Louis Federal Reserve are also available through the same add-in, allowing the user to pull in indicators such as inflation or exchange rates alongside energy-specific data from the EIA. Not only does this save time, it ensures that the correct data series is queried each time the data are updated.

The EIA has always been a key data source for energy analysts, and they are rapidly evolving to become even better. Staying up to date with their latest offerings can reveal relatively easy solutions for some of the toughest data management and upkeep issues encountered by power system modelers.

Filed under: Data Management

The Algorithms at the Core of Power Market Modeling

In 2007, the U.S. government formed the Advanced Research Projects Agency-Energy (ARPA-E) which encourages research on emerging energy technologies. Last year this agency awarded about 3.1 million dollars to the Pacific Northwest National Laboratory (PNNL) to work on a computational tool called High-Performance Power-Grid Operation (HIPPO) over the next few years. The research team will be led by an applied mathematician at PNNL and be partnered with GE’s Grid Solutions, MISO, and Gurobi Optimization. The group will seek improved ways to solve the unit commitment problem, “one of the most challenging computational problems in the power industry.” The work highlights the general trend over the past twenty years in this and other industries to turn to mathematical optimization for answers to some of the most difficult scheduling and planning problems. What’s astounding is the rate at which commercial mathematical solvers have been able to respond to these needs with enormous leaps in algorithmic efficiency over a relatively short period of time.

At the core of most of the mathematical optimization used in power modeling is linear programming (LP). Linear programs are problems in which some linear function is maximized or minimized given a set of linear constraints. The mathematician George Dantzig invented the simplex algorithm in 1947 in advance of the day when computers could really take advantage of it. For example, in 1953 one implementation of the algorithm on a Card Programmable Calculator (CPC) could solve a certain 26 constraint, 71 variable instance of the classic Stigler Diet Problem in about eight hours. As computer technology advanced, though, the usefulness and power of the simplex algorithm specifically and linear programming in general became apparent. Advances in the algorithm combined with exponential computer speed improvements made linear programming a staple in problem solving by the early 2000s. In fact, algorithmic progress in linear programming (i.e. independent from computer speed improvements) gave a 3300x improvement factor from 1988 to 2004. Coupled with actual computer machine improvements of 1600x in that same time horizon, this produced a 5,280,000x average improvement for solving linear programs!
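To make the idea concrete, here is a toy linear program solved by brute-force vertex enumeration. This is only practical in two dimensions, but it illustrates the geometry the simplex algorithm exploits: an optimal solution always sits at a vertex of the feasible region.

```python
# Toy LP: maximize 3x + 5y subject to
#   x <= 4, 2y <= 12, 3x + 2y <= 18, x >= 0, y >= 0.
# The simplex method walks from vertex to vertex; with two variables we can
# simply enumerate every constraint intersection and keep the feasible ones.
from itertools import combinations

# Each constraint as (a, b, c) meaning a*x + b*y <= c (bounds included).
cons = [(1, 0, 4), (0, 2, 12), (3, 2, 18), (-1, 0, 0), (0, -1, 0)]

def intersect(c1, c2):
    a1, b1, d1 = c1
    a2, b2, d2 = c2
    det = a1 * b2 - a2 * b1
    if det == 0:
        return None                       # parallel constraints
    return ((d1 * b2 - d2 * b1) / det, (a1 * d2 - a2 * d1) / det)

def feasible(p):
    return all(a * p[0] + b * p[1] <= c + 1e-9 for a, b, c in cons)

vertices = [p for c1, c2 in combinations(cons, 2)
            if (p := intersect(c1, c2)) and feasible(p)]
best = max(vertices, key=lambda p: 3 * p[0] + 5 * p[1])
print(best, 3 * best[0] + 5 * best[1])    # (2.0, 6.0) 36.0
```

Real solvers handle millions of variables with far cleverer machinery, but the objective-over-a-polytope structure is the same.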

While progress on linear programs has somewhat plateaued in recent years, improvements in mixed-integer programming (MIP) have continued at impressive rates. In its simplest form, a mixed-integer program is a linear program for which some of the variables are restricted to integer values. This integer-value restriction makes the problem so difficult that it is NP-hard, meaning that finding a guaranteed polynomial time algorithm for all MIPs will most likely never occur. And yet the MIP is at the center of an ever-increasing number of practical problems like the unit commitment problem that the HIPPO tool mentioned above is meant to address, and it is only relatively recently that it really became a practical problem solving tool. According to one expert and active participant in the field, Robert Bixby,

“In 1998 there was a fundamental change in our ability to solve real-world MIPs. With these developments it was possible, arguably for the first time, to use an out-of-the box solver together with default settings to solve a significant fraction of non-trivial, real-world MIP instances.”

He provided this chart showing the improvements in one MIP solver, CPLEX, from 1991 to 2007:

Figure 1. CPLEX version-to-version performance improvements.

This chart shows that over approximately 16 years, the machine-independent speed improvement was roughly 29,000x! The progress on developing fast algorithms to solve (or at least find good solutions to) mixed-integer programs has been simply explosive.

The importance of this development is highlighted by extensive use of MIPs by regional reliability organizations in the United States. An independent review published by the National Academies Press states that:

In the day-ahead time frame, the CAISO, ERCOT, ISO-NE, MISO, PJM, and SPP markets employ a day-ahead reliability unit commitment process… The optimization for the day-ahead market uses a dc power flow and a mixed integer program for optimization.

In other words, the MIP is at the core of day-ahead market modeling for these major reliability organizations. A presentation given a few years back by PJM shows their increasing need to solve very difficult MIPs in a shorter time frame. The presentation highlights the fact that PJM has a “major computational need” for “better, faster MIP algorithms and software.” The short slide deck states three times in different contexts the need in PJM for “even faster dynamic MIP algorithms.” The entity must solve their day-ahead model for the security constrained unit commitment (SCUC) problem in a four-hour window towards the end of each day, and the presentation explains that they “have a hard time solving deterministic SCUC models in the time allotted.” So the need for ever-improving mixed-integer programs in the energy industry doesn’t seem to be going away any time soon. And with the increasing complexity of problems such as renewable integration, sub-hourly modeling, and the handling of stochastics, the push for “better, faster MIP algorithms” will only continue.
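A drastically simplified unit commitment problem shows why the integer decisions make this hard: the on/off choices are discrete, so the feasible set is combinatorial. The three hypothetical units below are invented for illustration; real SCUCs hand thousands of such decisions, plus network constraints, to a MIP solver.

```python
# Toy single-period unit commitment: choose which units to switch on (the
# integer decision) and dispatch them to meet demand at least cost. With
# three hypothetical units we can enumerate all 2^3 commitments directly.
from itertools import product

# (name, no-load/start cost $, marginal cost $/MWh, min MW, max MW)
units = [("coal", 1500, 25, 100, 400),
         ("ccgt",  800, 35,  50, 300),
         ("peaker", 50, 90,   0, 100)]
demand = 450  # MW

best_cost, best_commit = float("inf"), None
for commit in product([0, 1], repeat=len(units)):
    on = [u for u, c in zip(units, commit) if c]
    lo, hi = sum(u[3] for u in on), sum(u[4] for u in on)
    if not (lo <= demand <= hi):
        continue                              # committed units can't meet load
    # Dispatch: run each unit at minimum, then fill remaining load cheapest-first.
    cost = sum(u[1] + u[2] * u[3] for u in on)
    remaining = demand - lo
    for u in sorted(on, key=lambda u: u[2]):  # merit order by marginal cost
        take = min(remaining, u[4] - u[3])
        cost += u[2] * take
        remaining -= take
    if cost < best_cost:
        best_cost, best_commit = cost, commit
print(best_commit, best_cost)  # (1, 1, 0) 14050
```

Enumeration doubles in cost with every additional unit, which is exactly why the branch-and-bound machinery inside modern MIP solvers, and the speedups charted above, matter so much.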

So what does all of this mean for power modelers? Professional solvers’ ability to continue to improve LP/MIP algorithms’ performance will determine whether the most difficult questions can still be addressed and modeled. But, in addition to that, it is crucial that the simulation models that seek to mimic real-world operations with those solvers are able to intelligently implement the fastest possible optimization codes. As EPIS continues to enhance AURORAxmp, we understand that need and spend an enormous amount of time fine-tuning the LP/MIP implementations and seeking new ways to use the solvers to the greatest advantage. Users of AURORAxmp don’t need to understand those implementation details—everything from how to keep the LP constraint matrix numerically stable to how to pick between the interior point and dual simplex LP algorithms—but they can have confidence that we are committed to keeping on pace with the incredible performance improvements of professional solvers. It is in large part due to that commitment that AURORAxmp has also consistently improved its own simulation run time in significant ways in all major releases of the past three years. With development currently in the works to cut run times in half of the most difficult DC SCOPF simulations, we are confident that this trend will only continue in the coming years with future releases of AURORAxmp. As was said about the projected future development of mixed-integer programs, the performance improvement “shows no signs of stopping.”

 

Filed under: Data Management, Power Market Insights

Living in the Past?

Living in the past is not healthy. Is your database up-to-date? EPIS just launched the latest update to the North American Database, version 2016_v4, marking the fourth North American data update this year! Recent changes in the power industry present challenges to database management which will be discussed in this post.

In general, the transformation in power generation sources in the U.S. coupled with evolving electricity demand and grid management represents a paradigm shift in the power sector. In order to accurately model power prices in the midst of such change, one must have a model built on fundamentals and a database that is up-to-date, has reasonable assumptions, is transparent and is flexible. A recent post described the technical side of working with databases in power modeling. This entry outlines important changes in the East Interconnect, the impacts those changes have on data assumptions and configuration and the steps we are taking to provide excellent databases to our clients.

Recent shifts in power generation sources challenge database assumptions and management. New plant construction and generation in the U.S. are heavily weighted toward renewables, mostly wind and solar, and as a result, record generation from renewables has been reported across the East Interconnect. Specifically, on April 6, 2016, the Southwest Power Pool (SPP) set the record for wind penetration:


Figure 1. Record wind penetration levels in Eastern ISOs compared with average penetration in 2015. SPP holds the record which was reported on April 6, 2016. Record sources: NYISO, SPP, MISO, ISO-NE, PJM. 2015 Averages compiled from ISO reports, for example: NYISO, SPP, MISO, ISO-NE, PJM. *Average 2015 generation used to calculate penetration.

Similarly, the New York City area reached a milestone of over 100 MW of installed distributed solar resources. Accompanying the increase in renewables are increases in natural gas generation and reductions in coal generation. In ISO-NE, natural gas generation has increased 34 percentage points and coal has decreased 14 percentage points since 2000, as highlighted in their 2016 Regional Electricity Outlook. These rapid changes in power generation sources require frequent and rigorous database updates.

Continued changes in electric grid management in the East Interconnect also require flexibility in databases. One recent change was the Integrated System joining the Southwest Power Pool, with Western Area Power Administration's Heartland Consumers Power District, Basin Electric Power Cooperative and Upper Great Plains Region joining the RTO. Full operational control changed on October 1, 2015, expanding SPP's footprint to 14 states, increasing load by approximately 10 percent and tripling hydro capacity. Grid management change is not new; the integration of MISO South in 2013 is one example. Changes such as these require flexibility in data configuration that allows for easy restructuring of areas, systems and transmission connections.

Variability in parameters such as fuel prices and demand introduces further difficulty in modeling power markets. The so-called "Polar Vortex" weather phenomenon shocked Northeastern power markets in the winter of 2013/2014, when cold temperatures and high natural gas prices drove average January 2014 energy prices above $180/MWh in ISO-NE. The polar opposite occurred this last winter: December 2015 was the mildest since 1960, and together with low natural gas prices, the average wholesale power price hit a 13-year low at $21/MWh. The trend continued into Q1 of 2016:


Figure 2. Monthly average power price in ISO-NE in Q1 2014 and 2016. Variability between years is a result of high natural gas prices and cold weather in 2014 versus low natural gas prices and mild weather in 2016.

Whether extreme events, evolving demand or volatile markets, capturing uncertainty in power modeling databases is challenging. In AURORAxmp, users can go one step further by performing risk simulations; specifying parameters such as fuel prices and demand to vary across a range of simulations. This is a very powerful approach to understanding the implications of uncertainty within the input data.
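Conceptually, a risk simulation amounts to sampling the uncertain inputs and re-running the model for each draw. The stylized sketch below stands in for a real dispatch model; the distributions and the heat-rate logic are illustrative assumptions, not AURORAxmp's actual mechanics.

```python
# Sketch of a risk simulation: sample uncertain inputs (gas price, demand) and
# push each draw through a stylized price model. Distributions and heat-rate
# logic are illustrative assumptions only.
import random
import statistics

random.seed(42)  # reproducible draws

def simulate_prices(n=1000):
    prices = []
    for _ in range(n):
        gas = max(1.0, random.gauss(3.0, 0.8))        # $/MMBtu, floored at $1
        demand = random.gauss(15000, 1500)            # MW
        heat_rate = 7.0 if demand < 16000 else 10.0   # marginal unit shifts at high load
        prices.append(gas * heat_rate)                # implied marginal price, $/MWh
    return prices

draws = simulate_prices()
print(round(statistics.mean(draws), 1), round(statistics.stdev(draws), 1))
```

The distribution of outcomes, not just the mean, is the point: a winter like 2013/2014 lives in the upper tail of exactly this kind of joint fuel-and-demand draw.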

The aforementioned changes in generation, grid management and demand, offer exciting new challenges to test power market models and data assumptions. To test our platform, EPIS performs a historical analysis as a part of each database release. Inputs of historical demand and fuel prices are used to ensure basic drivers are captured and model output is evaluated not only in terms of capacity, but monthly generation, fuel usage and power prices. The result of this process is a default database that is accurate, current, contains reasonable assumptions, is transparent and is flexible to ensure you have the proper starting point for analysis and a springboard for success.

With the release of North_American_DB_2016_v4, EPIS continues to provide clients with superb data for rigorous power modeling. The 2016_v4 update focuses on the East Interconnect and includes updates to demand, fuels, resources, DSM and other miscellaneous items. Clients can log in to our support site now to download the database and full release notes. Other interested parties can contact us for more information.

Filed under: Data Management, Power Market Insights

Working With Data in Power Modeling

How Much Data Are We Talking About?

When planning the deployment of a power modeling and forecasting tool in a corporate environment, one of the most important considerations prior to implementation is the size of the data that will be used. IT personnel want to know how much data they are going to be storing, maintaining, backing up, and archiving so they can plan for the hardware and software resources to handle it. The answer varies widely depending on the types of analysis to be performed. Input databases may be relatively small (e.g. 100 megabytes), or they can be several gigabytes if many assumptions require information to be defined on the hourly or even sub-hourly level. Output databases can be anywhere from a few megabytes to several hundred gigabytes or even terabytes depending on what information needs to be reported and the required granularity of the reports. The data managed and stored by the IT department can quickly add up and become a challenge to maintain.

Here are a couple example scenarios:

A single planning analyst does a one-year hourly run (8,760 hours) with modest reporting, which produces an output database of 40 MB. At about six studies per day over 50 weeks, the total space generated by this analyst is a modest 60 GB per year. This is totally manageable for an IT department using inexpensive disk space.

Now, let’s say there are five analysts, they need more detailed reporting, they are looking at multiple years, and a regulatory agency states that they have to retain all of their data for 10 years. In this scenario, the total data size jumps to 500 MB for a single study. Given the same six studies per day those analysts would accumulate 3.75 TB of output data in a year, all needing to be backed up and archived for the auditors, which will take a considerable amount of hardware and IT resources.
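The arithmetic behind these scenarios is simple enough to capture in a small helper. The 250 run-days per year and decimal units (1 GB = 1000 MB) are our assumptions for the calculation.

```python
# Back-of-the-envelope output sizing. The 250 run-days per year and decimal
# units (1 GB = 1000 MB) are assumptions, not measured figures.

def annual_output_gb(mb_per_study, studies_per_day, analysts=1, workdays=250):
    """Total output volume per year in decimal gigabytes."""
    return mb_per_study * studies_per_day * analysts * workdays / 1000

print(annual_output_gb(40, 6))                       # 60.0 GB: one analyst, modest runs
print(annual_output_gb(500, 6, analysts=5) / 1000)   # 3.75 TB: five analysts, big studies
```

Multiplying through like this, before a deployment rather than after, is what lets IT plan storage, backup and archiving realistically.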

What Are My Database Options?

There are dozens of database management systems available. Many power modeling tools support just one database system natively, so it’s important to know the data limitations of the different modeling tools when selecting one.

Some database systems are file-based. For example, one popular file-based database system is called SQLite. SQLite is fast, free, and flexible. This file-based database system is very efficient and is fairly easy to work with, but is best suited for individual users, as are many other file-based systems. These systems are great options for a single analyst working on a single machine.

As mentioned earlier, groups of analysts might decide to all share a common input database and write simultaneously to many output databases. Typically, this requires a dedicated server to handle all of the interaction between the forecasting systems and the source or destination databases. Microsoft SQL Server is one of the most popular database systems available in corporate environments, and the technical resources for it are usually available in most companies. Once you have your modeling database saved in SQL Server, assuming your modeling tool supports it, you can read from input databases and write to databases simultaneously and share the data with other departments with tools that they are already familiar with.

Here is a quick comparison of some of the more popular database systems used in power modeling:

| Database System | DB Size Limit (GB) | Supported Hardware | Client/Server | Cost |
| --- | --- | --- | --- | --- |
| MySQL | Unlimited | 64-bit or 32-bit | Yes | Free |
| Oracle | Unlimited | 64-bit or 32-bit | Yes | High |
| MS SQL Server | 536,854,528 | 64-bit only (as of 2016) | Yes | High |
| SQLite | 131,072 | 64-bit or 32-bit | No | Free |
| XML / Text File | OS file size limit | 64-bit or 32-bit | No | Free |
| MS SQL Server Express | 10 | 64-bit or 32-bit | Yes | Free |
| MS Access (JET)* | 2 | 32-bit only | No | Low |

A Word About MS Access (JET)*

In the past, many Windows desktop applications requiring an inexpensive desktop database system used the MS Access database format (more formally, the Microsoft JET Database Engine). As hardware and operating systems have transitioned to 64-bit architectures, MS Access has become less popular due to its limitations (2 GB maximum database size, 32,768 objects, etc.) and to the growing number of alternatives. Microsoft has not produced a 64-bit version of JET and has no plans to do so. Several free desktop database engines serve the same needs as JET but run natively on 64-bit systems, including Microsoft SQL Server Express, SQLite and MySQL, which offer many more features.

Which Databases Does AURORAxmp Support?

There are several input and output database options when using AURORAxmp for power modeling. Those options, coupled with some department workflow policies, will go a long way in making sure your data is manageable and organized.

EPIS delivers its native AURORAxmp databases in a SQLite format which we call xmpSQL. No external management tools are required to work with these database files – everything you need is built into AURORAxmp. You can read, write, view, change, query, etc., all within the application. Other users with AURORAxmp can also utilize these database files, but xmpSQL doesn’t really lend itself to a team of users all writing to it at the same time. Additionally, some of our customers have connected departments that would like to use the forecast data outside of the model, and that usually leads them to Microsoft SQL Server.
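Because xmpSQL is SQLite-based, the same kind of file can also be opened with any generic SQLite client, for example Python's built-in sqlite3 module. The table and column names below are illustrative, not AURORAxmp's actual schema.

```python
# Read and aggregate a SQLite database with Python's stdlib. An in-memory
# database is used here; pass a file path to open a real .db file. Table and
# column names are illustrative, not AURORAxmp's actual schema.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE zone_price (zone TEXT, hour INTEGER, price REAL)")
con.executemany("INSERT INTO zone_price VALUES (?, ?, ?)",
                [("SIN", 1, 42.5), ("SIN", 2, 39.5), ("BCA", 1, 51.0)])
avg = con.execute(
    "SELECT zone, AVG(price) FROM zone_price GROUP BY zone ORDER BY zone"
).fetchall()
print(avg)  # [('BCA', 51.0), ('SIN', 41.0)]
```

This kind of scripted access is handy for one-off checks, though for team workflows the SQL Server route described below is the better fit.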

For groups of analysts collaborating on larger studies, AURORAxmp supports SQL Server database, although its use isn’t required. Rather than use SQL Server as the database standard for AURORAxmp (which might be expensive for some customers), the input databases are delivered in a low cost format (xmpSQL), but AURORAxmp offers the tools to easily change the format. Once the database is saved in SQL Server, you are using one of the most powerful, scalable, accessible database formats on the planet with AURORAxmp. Some of our customers also use the free version of SQL Server – called SQL Server Express Edition – which works the same way as the full version, but has a database size limit of 10GB.

Some additional options for output databases within AURORAxmp are:

  • MySQL: an open-source, free, server-based platform that supports simultaneous connections and is only slightly less popular than SQL Server.
  • XML/Zipped XML: a simple file-based system that makes it easy to import and export data. Many customers like this database type because the data are easily accessed and human readable without additional expensive software.
  • MS Access (JET): the 32-bit version of AURORAxmp will read from and write to MS Access databases. EPIS, however, does not recommend it, given the other database options available and its 2 GB size limit. MS Access was largely designed as an inexpensive desktop database system; given its limitations, we recommend another option such as xmpSQL, SQL Server Express or MySQL, which offer far more features.

Where Do We Go From Here?

AURORAxmp is a fantastic tool for power system modeling and forecasting wholesale power market prices. It has been in the marketplace for over twenty years, and is relied upon by many customers to provide accurate and timely information about the markets they model. However, it really can’t do anything without an input database.

EPIS has a team of market analysts that are dedicated to researching, building, testing, and delivering databases for many national and international power markets. We provide these databases as part of the license for AURORAxmp. We have many customers that use our delivered databases and others who choose to model their own data. Either way, AURORAxmp has the power and the flexibility to utilize input data from many different database types.

If you are just finding AURORAxmp and want to see how all of this works, we have a team here that would love to show you the interface, speed and flexibility of our product. If you are already using our model but would like guidance on which database system is best for your situation, contact our EPIS Support Team and we’ll be glad to discuss it with you.

Filed under: Data Management, Power Market Insights