Has the one-day-in-10-years criterion outlived its usefulness?
James F. Wilson is an economist and principal of Wilson Energy Economics and an affiliate of LECG, LLC. Email him at jwilson@wilsonenec.com. This article expresses the author’s views and not necessarily those of any client.
Electric utilities and regional transmission organizations (RTOs) in the United States aim to have enough electric generating capacity to meet anticipated peak loads, plus a reserve margin for reliability. Reserve margins are usually set to satisfy the widely accepted “one day in 10 years” (1-in-10) resource adequacy criterion, under which the expected frequency of curtailing firm load due to inadequate capacity should be no greater than once every 10 years.
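In planning studies, the 1-in-10 criterion is typically operationalized as a cap on loss-of-load expectation (LOLE). A minimal sketch of that formulation follows; the notation is illustrative rather than drawn from this article:

\[
\mathrm{LOLE} \;=\; \sum_{d=1}^{365} \Pr\left(L_d > C_d\right) \;\le\; 0.1 \ \text{days per year},
\]

where \(L_d\) is the peak load and \(C_d\) the available capacity on day \(d\), both uncertain at the time reserve margins are set. An expected 0.1 shortfall days per year is the conventional reading of “one day in 10 years.”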
The 1-in-10 criterion has always been highly conservative, perhaps an order of magnitude more stringent than the marginal benefits of incremental capacity can justify, and capacity planning has been even more conservative in practice. Indeed, economists have questioned the 1-in-10 criterion for many decades.1
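One way to see the order-of-magnitude point is a back-of-envelope optimality check; the cost figures below are illustrative assumptions, not values from this article. At an economic optimum, the annual cost of a marginal megawatt of capacity (roughly the net cost of new entry, CONE) would equal its expected reliability benefit, approximately the value of lost load (VOLL) times the expected loss-of-load hours (LOLH) during which that megawatt would serve load:

\[
\mathrm{CONE} \;\approx\; \mathrm{VOLL} \times \mathrm{LOLH}^{*}
\quad\Longrightarrow\quad
\mathrm{LOLH}^{*} \;\approx\; \frac{\mathrm{CONE}}{\mathrm{VOLL}}.
\]

Assuming, for illustration, a CONE of $80,000 per MW-year and a VOLL of $10,000 per MWh gives \(\mathrm{LOLH}^{*} \approx 8\) hours per year, whereas an LOLE of 0.1 days per year implies at most about 2.4 loss-of-load hours per year, and far fewer if shortfall events last only a few hours, consistent with the order-of-magnitude gap.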
Resource adequacy practices based on the 1-in-10 criterion perhaps make more sense for utility planners and regulatory authorities, who would have to answer for any curtailments, than for consumers, who are directly affected if reliability isn’t maintained but who also bear the cost of the additional capacity.