Nuclear energy has its origins in the scientific discoveries of the early and mid-20th century. By the end of World War II in 1945, the development of the first nuclear weapons by the United States had laid the foundation for nuclear power.
Nuclear energy currently supplies about 19 percent of U.S. electricity generation. By the late 1970s, however, orders for new nuclear power plants had ceased because of skyrocketing costs, accident concerns, and unresolved nuclear waste disposal problems.
In the past several years, there has been renewed interest in nuclear energy as a means of mitigating global warming caused by carbon emissions. Expanding nuclear power enough to meaningfully curb greenhouse gas emissions, however, would be prohibitively expensive and risky, requiring at least 1,000 new reactors over the next 45 years. It would also be an extremely slow process, taking decades to achieve any reduction in world CO2 emissions, if indeed it ever does. This time frame is far longer than that of energy efficiency measures, distributed generation, or renewable alternatives such as wind. Such a massive expansion of nuclear power would also divert capital from faster and more easily deployed alternatives for reducing world CO2 emissions.