Jason Taken
Associate Editor
Loyola University Chicago School of Law, J.D. 2019
I authored a post last year regarding the nuclear energy industry's initiative to reduce operational costs and compete with the ever-dropping cost of energy production. Coined "Delivering the Nuclear Promise," the initiative pursues cost-cutting measures such as reducing staffing and removing superfluous requirements that maintain large margins to regulatory thresholds. Companies have set hefty goals to bring the cost of nuclear energy production down to levels that would make nuclear energy competitive with less expensive, heavily subsidized, and not-as-clean forms of energy. All of this must be done without sacrificing safety. To achieve these drastic measures, I set forth the case for on-the-rise technologies that, while the nuclear energy industry does not currently have the infrastructure to support them, will aid in this transition and, as I argue, ultimately be required to sustain this clean and necessary form of energy.
The direction nuclear needs to go
"Big data" is driving the way we, as consumers, behave. Given that influence, companies such as retailers have taken advantage of that data to grow their businesses at exponential rates, save costs on advertising, and wipe competitors who aren't using data off the map. The nuclear energy industry is poised to be next in line. It needs to adopt this way of thinking if it wants to stand a chance in an ever-changing economy where cost-cutting isn't just a reaction to a disruption of the economic equilibrium, but a standard way of doing business.
The nuclear industry could capitalize on big data in how it monitors plant equipment and conducts maintenance. Currently, nuclear power plants are riddled with preventative maintenance schedules that don't take the status of the equipment into account, but rather mandate the replacement of parts and the performance of maintenance at some routine frequency. This methodology relies on schedules made in advance, largely ignores the condition and past performance of the equipment, and requires many approvals to change. Even if the equipment is in pristine condition, the maintenance must still be performed.
Most, if not all, nuclear power plants, though, hold decades of data informing operators under what parameters, and over what time frames, equipment previously required repair or replacement. All plants are required to keep pertinent data, in most cases, for the life of the plant plus 10 years. If anything, nuclear power plants are ripe for disruption in the way they use data. Using this data would let the plant operator know when maintenance was actually needed, driving down costs by performing only the maintenance the equipment requires. Doing extra maintenance for the sake of doing the work must be a practice we leave behind. The shift toward data-driven maintenance programs (much like a check-engine light on a car) would drastically reduce recurring operations and maintenance costs with little impact on plant operation. Strategic cost savings are only the beginning; as data begins to drive every other aspect of business, adopting it is the only way an industry survives.
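To make the contrast concrete, here is a minimal sketch, in Python, of what condition-based maintenance logic could look like. The equipment, run-hour figures, vibration limit, and the maintenance_due function are all hypothetical illustrations, not drawn from any actual plant program; the point is simply that the decision keys off historical data and a measured condition rather than a fixed calendar.

```python
# A minimal sketch of condition-based maintenance logic, assuming
# hypothetical historical records of run-hours at failure for one
# class of equipment (all figures are illustrative only).
from statistics import mean, stdev

# Hypothetical: run-hours at which similar pumps previously failed.
historical_failure_hours = [41_200, 39_800, 43_500, 40_900, 42_300]

def maintenance_due(current_run_hours: float, vibration_mils: float,
                    vibration_limit: float = 2.0) -> bool:
    """Flag maintenance when run-hours approach the historical failure
    band or a monitored condition (here, vibration) exceeds its limit."""
    mu = mean(historical_failure_hours)
    sigma = stdev(historical_failure_hours)
    # Conservative margin: act two standard deviations before the
    # average historical failure point.
    threshold = mu - 2 * sigma
    return current_run_hours >= threshold or vibration_mils > vibration_limit

# A pump with 35,000 run-hours and normal vibration needs no action yet,
# rather than being overhauled on a fixed calendar schedule.
print(maintenance_due(35_000, vibration_mils=0.8))  # False
print(maintenance_due(39_000, vibration_mils=0.8))  # True: nearing the band
```

The two-standard-deviation margin here is a stand-in for whatever conservatism a plant's engineering and regulatory review would actually demand; the structure of the decision, not the numbers, is what a data-driven program would change.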
But there’s a problem.
While adopting a big-data mentality would be a much-needed step in the right direction for nuclear economics, the industry faces a few large problems. Under current regulatory mandates, preventative maintenance is a must. Because of the risk involved in keeping a nuclear power plant up and running, taking "chances," or relying on data to warn of imminent equipment failure, is unlikely to be accepted. Trusting past performance, however reliable, is not something plant operators or regulators are likely to embrace.
Further, the use and digitization of data face challenges of their own. Digitizing data at nuclear power plants requires extreme scrutiny to conform to various regulations in Title 10 of the Code of Federal Regulations. With regard to sensitive information about the plant in particular, certain criteria must be satisfied before that data may be kept anywhere other than approved vaults and storage facilities. Of course, a move in the data-driven direction would require the data to be digitized and placed online. As it stands, putting data on the web where anyone could conceivably reach it is not a scheme that would be easily sold to anyone informed on the regulations, and compliance would be unlikely. A security infrastructure would need to be put in place with measures sufficient to keep the data, and ultimately the plant, safe. This is achievable, though, and, as I argue, a necessity.
Being data-driven is the only way forward. As companies shift costs to prediction algorithms and spend less on advertising, knowing what a consumer wants when they want it is analogous to knowing what a piece of equipment needs when it needs it. Past performance indicates future performance to some extent, and big data is making that sentiment truer every day.
For nuclear energy to remain the highly reliable and dependable source of power it is, and to avoid being ousted from the market by the cost of operation, the industry will have to find new ways to operate. By taking on new ways to perform maintenance and relying on its years of data, with the right infrastructure in place, the nuclear industry can surpass other forms of energy in both cost and reliability.