A University of Texas at Arlington (UTA) assistant professor is developing a new algorithm to overcome barriers plaguing power grids in the U.S.
UTA’s Ramtin Madani was recently awarded a $325,000 grant from the National Science Foundation (NSF) to develop massively scalable computational methods for power scheduling.
Grid optimization focuses on efficiency in producing and distributing power, as well as on security and reliability, so that consumers aren’t hit with massive blackouts or cascading failures.
The biggest challenge to optimization is scalability: even in small grids, it is computationally difficult to determine the best way to route power.
Other issues facing the industry include setting the proper on/off status of switches and deciding which generators should operate the following day in the most efficient way possible.
Madani and co-principal investigator Ali Davoudi plan to cast such problems in a common mathematical language, optimization theory, allowing them to treat all of the problems within a single framework.
Once this is done, Madani and Davoudi will develop and introduce algorithmic tools and techniques to address those problems through optimization theory and mathematical programming.
The final approach will be to leverage such tools and techniques for addressing practical, everyday problems in the area of power systems.
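To illustrate the kind of generator-scheduling decision described above, here is a toy sketch of the problem: choose which generators to commit (turn on) so that demand is met at minimum cost. All generator names, costs, and capacities below are invented for illustration, and the brute-force enumeration is only a teaching device; it is not the researchers' method, whose whole point is to scale far beyond what exhaustive search can handle.

```python
from itertools import product

# Hypothetical generator data (fixed commitment cost, per-MW running cost,
# capacity in MW) -- purely illustrative numbers, not from the article.
generators = [
    {"name": "coal",  "fixed": 100.0, "per_mw": 5.0, "cap": 300.0},
    {"name": "gas",   "fixed": 40.0,  "per_mw": 9.0, "cap": 150.0},
    {"name": "hydro", "fixed": 10.0,  "per_mw": 2.0, "cap": 80.0},
]
demand = 260.0  # MW that must be served

def schedule_cost(on_flags):
    """Cost of serving demand with the given on/off pattern, or None if the
    committed units cannot cover demand. Dispatches cheapest units first
    (a simplification of real economic dispatch)."""
    committed = [g for g, on in zip(generators, on_flags) if on]
    if sum(g["cap"] for g in committed) < demand:
        return None
    cost = sum(g["fixed"] for g in committed)
    remaining = demand
    for g in sorted(committed, key=lambda g: g["per_mw"]):
        mw = min(g["cap"], remaining)
        cost += mw * g["per_mw"]
        remaining -= mw
    return cost

# Enumerate all 2^n on/off patterns and keep the cheapest feasible one.
# This explodes exponentially with the number of generators, which is
# exactly the scalability barrier the grant aims to overcome.
best = min(
    ((flags, schedule_cost(flags))
     for flags in product([0, 1], repeat=len(generators))),
    key=lambda fc: fc[1] if fc[1] is not None else float("inf"),
)
print(best)  # cheapest feasible commitment and its cost
```

Running this prints `((1, 0, 1), 1170.0)`: committing the coal and hydro units is cheapest for this toy data. Real unit-commitment problems involve thousands of generators over many time periods, which is why scalable algorithmic methods are needed.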
“Optimization will reduce the cost of producing power. If done correctly, it will save billions of dollars annually, which is why it is so important to find a solution,” Madani said.
Image and content: ABB/Jeremy Agor-University of Texas at Arlington