Texas A&M and Intel Labs have developed a new tool to identify the source of errors caused by software updates.
Software updates are intended to make our applications run faster, but sometimes they inadvertently end up doing just the opposite.
The culprits are bugs that the computer science field calls ‘performance regressions’.
Such regressions are very time-consuming to fix; locating software errors normally requires substantial human intervention.
To overcome this, Texas A&M University and Intel Labs have now developed a completely automated way of identifying the source of errors caused by software updates.
Their algorithm, based on deep learning, is not only turnkey but also quick, finding performance bugs in a matter of a few hours instead of days.
To pinpoint the source of errors within a program, debuggers often check the status of performance counters within the central processing unit.
These counters monitor how the program is being executed on the computer’s hardware and memory.
So, when the software runs, counters keep track of the number of times it accesses certain memory locations, the time it stays there and when it exits, among other things.
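To make the idea concrete, here is a toy illustration in Python. It is not how hardware counters actually work (those are registers inside the CPU, sampled by the operating system); it only mimics, in ordinary code, the kind of bookkeeping the article describes: counting how often each memory location is touched and how long the accesses take. All names here (`monitored_read`, `access_counts`) are invented for the sketch.

```python
import time
from collections import defaultdict

# Toy stand-ins for hardware counters: one tally of accesses per
# "memory location" (here, a list index) and one of time spent reading it.
access_counts = defaultdict(int)
access_time = defaultdict(float)

def monitored_read(data, index):
    """Read data[index] while updating the toy counters."""
    start = time.perf_counter()
    value = data[index]
    access_time[index] += time.perf_counter() - start
    access_counts[index] += 1
    return value

data = [x * x for x in range(8)]
for i in [0, 3, 3, 3, 7]:
    monitored_read(data, i)

print(access_counts[3])  # index 3 was read three times -> 3
```

A real debugger reads analogous tallies straight from the CPU (e.g., via Linux's `perf` tooling) rather than instrumenting each access in code.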
When the software’s behavior goes awry, counters are again used for diagnostics.
“Performance counters give an idea of the execution health of the program,” says assistant professor Abdullah Muzahid.
“So, if some program is not running as it is supposed to, these counters will usually have the telltale sign of anomalous behavior.”
Newer desktops and servers have hundreds of performance counters, making it virtually impossible to keep track of all of their statuses manually and then look for aberrant patterns that are indicative of a performance error.
This is where Muzahid’s machine learning comes in.
By using deep learning, Muzahid’s team was able to monitor data coming from a large number of the counters simultaneously by reducing the size of the data.
The process is similar to compressing a high-resolution image to a fraction of its original size by changing its format.
In the lower dimensional data, the new algorithm could then look for patterns that deviate from normal.
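The team's actual model is a deep learning autoencoder; as a hedged stand-in, the sketch below uses classical PCA for the same two steps the article describes: compress many counter readings into a low-dimensional representation, then flag runs whose readings don't fit the compressed picture of normal behavior. The counter data here is simulated, and the threshold rule is a simplification invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated performance-counter samples: 200 "healthy" runs over 50 counters.
healthy = rng.normal(0.0, 1.0, size=(200, 50))

# Fit a low-dimensional basis (top-5 principal components) on healthy runs,
# akin to compressing a high-resolution image to a fraction of its size.
mean = healthy.mean(axis=0)
_, _, vt = np.linalg.svd(healthy - mean, full_matrices=False)
basis = vt[:5]  # 50 counters -> 5 dimensions

def reconstruction_error(sample):
    """Compress, decompress, and measure what was lost."""
    z = (sample - mean) @ basis.T   # compress
    recon = z @ basis + mean        # decompress
    return float(np.linalg.norm(sample - recon))

# Healthy runs reconstruct well; use their worst error as a crude threshold.
threshold = max(reconstruction_error(s) for s in healthy)

# A regressed run: several counters drift far from their usual pattern,
# so it cannot be reconstructed from the low-dimensional representation.
regressed = rng.normal(0.0, 1.0, size=50)
regressed[:10] += 8.0

print(reconstruction_error(regressed) > threshold)
```

A deep autoencoder plays the same role as the PCA projection here, but can capture nonlinear relationships among the counters, which is what makes monitoring hundreds of them at once tractable.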
Muzahid notes that the deep learning algorithm could be applied in other areas as well, such as developing the technology needed for autonomous driving.
Image and content: Rachel Barton/Texas A&M