Computer scientists tackle performance regressions with new tool

Researchers from Texas A&M University have teamed up with computer scientists from Intel Labs to create a tool that helps identify the source of a class of software bugs known as performance regressions. Software updates are supposed to make applications run faster, the researchers explained, but sometimes they end up doing the opposite. Performance regressions creep in with updates and are time-consuming to fix, mainly because locating them requires so much human involvement.

“Updating software can sometimes turn on you when errors creep in and cause slowdowns. This problem is even more exaggerated for companies that use large-scale software systems that are continuously evolving,” said Abdullah Muzahid, assistant professor in the Department of Computer Science and Engineering at Texas A&M. “We have designed a convenient tool for diagnosing performance regressions that is compatible with a whole range of software and programming languages, expanding its usefulness tremendously.”


The new solution offers an automated way of finding these bugs, the researchers explained. It uses deep learning to monitor the large volumes of data coming in from performance counters — instrumentation that records how a program is being executed. The researchers compare this process to compressing a high-resolution image: the model boils the counter data down to its essential patterns, making deviations from normal behavior stand out.

To test their algorithm, the researchers trained it on an older, glitch-free application to teach it to recognize normal counter data. Then, they ran the algorithm on an updated version of the software that contained a performance regression. The algorithm successfully located and diagnosed the bug in just a few hours. 
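The researchers' actual system uses a deep-learning model, whose details are not given here. As a rough illustration of the underlying idea — learn what "normal" counter data looks like from a known-good build, then score how far a later build deviates — here is a minimal statistical sketch with invented counter values (the counter names and numbers are hypothetical, and a simple z-score stands in for the learned model):

```python
import statistics

def fit_baseline(runs):
    """Learn per-counter mean and stdev from readings of a known-good build.

    Each run is a list of counter values, e.g. [cache_misses, branch_misses, stalls].
    """
    counters = list(zip(*runs))  # group readings by counter
    return [(statistics.mean(c), statistics.pstdev(c)) for c in counters]

def anomaly_score(baseline, reading):
    """Sum of squared z-scores: how far a run's counters sit from 'normal'."""
    return sum(((x - mu) / sd) ** 2
               for (mu, sd), x in zip(baseline, reading) if sd > 0)

# Hypothetical counter readings from several runs of a glitch-free version
good_runs = [
    [100, 12, 3],
    [102, 11, 3],
    [98, 13, 2],
    [101, 12, 3],
]
baseline = fit_baseline(good_runs)

healthy_score = anomaly_score(baseline, [99, 12, 3])     # looks like training data
regressed_score = anomaly_score(baseline, [180, 30, 9])  # updated build misbehaving
assert regressed_score > healthy_score
```

A high score flags the updated build as anomalous; in the real tool, the learned model additionally helps point to which part of the program is responsible.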

The researchers believe their algorithm could have other use cases, such as developing technology for autonomous vehicles. “The basic idea is once again the same, that is being able to detect an anomalous pattern,” Muzahid said. “Self-driving cars must be able to detect whether a car or a human is in front of it and then act accordingly. So, it’s again a form of anomaly detection and the good news is that is what our algorithm is already designed to do.”

This research was initially presented at the Neural Information Processing Systems conference in December. Other researchers involved in creating the tool included Mejbah Alam, Justin Gottschlich, Nesime Tatbul, Javier Turek and Timothy Mattson from Intel Labs. The research was funded in part by a National Science Foundation CAREER grant and by Intel.

The post Computer scientists tackle performance regressions with new tool appeared first on SD Times.