Scientists at MIT have provided the first systematic, quantitative evidence that algorithms are one of the most important sources of improvement in computing.
By showing how quickly algorithms improve across a broad range of examples, the MIT team demonstrates their critical importance to advances in computing.
Algorithms are sort of like a parent to a computer: they tell the computer how to make sense of information so that it can, in turn, do something useful with it.
The more efficient the algorithm, the less work the computer has to do. For all the technological advancements in computing hardware, and the much-debated lifespan of Moore’s Law, computer performance is only one side of the picture.
Behind the scenes, a second trend is happening: algorithms are being improved, so less computing power is needed. While algorithmic efficiency may get less attention, you would definitely notice if a reliable search engine suddenly became one-tenth as fast, or if moving through big datasets felt like wading through sludge.
This led scientists at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) to ask: How quickly do algorithms improve?
Existing data on this question was largely anecdotal, consisting of case studies of particular algorithms that were assumed to be representative of the broader scope. Faced with this lack of evidence, the team set out to crunch data from 57 textbooks and more than 1,110 research papers to trace the history of when algorithms got better. Some of the papers directly reported how good new algorithms were; others had to be reconstructed by the authors using “pseudocode,” shorthand versions of an algorithm that describe its basic details.
In total, the team examined 113 “algorithm families”: sets of algorithms solving the same problem that had been highlighted as most important by computer science textbooks. For each of the 113, the team reconstructed its history, tracking each time a new algorithm was proposed for the problem and making special note of those that were more efficient. Spanning from the 1940s to the present and ranging widely in performance, the data showed an average of eight algorithms per family, of which a couple improved the family’s efficiency. To share this assembled database of knowledge, the team also created Algorithm-Wiki.org.
The scientists charted how quickly these families had improved, focusing on the algorithms’ most-analyzed feature: how fast they could guarantee to solve the problem (in computer science speak, their “worst-case time complexity”). What emerged was enormous variability, but also important insights into how transformative algorithmic improvement has been for computing.
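As a toy illustration of the concept (our example, not from the paper), the snippet below compares the worst-case step counts of two textbook search algorithms; the function names and input sizes are ours.

```python
import math

# "Worst-case time complexity" is the number of steps an algorithm can
# guarantee never to exceed, even on its hardest input. Linear search
# may have to inspect all n items; binary search on sorted data needs
# at most about log2(n) comparisons.

def linear_search_worst_steps(n: int) -> int:
    # Every element might need checking before the target is found.
    return n

def binary_search_worst_steps(n: int) -> int:
    # The search range halves with each comparison.
    return max(1, math.ceil(math.log2(n)))

for n in (1_000, 1_000_000, 1_000_000_000):
    print(f"n={n:>13,}: linear {linear_search_worst_steps(n):>13,} steps, "
          f"binary {binary_search_worst_steps(n):>2} steps")
```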
For large computing problems, 43 percent of algorithm families had year-on-year improvements that were equal to or larger than the much-touted gains from Moore’s Law. In 14 percent of problems, the performance gains from algorithms vastly outpaced those that came from improved hardware. The gains from algorithmic improvement were particularly large for big-data problems, so the importance of those advancements has grown in recent decades.
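For scale, a back-of-the-envelope calculation (ours, not the paper’s): if Moore’s Law is taken as a doubling roughly every two years, the implied year-on-year benchmark those algorithm families met or beat is about 41 percent.

```python
# Moore's Law is commonly stated as a doubling of transistor counts
# roughly every two years; the equivalent compound yearly growth rate
# is 2**(1/2) - 1, or about 41%.
moore_yearly_gain = 2 ** (1 / 2) - 1
print(f"Moore's Law yearly gain: {moore_yearly_gain:.1%}")                # ~41.4%
print(f"Compounded over a decade: {(1 + moore_yearly_gain) ** 10:.0f}x")  # ~32x
```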
The single biggest change the authors observed came when an algorithm family transitioned from exponential to polynomial complexity. The amount of effort it takes to solve an exponential problem is like a person trying to guess a combination on a lock. If you have only a single 10-digit dial, the task is easy. With four dials, like on a bicycle lock, it’s hard enough that no one steals your bike, but still conceivable that you could try every combination. With 50 dials, it’s almost impossible; it would take too many steps. Problems with exponential complexity are like that for computers: as they grow, they quickly outpace the computer’s ability to handle them. Finding a polynomial algorithm often solves that, making it possible to tackle problems in a way that no amount of hardware improvement can.
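A minimal Python sketch of the lock analogy (our illustration; the cubic “polynomial” cost is a hypothetical stand-in for comparison, not any specific algorithm):

```python
# Brute-forcing a combination lock with d dials of 10 digits each takes
# 10**d tries in the worst case (exponential in d), while a hypothetical
# polynomial-time approach might take on the order of d**3 steps.
# Exponential growth quickly dwarfs any hardware speedup.

def exponential_tries(dials: int) -> int:
    # Worst case: try every combination of `dials` 10-digit dials.
    return 10 ** dials

def polynomial_tries(dials: int) -> int:
    # Hypothetical cubic-cost alternative, for comparison only.
    return dials ** 3

for d in (1, 4, 50):
    print(f"{d:>2} dials: exponential {exponential_tries(d):.3e} tries, "
          f"polynomial {polynomial_tries(d)} steps")
```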
As the rumblings of Moore’s Law coming to an end rapidly permeate global conversations, the researchers say that computing users will increasingly need to turn to areas like algorithms for performance improvements. According to the team, the findings confirm that, historically, the gains from algorithms have been enormous, so the potential is there. But gains that come from algorithms rather than hardware look different: hardware improvement under Moore’s Law happens smoothly over time, while gains from algorithms come in steps that are usually large but infrequent.
“This is the first paper to show how fast algorithms are improving across a broad range of examples,” says Neil Thompson, an MIT research scientist at CSAIL and the Sloan School of Management and an author of the new paper. “Through our analysis, we were able to say how many more tasks could be done using the same amount of computing power after an algorithm improved. As problems increase to billions or trillions of data points, algorithmic improvement becomes substantially more important than hardware improvement. In an era where the environmental footprint of computing is increasingly worrisome, this is a way to improve businesses and other organizations without the downside.”
Reference: “How Fast Do Algorithms Improve?” by Yash Sherry and Neil C. Thompson, 20 September 2021, Proceedings of the IEEE.
DOI: 10.1109/JPROC.2021.3107219
Thompson wrote the paper alongside MIT visiting student Yash Sherry. The paper is published in the Proceedings of the IEEE. The work was funded by the Tides Foundation and the MIT Initiative on the Digital Economy.