
How Quickly Do Algorithms Improve? 



Algorithms are, in a sense, like a parent to a computer: they tell the computer how to make sense of information so that it can, in turn, make something useful out of it.


An algorithm's efficiency determines how much work the computer has to do. Computer performance has improved dramatically thanks to advances in computing hardware, famously described by Moore's Law, whose remaining lifespan is much debated; but hardware is only one part of the picture.


A second trend is taking place behind the scenes: algorithms are being improved, so that less computing power is needed. The importance of algorithmic efficiency may be underappreciated, but you would notice if your trusted search engine suddenly became one-tenth as fast, or if navigating large datasets felt like wading through thick sludge.


This prompted researchers at MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) to ask the question: How quickly do algorithms improve?


Previously available information on this topic was largely anecdotal, consisting of case studies of particular algorithms that were assumed to be representative of the broader picture. Faced with this dearth of evidence, the team set out to crunch data from 57 textbooks and more than 1,110 research papers in order to trace the evolution of algorithms over time. Some of the papers directly reported how good new algorithms were; for others, the authors had to reconstruct the algorithms from "pseudocode," shorthand descriptions of an algorithm that capture only its most fundamental details.
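
For readers unfamiliar with the term, here is a hypothetical illustration of the kind of shorthand involved: pseudocode for binary search roughly as a textbook might print it, followed by one possible Python rendering. Both are sketches for illustration, not material reconstructed by the study.

```python
# Pseudocode roughly as a textbook might print it (illustrative only):
#
#   BINARY-SEARCH(A, x):
#       lo <- 0, hi <- length(A) - 1
#       while lo <= hi:
#           mid <- floor((lo + hi) / 2)
#           if A[mid] = x: return mid
#           if A[mid] < x: lo <- mid + 1
#           else:          hi <- mid - 1
#       return NOT-FOUND

def binary_search(a, x):
    """Return the index of x in sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # integer midpoint
        if a[mid] == x:
            return mid
        if a[mid] < x:
            lo = mid + 1              # discard the lower half
        else:
            hi = mid - 1              # discard the upper half
    return -1

print(binary_search([2, 3, 5, 7, 11, 13], 7))  # -> 3
```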


In total, the team examined 113 "algorithm families": collections of algorithms solving the same problem, chosen because computer science textbooks had highlighted them as the most important. For each of the 113, the team reconstructed its history, tracking each time a new algorithm was proposed for the problem and making special note of those that were more efficient than their predecessors. Ranging widely in performance and separated by decades, from the 1940s to now, the families contained an average of eight algorithms each, of which a couple improved the family's efficiency. To share this collected knowledge, the team created Algorithm-Wiki.org.


The scientists tracked how these families progressed over time, focusing on the most-studied feature of the algorithms: how quickly they could guarantee to solve the problem (known as "worst-case time complexity" in computer speak). What emerged was a great deal of variability, along with some important insights into how transformative algorithmic improvement has been for computer science.
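
To make "worst-case time complexity" concrete, here is an illustrative sketch (a generic textbook example, not one of the study's reconstructions) comparing two algorithms from the same family, sorting: insertion sort's worst case grows as n², while merge sort guarantees on the order of n log n, and instrumenting both with a comparison counter shows the gap directly.

```python
# Two algorithms from the same "family" (sorting), instrumented to count
# comparisons so their worst-case behavior is visible. Illustrative only.

def insertion_sort(a):
    """O(n^2) worst case. Returns (sorted list, comparison count)."""
    a, comps = list(a), 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            comps += 1
            if a[j] <= key:
                break
            a[j + 1] = a[j]           # shift the larger element right
            j -= 1
        a[j + 1] = key
    return a, comps

def merge_sort(a):
    """O(n log n) worst case. Returns (sorted list, comparison count)."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, cl = merge_sort(a[:mid])
    right, cr = merge_sort(a[mid:])
    merged, comps, i, j = [], cl + cr, 0, 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:]); merged.extend(right[j:])
    return merged, comps

worst = list(range(1000, 0, -1))      # reversed input: insertion sort's worst case
print(insertion_sort(worst)[1])       # 499,500 comparisons (n^2 scale)
print(merge_sort(worst)[1])           # a few thousand (n log n scale)
```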


The researchers found that for large computing problems, 43 percent of algorithm families saw year-on-year improvements equal to or greater than the much-heralded gains from Moore's Law. In 14 percent of problems, the performance improvement from better algorithms vastly outpaced what came from better hardware. Because the gains from algorithm improvement were especially large for big-data problems, the importance of those gains has grown in recent decades.
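
For a rough sense of the benchmark being used here: Moore's Law is commonly stated as a doubling of transistor counts about every two years, which works out to a compound gain of roughly 41 percent per year. The following back-of-the-envelope calculation (an illustration, not the paper's methodology) shows how yearly rates like these compound over a decade.

```python
# Back-of-the-envelope compounding of yearly improvement rates.
# Illustrative only; not the study's methodology.

moore_yearly = 2 ** (1 / 2) - 1           # 2x every 2 years ~ 41.4% per year
print(f"Moore's Law yearly gain: {moore_yearly:.1%}")

for rate in (0.10, moore_yearly, 1.00):   # 10%/yr, the Moore rate, 100%/yr
    print(f"{rate:5.1%}/yr over 10 years -> {(1 + rate) ** 10:8.1f}x total")
```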


The single most significant change the authors observed came when an algorithm family transitioned from exponential to polynomial complexity. Solving an exponential problem is like trying to guess the combination on a lock. With a single 10-position dial, the task is easy. With four dials, like a bicycle lock, it is hard enough that no one steals your bike, but still conceivable that you could try every combination. With 50 dials, it is almost impossible: it would take far too many steps. Problems with exponential complexity are like that for computers: as the problems get bigger, they quickly outpace the computer's ability to handle them. Finding a polynomial algorithm often solves this, making it possible to tackle problems in a way that no amount of hardware improvement could ever accomplish.
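
The lock analogy maps directly onto numbers: a lock with d dials of 10 positions each has 10^d possible combinations, so brute force grows exponentially in d, whereas a polynomial-time algorithm's step count grows only as a fixed power of the input size. A minimal sketch of the contrast:

```python
# Exponential vs. polynomial growth, using the lock analogy from the text.
# Illustrative sketch only.

for dials in (1, 4, 50):
    print(f"{dials:2d} dial(s): {10 ** dials:.3g} combinations to try")
# 1 dial   -> 10        (trivial)
# 4 dials  -> 10,000    (a patient thief could manage it)
# 50 dials -> 1e+50     (hopeless for any conceivable hardware)

# A polynomial algorithm, even a cubic one, stays tractable at sizes
# where an exponential one has long since become impossible.
for n in (10, 100, 1000):
    print(f"n={n:5d}: 2^n ~ {2.0 ** n:9.3g} steps, n^3 = {n ** 3:,} steps")
```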


As rumblings that Moore's Law is coming to an end spread through global conversations, the researchers predict that computing users will increasingly need to turn to areas like algorithms for performance improvements. The findings, the team says, confirm that historically the gains from algorithms have been enormous, so the potential for further gains is real. But if the gains come from algorithms rather than hardware, they will show up differently: hardware improvement under Moore's Law happens smoothly over time, whereas algorithmic improvements come in discrete steps that are typically large but infrequent.


"This is the first paper to demonstrate how quickly algorithms are improving across a broad range of examples," says Neil Thompson, an MIT research scientist at the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Sloan School of Management, and senior author on the new paper. "Through our research, we were able to determine how many more tasks could be completed with the same amount of computing power after an algorithm was improved. As problems grow to billions or trillions of data points, algorithmic improvement overtakes hardware improvement as the most important factor. In an era when the environmental impact of computing is increasingly concerning, this is a way to improve businesses and other organizations without the downside."
