Amdahl's Law has guided computer scientists for more than 55 years, and it continues to shape how we design and optimize computing systems today. It provides guidance for decisions about optimizing system performance, often in the realm of parallel computing.
The law states that the maximum potential speedup of a program or system is limited by its most significant bottleneck: the portion of the system or program that takes the longest to complete.
Amdahl's Law is popular among developers because it offers a simple and efficient way to determine the maximum potential for improving system performance. By identifying the bottleneck in a program or system, developers can focus their efforts on optimizing that particular component instead of wasting time working on parts that will have minimal returns.
Parallel processing & the origin of Amdahl's Law
Amdahl’s Law is named after Gene Myron Amdahl, an American computer scientist who focused on enhancing computer performance using different methods, including parallel processing. He presented at the American Federation of Information Processing Societies (AFIPS) Spring Joint Computer Conference in 1967, where he discussed his methods of anticipating the constraints of parallel computing.
Parallel computing is when multiple processors, cores, or computers are used to perform many tasks or calculations simultaneously. This helps to speed up data processing and computation, allowing for greater scalability than traditional methods.
However, adding more processors to a system does not automatically make things faster with parallel computing. First you need to identify the limitations of the system as a whole. In that presentation in 1967, Amdahl outlined just how to do so — a process that would henceforth be known as Amdahl’s Law.
What is Amdahl's Law?
Although it was presented back in 1967, Amdahl's Law remains a crucial basis for numerous computing principles and technologies and continues to be utilized widely to this day.
Amdahl's Law is a formula that predicts the potential speed increase in completing a task with improved system resources while keeping the workload constant. The theoretical speedup is always limited by the part of the task that cannot benefit from the improvement.
Amdahl's Law applies only to cases where the problem size, or workload, is fixed.
Formula for Amdahl's Law
The base formula for Amdahl's Law is S = 1 / (1 - p + p/s), where…
- S is the speedup of a process
- p is the proportion of a program that can be made parallel, and
- s is the speedup factor of the parallel portion.
This formula shows that the maximum speedup of a process is limited by the proportion of the program that can be made parallel. Even if the parallel portion is made infinitely fast, the overall speedup can never exceed 1 / (1 - p), the reciprocal of the serial fraction.
In other words, it does not matter how many processors you have or how much faster each processor may be; the maximum improvement in speed will always be limited by the most significant bottleneck in a system.
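As a quick sketch, the formula can be expressed as a small Python function (the function name and sample values here are illustrative, not part of the original law):

```python
def amdahl_speedup(p: float, s: float) -> float:
    """Overall speedup S when a fraction p of the work
    is sped up by a factor s (Amdahl's Law)."""
    return 1 / ((1 - p) + p / s)

# A program that is 90% parallelizable, run with a 4x speedup
# of the parallel portion (e.g., 4 processors):
print(round(amdahl_speedup(0.9, 4), 2))          # 3.08

# Even with a near-infinite speedup of the parallel portion,
# the overall speedup approaches but never exceeds 1 / (1 - p) = 10:
print(round(amdahl_speedup(0.9, 1_000_000), 2))  # 10.0
```

Note how the second call makes the serial bottleneck visible: the remaining 10% of the work caps the whole program at a 10x speedup, no matter how much hardware is added.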
Examples of Amdahl's Law
Now let’s look at a couple of examples to bring this law to life.
Example 1: Teammate collaboration
Your company has scheduled a meeting with three team members first thing in the morning. Each team member needs to get to the office, but the meeting can't start until everyone has arrived.
Two of the team members drive themselves to the office, while the third rides their bicycle.
According to Amdahl’s Law, if the team wants to start the meeting earlier, they need to focus on the performance of the cyclist. Even if one of the drivers drives faster than usual, the others still have to wait for the bicyclist, who is the bottleneck in this situation.
Example 2: Website performance
An online store has just released a new product and they are expecting a huge surge in web traffic. To handle the increased demand, they have decided to add more servers to their system. While additional servers can help them handle the increased traffic, they are limited by how quickly each page is served — that will be the bottleneck of the system.
According to Amdahl's Law, they need to focus on optimizing their webpages rather than simply adding more servers to help with the surge in traffic.
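To make the store's trade-off concrete, here is a hedged sketch using the formula from earlier; the 70% figure is an assumed, illustrative fraction of request-handling work that scales with the number of servers:

```python
def amdahl_speedup(p: float, s: float) -> float:
    """Overall speedup when a fraction p of the work is sped up by a factor s."""
    return 1 / ((1 - p) + p / s)

# Assumption (illustrative): 70% of each request's time scales with the
# number of servers; the remaining 30% (serving each page) is serial.
P_PARALLEL = 0.7

for servers in (2, 4, 8, 16, 64):
    print(f"{servers:>2} servers -> {amdahl_speedup(P_PARALLEL, servers):.2f}x")

# The speedup flattens out and can never reach 1 / (1 - 0.7) ~ 3.33x,
# no matter how many servers are added.
```

Running this shows clearly diminishing returns: going from 2 to 4 servers helps noticeably, while going from 16 to 64 barely moves the needle, which is exactly why optimizing the page-serving path matters more than adding hardware.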
Disadvantages of Amdahl's Law
One of the main disadvantages of Amdahl's Law is that it does not apply to situations where the problem size or workload can grow, because the formula assumes both are fixed. As a result, any improvements in system performance must come from optimizing existing components of the system, which may be limited in scope.
In addition, Amdahl's Law does not account for any unexpected bottlenecks that may arise during the optimization process, such as our cyclist experiencing a flat tire on their way to the meeting. This means there is no guarantee that the performance improvements achieved will be optimal or even significant enough to justify the effort and resources put into optimizing a system.
Summarizing optimization: maximizing your returns
Amdahl's Law is an important principle in computing that provides guidance when optimizing system performance by identifying the bottleneck in a program or system. This allows developers to focus their efforts on optimizing that particular component instead of wasting time working on parts that will have minimal returns. Understanding this law offers developers an efficient way to maximize their system performance and take advantage of the potential gains from parallel computing.
As we continue to move forward with computing technology, Amdahl's Law will remain a crucial factor to consider when optimizing systems in order to achieve maximum performance.