Big O notation is a special notation that describes the performance, or efficiency, of an algorithm. In other words, it tells us how the runtime of an algorithm grows as the input size increases, which gives us a way to compare and analyze different algorithms based on their efficiency.
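To make this concrete, here is a minimal sketch in Python (the function names are illustrative, not part of any library):

```python
def get_first(items):
    # O(1), constant time: the work stays the same no matter how large items is.
    return items[0]

def contains(items, target):
    # O(n), linear time: in the worst case every element is inspected once,
    # so the number of steps grows in proportion to the size of the input.
    for item in items:
        if item == target:
            return True
    return False
```

Doubling the length of `items` leaves `get_first` unaffected, but roughly doubles the worst-case work done by `contains`; that rate of growth is exactly what Big O notation captures.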
Let's have a look at a list of the most common running times for algorithms using Big O notation, along with examples...