Big O notation is a way to measure the time and space complexity of an algorithm. It is used to compare the efficiency of different algorithms by measuring the number of operations each algorithm performs in the worst-case scenario.
The worst-case scenario is the input on which an algorithm performs the most operations. Big O notation describes this worst case: for example, if an algorithm takes N ** 2 operations to complete in the worst case, we say it has a time complexity of O(N ** 2).
Big O notation is usually used to describe the time complexity of an algorithm, but it can also describe the space complexity: the amount of memory an algorithm uses to complete a task.
In general, we want to use algorithms with the lowest possible time and space complexity, as they run faster and use less memory.
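As a minimal sketch of the space complexity idea, compare the two functions below. The names build_list and sum_values are hypothetical, introduced only for this illustration: the first keeps n items in memory, so its extra space grows as O(N), while the second keeps a single accumulator, so its extra space stays O(1).
def build_list(n):
    # Stores n items, so the extra memory grows linearly: O(N) space
    items = []
    for i in range(n):
        items.append(i)
    return items

def sum_values(n):
    # Uses a single accumulator no matter how large n is: O(1) space
    total = 0
    for i in range(n):
        total += i
    return total

build_list(10)   # a list with 10 elements -> O(N) space
sum_values(10)   # one integer -> O(1) space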
We also have Omega (Ω), which describes the best-case scenario, and Theta (Θ), which describes the average case; Big O (the most used) describes the worst case.
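To see the difference between the three cases, consider a linear search. This is a minimal sketch with a hypothetical helper linear_search, not part of the examples that follow: the best case (Omega) is finding the target in the first position, the average case (Theta) is finding it somewhere in the middle, and the worst case (Big O) is checking all N items.
def linear_search(items, target):
    # Best case, Omega(1): target is the first element
    # Average case, Theta(N): target is roughly in the middle
    # Worst case, O(N): target is last or absent, so every item is checked
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

linear_search([4, 8, 15, 16, 23, 42], 4)    # best case: 1 comparison
linear_search([4, 8, 15, 16, 23, 42], 99)   # worst case: 6 comparisons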
O(N): the worst case where the code takes N operations to finish, most common with a single ("simple") loop.
def print_items(n):
    # The loop body runs n times, so the time complexity is O(N)
    for i in range(n):
        print(i)

print_items(10)  # prints 0 through 9: 10 operations
O(N ** 2): the worst case where the code takes N ** 2 operations to finish, most common with a loop inside a loop.
If N = 10, it takes 100 iterations to finish.
def print_items(n):
    # The inner loop runs n times for each of the n outer iterations,
    # so the time complexity is O(N ** 2)
    for i in range(n):
        for j in range(n):
            print(i, j)

print_items(10)  # prints 100 pairs, from (0, 0) to (9, 9)