
Good review: https://www.sahinarslan.tech/posts/step-by-step-big-o-complexity-analysis-guide-using-javascript
Time Complexity → The amount of time an algorithm takes to run, in relation to input size.
Space Complexity → The amount of memory required to solve the problem, in relation to input size.
Problems can be broken into best case, worst case, and average case. The complexities can vary between these cases, and each has its own notation.
Big O notation (most common, displayed above) - upper bound on growth; usually used to describe the worst case
Omega notation - lower bound on growth; usually used to describe the best case
Theta notation - tight bound (both upper and lower); the algorithm grows at exactly this rate
Constant Time (O(1)) → It will always take the same amount of time, regardless of input size.
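A quick JavaScript sketch, using array indexing as a stand-in for any constant-time operation:

```javascript
// O(1): indexing touches a single slot no matter how
// long the array is, so the time never changes.
function first(arr) {
  return arr[0];
}
```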
Logarithmic Time (O(log n)) → When the algorithm cuts the input size down (typically in half) at each step. This can be found in binary search and balanced binary trees.
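Binary search is the classic example, sketched here in JavaScript. The search window halves every iteration, so a sorted array of n elements needs at most about log₂ n comparisons:

```javascript
// O(log n): each pass discards half of the remaining
// search window in a sorted array.
function binarySearch(sorted, target) {
  let lo = 0, hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid; // found: return index
    if (sorted[mid] < target) lo = mid + 1; // target is in right half
    else hi = mid - 1;                      // target is in left half
  }
  return -1; // not present
}
```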
Linear Time (O(n)) → When the running time increases linearly with the size of data. All data must be accessed in the algorithm.
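For example, finding the maximum of an unsorted array in JavaScript must visit every element once:

```javascript
// O(n): one pass over the data; doubling the input
// roughly doubles the running time.
function max(arr) {
  let best = -Infinity;
  for (const x of arr) {
    if (x > best) best = x;
  }
  return best;
}
```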
Quasilinear Time (O(n log n)) → When a logarithmic number of linear passes is needed (or a logarithmic-time operation per element). This is found in efficient comparison sorts like mergesort.
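A minimal mergesort in JavaScript shows where the n log n comes from: the array is split about log n times, and each level does a linear amount of merging work:

```javascript
// O(n log n): log n levels of splitting, with an O(n)
// merge of two sorted halves at each level.
function mergeSort(arr) {
  if (arr.length <= 1) return arr; // base case: already sorted
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  // Merge: repeatedly take the smaller front element.
  const merged = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}
```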
Quadratic Time (O(n²)) → When a linear time operation is needed for each piece of the data. This can be seen in an algorithm like bubble sort.
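Bubble sort makes the nesting visible in JavaScript: a linear inner pass runs once for each element in the outer loop, giving roughly n × n comparisons:

```javascript
// O(n^2): the inner loop (a linear pass of neighbor
// comparisons) runs once per element of the outer loop.
function bubbleSort(arr) {
  const a = [...arr]; // copy so the input isn't mutated
  for (let i = 0; i < a.length; i++) {
    for (let j = 0; j < a.length - 1 - i; j++) {
      if (a[j] > a[j + 1]) {
        [a[j], a[j + 1]] = [a[j + 1], a[j]]; // swap out-of-order neighbors
      }
    }
  }
  return a;
}
```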
Exponential Time (O(2^n)) → When the time taken doubles with each addition to the input. This is seen in brute-force algorithms. Another example is the naive recursive calculation of the Fibonacci numbers!
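The naive recursive Fibonacci in JavaScript: each call spawns two more calls, so the call tree roughly doubles with every extra unit of n:

```javascript
// O(2^n): fib(n) calls fib(n-1) and fib(n-2), recomputing
// the same subproblems over and over.
function fib(n) {
  if (n < 2) return n; // base cases: fib(0) = 0, fib(1) = 1
  return fib(n - 1) + fib(n - 2);
}
```

(Memoizing the results brings this down to linear time, which is why the naive version is the textbook exponential example.)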
Factorial Time (O(n!)) → When the time taken grows factorially with the addition of data.
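For example, generating every permutation of n items in JavaScript produces n! results, so the work is factorial by definition:

```javascript
// O(n!): for each of the n choices of first element, recurse
// on the remaining n-1 elements, yielding n! permutations.
function permutations(arr) {
  if (arr.length <= 1) return [arr];
  const result = [];
  for (let i = 0; i < arr.length; i++) {
    const rest = [...arr.slice(0, i), ...arr.slice(i + 1)];
    for (const perm of permutations(rest)) {
      result.push([arr[i], ...perm]);
    }
  }
  return result;
}
```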

