**Table of contents**

1. What is an algorithm

2. The complexity of the algorithm

3. Time complexity

    3.1 Definition of time complexity

    3.2 Asymptotic Notation for Big O

4. Space complexity

5. Comparison of time complexity and space complexity

# 1. What is an algorithm

An algorithm is a description of the steps for solving a particular class of problems: a well-specified, finite sequence of instructions, which literally means a method for computing.

## Algorithm characteristics

- **Finiteness:** an algorithm always terminates after a finite number of steps.
- **Determinism:** each step has a definite meaning, and the same input always produces the same output.
- **Input:** an algorithm has zero or more inputs.
- **Output:** an algorithm has one or more outputs.
- **Feasibility:** each step of the algorithm can be carried out by a finite number of executions of basic operations that have already been implemented.

# 2. The complexity of the algorithm

The quality of an algorithm is usually measured by its complexity. So, what is complexity?

After an algorithm is written into an executable program, it needs **time resources and space (memory) resources** to run. Therefore, the quality of an algorithm is generally measured along two dimensions, **time and space**, namely **time complexity and space complexity**.

**Time complexity mainly measures how fast an algorithm runs, while space complexity mainly measures the extra space required to run an algorithm.**

# 3. Time complexity

## 3.1 Definition of time complexity

The time complexity of an algorithm is a function that quantitatively describes its running time. (Note: the exact execution time of an algorithm cannot be calculated theoretically, but the time an algorithm takes is proportional to the number of times its statements execute, so the number of executions of the basic operations in the algorithm is taken as its time complexity.)

In layman's terms: the time complexity is the expression for the number of times the most frequently executed operation in the algorithm runs, with constant coefficients dropped and only the highest-order term kept.

## 3.2 Asymptotic Notation for Big O

To derive the big-O order:

1. Replace all additive constants in the running time with the constant 1. (If an algorithm executes only a constant number of operations, its time complexity is O(1).)

2. In the modified running-time function, keep only the highest-order term.

3. If the highest-order term exists and its coefficient is not 1, drop that coefficient. The result is the big-O order.
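As a quick sketch of rule 1 (an illustrative example of my own, not from the text above): a function that performs a fixed number of basic operations, regardless of its input, has time complexity O(1).

```c
// Runs a fixed 100 iterations no matter what N is,
// so the operation count is a constant and the
// time complexity is O(1).
int Func_const(int N) {
    int count = 0;
    for (int k = 0; k < 100; ++k) {
        ++count;
    }
    return count;
}
```

Whether N is 10 or 1,000,000, the function does the same amount of work, which is exactly what "replace additive constants with 1" captures.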

For example:

```c
int fun(int x) {
    int count = 0;
    for (int i = 0; i < x; i++) {
        for (int j = 0; j < x; j++) {
            count++;
        }
    }
    return count;  // return the execution count of the inner statement
}
```

Take this algorithm as an example: `count++` executes x*x times, so its time complexity is O(x^2).

```c
void Func1(int N) {
    int count = 0;
    for (int i = 0; i < N; ++i) {
        for (int j = 0; j < N; ++j) {
            ++count;
        }
    }
    for (int k = 0; k < 2 * N; ++k) {
        ++count;
    }
    int M = 10;
    while (M--) {
        ++count;
    }
}
```

In this example, the statement that runs the most is `++count`; its total execution count is F(N) = N*N + 2*N + 10. Applying the big-O asymptotic notation, we keep only the highest-order term N*N = N^2, so the time complexity of the above algorithm is O(N^2).

Why is that? **Because as N grows toward infinity, only the highest-order term has a significant influence on the whole; the constant and lower-order terms become negligible. Time complexity also describes the worst-case result**, which is why it is expressed in big-O asymptotic notation.

# 4. Space complexity

## Definition

Space complexity is also a mathematical expression. It measures not how many bytes of space a program occupies, but the amount of storage space an algorithm temporarily occupies during its operation.

The rules for calculating space complexity are basically the same as for time complexity, and it also uses big-O asymptotic notation.

Note: the stack space a function needs to run (parameters, local variables, some register information, etc.) is determined at compile time, **so the space complexity is mainly determined by the additional space the function explicitly requests at runtime.**
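As an illustrative sketch of that note (my own example, not from the original text): a function that explicitly requests a buffer proportional to N at runtime has space complexity O(N), in contrast to a function that only uses a constant number of local variables.

```c
#include <stdlib.h>

// Explicitly requests n extra integers at runtime,
// so the space complexity is O(N).
long SumFirstN(int n) {
    int* buf = malloc(n * sizeof(int));  // O(N) additional space
    if (buf == NULL)
        return -1;
    long sum = 0;
    for (int i = 0; i < n; ++i) {
        buf[i] = i + 1;  // store 1..n
        sum += buf[i];
    }
    free(buf);  // the space is released and can be reused
    return sum;
}
```

The fixed-size locals (`sum`, `i`, the pointer itself) contribute only a constant, so the `malloc` of N integers dominates.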

## Example:

```c
#include <assert.h>
#include <stddef.h>

void Swap(int* p, int* q) {
    int tmp = *p;
    *p = *q;
    *q = tmp;
}

void BubbleSort(int* a, int n) {
    assert(a);
    for (size_t end = n; end > 0; --end) {
        int exchange = 0;
        for (size_t i = 1; i < end; ++i) {
            if (a[i - 1] > a[i]) {
                Swap(&a[i - 1], &a[i]);
                exchange = 1;
            }
        }
        if (exchange == 0)
            break;
    }
}
```

In the above algorithm, **only a constant number of extra variables (`end`, `exchange`, `i`) are needed at runtime. Since this extra space is a constant, the space complexity expressed in big-O asymptotic notation is O(1).**

# 5. Comparison of time complexity and space complexity

Note: time is cumulative and cannot be recovered once spent, while space can be reclaimed and reused.

Here we use the recursive Fibonacci algorithm to compare time complexity and space complexity.
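A minimal sketch of that comparison (assuming the standard naive recursion): recursive Fibonacci makes roughly 2^N calls, so its time complexity is O(2^N); but the call stack never holds more than N frames at once, because the space of one recursive branch is reused by the next, so its space complexity is only O(N).

```c
// Naive recursive Fibonacci: time O(2^N), because every call
// spawns two more calls; space O(N), because the recursion
// depth (stack frames alive at the same time) is at most N.
long Fib(int N) {
    if (N < 3)
        return 1;
    return Fib(N - 1) + Fib(N - 2);
}
```

This illustrates the note above: the exponential number of calls all cost time that accumulates, while the stack space of finished calls is recycled, keeping the space cost linear.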