Data Structures (1): Time Complexity

Time complexity

Find the execution-count function T(n) of an algorithm ==> derive the algorithm's time complexity.

1. The constant term has little effect on the growth rate of a function, so when T(n) = c, where c is a constant, we say the time complexity of the algorithm is O(1); when T(n) is not just a constant, the constant term is simply dropped.

For example:
T(n) = 2, so the time complexity of the algorithm is O(1).
T(n) = n + 29, so the time complexity is O(n).
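
A minimal sketch of an O(1) routine (this function is my own illustration, not from the original post): no matter how large n is, it always executes the same two statements, so T(n) = 2.

int firstPlusOne(int arr[], int n) {
    int x = arr[0];   // 1 execution
    return x + 1;     // 1 execution, so T(n) = 2 and the complexity is O(1)
}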

2. Higher-order terms have the greatest impact on the growth rate of a function: n^3 grows much faster than n^2, and n^2 grows much faster than n. Since high precision is not required here, we simply ignore the lower-order terms.

For example:
T(n) = n^3 + n^2 + 29, so the time complexity is O(n^3).

3. Because the order of a function has the most significant effect on its growth rate, we also ignore the constant coefficient of the highest-order term.

For example:
T(n) = 3n^3, so the time complexity is O(n^3).

To put it all together: if the execution count of an algorithm is T(n), keep only the highest-order term and drop its coefficient to obtain a function f(n); the time complexity of the algorithm is then O(f(n)).
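
A quick worked example combining the three rules (the numbers are my own illustration):

T(n) = 5n^3 + 2n^2 + 7n + 100
Drop the lower-order terms and the constant: 5n^3
Drop the coefficient of the highest-order term: f(n) = n^3
So the time complexity is O(n^3).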

Small examples

1. For a single loop, suppose the time complexity of the loop body is O(n) and the loop runs m times; then
the time complexity of the whole loop is O(n × m). A sketch with an O(n) loop body follows the example below.

void aFunc(int n) {
    for(int i = 0; i < n; i++) {         // the loop runs n times
        printf("Hello, World!\n");      // the loop body is O(1)
    }
}

At this time, the time complexity is O(n × 1), that is, O(n).
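
In the example above the loop body is O(1). As a minimal sketch of the rule exactly as stated, with an O(n) body repeated m times (the helper sumArray and the extra parameters are my own additions, not from the original post):

int sumArray(int arr[], int n) {               // summing n elements: O(n)
    int s = 0;
    for (int i = 0; i < n; i++) {
        s += arr[i];
    }
    return s;
}

void bFunc(int arr[], int n, int m) {
    for (int i = 0; i < m; i++) {              // the loop runs m times
        printf("%d\n", sumArray(arr, n));      // the loop body is O(n)
    }
}

Here the body costs O(n) and runs m times, so the total is O(n × m), matching the rule.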

2. For nested loops, suppose the time complexity of the innermost loop body is O(n) and the iteration counts of the enclosing loops are a, b, c, ...; then the time complexity of the whole nest is O(n × a × b × c × ...). Such loops should be analyzed from the inside out. A sketch with distinct iteration counts is given after the example below.

void aFunc(int n) {
    for(int i = 0; i < n; i++) {         // the outer loop runs n times
        for(int j = 0; j < n; j++) {       // the inner loop runs n times
            printf("Hello, World!\n");      // the loop body is O(1)
        }
    }
}

At this time, the time complexity is O(n × n × 1), that is, O(n^2).
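
In the example above both loops happen to run n times. A minimal sketch with distinct iteration counts (the function name and the parameters m and k are my own additions):

void cFunc(int n, int m, int k) {
    for (int i = 0; i < n; i++) {              // runs n times
        for (int j = 0; j < m; j++) {          // runs m times
            for (int t = 0; t < k; t++) {      // runs k times
                printf("Hello, World!\n");     // the loop body is O(1)
            }
        }
    }
}

Analyzing from the inside out: an O(1) body repeated k, then m, then n times gives O(n × m × k).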

3. For statements or code segments executed sequentially, the total time complexity equals the largest of their individual time complexities.

void aFunc(int n) {
    // The first part has time complexity O(n^2)
    for(int i = 0; i < n; i++) {
        for(int j = 0; j < n; j++) {
            printf("Hello, World!\n");
        }
    }
    // The second part has time complexity O(n)
    for(int j = 0; j < n; j++) {
        printf("Hello, World!\n");
    }
}

At this time, the time complexity is max(O(n^2), O(n)), that is, O(n^2).

4. For conditional statements, the total time complexity equals the time complexity of the branch with the largest time complexity.

void aFunc(int n) {
    if (n >= 0) {
        // The first branch has time complexity O(n^2)
        for(int i = 0; i < n; i++) {
            for(int j = 0; j < n; j++) {
                printf("输入数据大于等于零\n");
            }
        }
    } else {
        // The second branch has time complexity O(n)
        for(int j = 0; j < n; j++) {
            printf("输入数据小于零\n");
        }
    }
}

At this time, the time complexity is max(O(n^2), O(n)), that is, O(n^2).

Exercise

Basic questions

void aFunc(int n) {
    for (int i = 0; i < n; i++) {
        for (int j = i; j < n; j++) {
            printf("Hello World\n");
        }
    }
}

Answer: T(n) = n + (n-1) + (n-2) + ... + 1 = n(n+1)/2,
so the time complexity is O(n^2).
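
A minimal sketch to check this count empirically (the counter variable and the main driver are my own additions, not part of the exercise):

#include <stdio.h>

int main(void) {
    int n = 10;
    long count = 0;                       // counts how many times the body runs
    for (int i = 0; i < n; i++) {
        for (int j = i; j < n; j++) {
            count++;                      // stands in for the printf in aFunc
        }
    }
    printf("count = %ld, n(n+1)/2 = %d\n", count, n * (n + 1) / 2);
    return 0;
}

For n = 10 both values are 55, consistent with T(n) = n(n+1)/2 and therefore O(n^2).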

Advanced questions

void aFunc(int n) {
    for (int i = 2; i < n; i++) {
        i *= 2;
        printf("%i\n", i);
    }
}

Answer: Each pass through the loop at least doubles i (it is multiplied by 2 and then incremented), so if the loop runs t times the loop condition requires roughly 2^t < n.
It follows that t ≈ log2(n), that is, T(n) = log2(n), so the time complexity is O(log2(n)), which is written simply as O(log n).
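
For comparison, a cleaner loop with the same O(log n) behavior (this variant is my own sketch, not part of the original exercise) doubles the loop variable directly in the loop header:

void dFunc(int n) {
    // i takes the values 1, 2, 4, 8, ... until it reaches n,
    // so the body runs roughly log2(n) times: O(log n).
    for (int i = 1; i < n; i *= 2) {
        printf("%d\n", i);
    }
}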

Even more advanced

long aFunc(int n) {
    if (n <= 1) {
        return 1;
    } else {
        return aFunc(n - 1) + aFunc(n - 2);
    }
}

Answer: Clearly the number of runs satisfies T(0) = T(1) = 1 and T(n) = T(n-1) + T(n-2) + 1, where the extra 1 accounts for the addition performed in each call.
T(n) = T(n-1) + T(n-2) is essentially the Fibonacci recurrence. It can be shown by induction that T(n) < (5/3)^n for n >= 1, and T(n) >= (3/2)^n for n > 4.
So the time complexity of this method can be written as O((5/3)^n), which is usually loosely stated as O(2^n).
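
A minimal sketch to observe this growth (the call counter and the main driver are my own additions): counting how many times the function is invoked follows the same recurrence C(n) = C(n-1) + C(n-2) + 1 with C(0) = C(1) = 1.

#include <stdio.h>

static long calls;                 // how many times countedFunc is invoked

long countedFunc(int n) {
    calls++;
    if (n <= 1) {
        return 1;
    }
    return countedFunc(n - 1) + countedFunc(n - 2);
}

int main(void) {
    for (int n = 10; n <= 30; n += 10) {
        calls = 0;
        countedFunc(n);
        printf("n = %d, calls = %ld\n", n, calls);
    }
    return 0;
}

The call count multiplies by roughly the golden ratio (about 1.618) each time n increases by 1, consistent with the exponential bounds above.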
