## Algorithm design:

- Pseudo-code and flowcharts
- Termination and correctness
- Complexity
- Big O and time complexity
- See software design

## Paradigms:

See the following lecture for more information: decision making

## Introduction

Algorithms have always existed: an algorithm is a thought process described in a logical way. Although this system seems indispensable to many sciences today, the systematization of algorithmic writing has evolved with time and technology. It began in the time of the Babylonians and was concretely established by René Descartes in the general method proposed in *Discours de la méthode* in 1637.

Modern algorithmics, including a language called a « program », made its debut soon after, in 1677. The term we use today, « algorithm », appeared two centuries later. Twentieth- and twenty-first-century algorithms are often based on formalisms such as Turing machines, which provide a scientific framework for studying the properties of algorithms.

With the rise of computer science, algorithmics developed considerably in the second half of the twentieth century, and Donald Knuth, author of the treatise *The Art of Computer Programming*, laid a rigorous mathematical foundation for their analysis.

Many formal or theoretical tools have been developed to describe algorithms, study them, express their qualities, and compare them:

- algorithmic structures: control structures and data structures;
- notions of correctness, completeness and termination;
- complexity theory.

## Algorithmic structures

The flow of instructions in an algorithm is also known as the execution flow of a program, and can be represented by a control flow graph or an automaton. Programming paradigms that use multiple processors face the additional problem of assigning tasks to the processors.

Every programming language provides its own control structures and data structures. The writing of an algorithm is very often based on widely used languages such as C, Java, etc.
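As a minimal sketch (the function name is illustrative, not from the lecture), the basic algorithmic structures can be shown in a few lines of Python: sequence, selection, iteration, and a data structure.

```python
def count_even(numbers):
    """Count the even values in a list."""
    count = 0                  # sequence: a simple assignment
    for n in numbers:          # iteration: a control structure over a data structure (list)
        if n % 2 == 0:         # selection: a conditional control structure
            count += 1
    return count

print(count_even([1, 2, 3, 4, 5, 6]))
```

The same three building blocks (sequence, selection, iteration) exist in essentially every imperative language, which is why pseudo-code written with them translates readily into C, Java, etc.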

## Correctness, completeness and termination

Termination is the assurance that the algorithm will finish in finite time. A termination proof usually involves a positive integer function that strictly decreases at each step or iteration of the algorithm.

Given the guarantee that an algorithm will terminate, the proof of correctness must provide the assurance that if the algorithm ends by giving a solution, then this solution is correct (in the sense that it answers the problem posed).

The proof of completeness guarantees that, for a given problem space, the algorithm gives a solution for each of its inputs.

**Termination:** the algorithm finishes in finite time (it has a well-defined time complexity).

**Correctness:** any answer computed by the algorithm is a solution of the problem.

**Completeness:** the algorithm can compute an answer for any instance of the problem.
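A classic illustration of a termination argument (chosen here as an example, not taken from the lecture) is Euclid's algorithm: the second argument is a non-negative integer that strictly decreases at each iteration, so the loop must terminate.

```python
def gcd(a, b):
    """Euclid's algorithm for the greatest common divisor of non-negative integers.

    Termination variant: b is a non-negative integer and a % b < b,
    so b strictly decreases at every iteration and must reach 0.
    """
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))
```

Correctness follows from the invariant gcd(a, b) = gcd(b, a mod b), and completeness from the fact that the function is defined for every pair of non-negative integers (not both zero).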

## Complexity

Complexity analysis evaluates the « natural » performance of an algorithm, independently of the environment in which it will be implemented.

There are several types of analysis for the performance of an algorithm:

- Optimistic analysis (best-case analysis).
- Average-case analysis (averaged over the possible inputs).
- Pessimistic analysis (worst-case analysis).
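The three kinds of analysis above can be made concrete with linear search (an illustrative example, not from the lecture): counting the comparisons performed shows the best case, the worst case, and everything in between.

```python
def linear_search_steps(items, target):
    """Return the number of comparisons linear search performs
    before finding target (or scanning the whole list)."""
    steps = 0
    for value in items:
        steps += 1
        if value == target:
            return steps
    return steps

data = list(range(100))
# Best case: the target is the first element -> 1 comparison.
# Worst case: the target is last or absent -> 100 comparisons.
# Average case (uniform random target in the list) -> about 50 comparisons.
print(linear_search_steps(data, 0), linear_search_steps(data, 99))
```

This is why a single "running time" is rarely meaningful on its own: best-, average- and worst-case analyses can differ by a large factor for the same algorithm.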

Complexity classes are sets of problems that all have the same complexity according to a certain criterion. The best-known classes are defined on Turing machines (deterministic or non-deterministic – see the course on automata) with time and space constraints. There is also a relationship between algorithmic complexity and the energy cost of running an algorithm, known as Landauer's principle.

Here is a list of the most frequently encountered complexities:

Notation | Class |
---|---|
O(1) | constant complexity: an assignment, a basic calculation, etc. |
O(log n) | logarithmic complexity: dichotomy, tree |
O(n) | linear complexity: a loop |
O(n log n) | quasi-linear complexity: divide and conquer, tree |
O(n²) | quadratic complexity: nesting two loops |
O(n³) | cubic complexity: nesting three loops |
O(nᵖ) | polynomial complexity: nesting several loops |
O(n^log n) | quasi-polynomial complexity: divide and conquer |
O(2ⁿ) | exponential complexity: tree, brute force |
O(n!) | factorial complexity: naive approach to combinatorial enumeration |
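To make the difference between the linear and logarithmic rows tangible, here is a sketch (the function name is illustrative) that counts the comparisons of binary search, the classic O(log n) dichotomy on a sorted list:

```python
def binary_search_steps(sorted_items, target):
    """Count the comparisons binary search performs on a sorted list.

    Each iteration halves the search interval, so the count
    is at most about log2(len(sorted_items)) + 1.
    """
    lo, hi, steps = 0, len(sorted_items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return steps
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

data = list(range(1_000_000))
# A linear scan could need up to 1,000,000 comparisons;
# binary search needs at most about 20 (log2 of 1,000,000).
print(binary_search_steps(data, 999_999))
```

The gap between 1,000,000 and 20 comparisons on the same input is exactly what the notations in the table above are designed to capture.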