
Brief Overview of Algorithm Analysis.

Concept and definition:

An algorithm is a finite set of precise instructions, executed in finite time, for performing a computation or for solving a problem. To represent an algorithm we use pseudo-code. Pseudo-code describes an algorithm in a clear, understandable manner using English-like statements, while also giving an implementation view similar to a programming language.
Example: Write an algorithm for finding the factorial of a given number.

int fact(int n)
{
    int fact = 1;                  /* accumulator for the running product */
    for (int i = 1; i <= n; i++)
        fact = fact * i;           /* multiply in each i from 1 up to n */
    return fact;
}

This algorithm computes the factorial of n. It first assigns the value 1 to the variable fact and then, for each i from 1 to n, assigns fact the value fact*i. Finally, the value of fact is returned. The pseudo-code used here is loosely based on the programming language C.
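As a quick check (a small usage sketch, not part of the original notes; it assumes the fact routine above is compiled in the same file):

#include <stdio.h>

int main(void)
{
    printf("fact(5) = %d\n", fact(5));   /* prints 120, i.e. 1*2*3*4*5 */
    return 0;
}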

Characteristics (properties) of the algorithm:

Input: An algorithm has input values from a specified set.
Output: From each set of input values, an algorithm produces output values from a specified set. The output values are the solution to the problem.
Definiteness: The steps of an algorithm must be defined precisely.
Correctness: An algorithm should produce the desired output after a finite (but perhaps large) number of steps for any input in the set.
Effectiveness: It must be possible to perform each step of an algorithm exactly and in a finite amount of time.
Generality: The devised algorithm should be capable of solving a problem of a similar kind for all possible inputs.

Complexity of algorithms:

When an algorithm is designed, it must be analyzed for its efficiency, i.e. its complexity. The complexity of an algorithm is defined in terms of the computational resources needed by the algorithm. There are two kinds of computational resources used by an algorithm:

1. the CPU's processing power (TIME COMPLEXITY)
2. the computer's primary memory (SPACE COMPLEXITY)


The time required by an algorithm to run is measured by its time complexity, and the space (computer memory) required by an algorithm is measured by its space complexity. Since the actual running time may vary from computer to computer (e.g. a supercomputer may be a million times faster than a PC), we measure time complexity by the number of steps the algorithm requires for a given set of inputs.
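As a rough illustration of step counting (a sketch, not part of the original notes; fact_counted and the steps counter are hypothetical names), the factorial routine above can be instrumented to count one step per loop iteration, showing that its step count grows linearly with n:

#include <stdio.h>

/* Instrumented factorial: counts one "step" per loop iteration as a
 * machine-independent measure of running time. */
unsigned long long fact_counted(int n, long *steps)
{
    unsigned long long fact = 1;
    for (int i = 1; i <= n; i++) {
        fact = fact * i;
        (*steps)++;                     /* one step per multiplication */
    }
    return fact;
}

int main(void)
{
    for (int n = 5; n <= 20; n += 5) {
        long steps = 0;
        fact_counted(n, &steps);
        printf("n = %2d -> %ld steps\n", n, steps);   /* grows linearly with n */
    }
    return 0;
}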

Big-Oh (O) notation:

The complexity of an algorithm is analyzed in terms of a mathematical function of the size of the input.
Analyzing the exact (time) complexity of an algorithm is very hard, so we need the concept of asymptotic notations. Big-O notation is one of the asymptotic notations used for the complexity analysis of an algorithm.

Definition:

Let f and g be functions from the set of integers or the set of real numbers to the set of real numbers. We say that f(n) is O(g(n)) (read as f(n) is big-oh of g(n)) if there are constants C and k such that |f(n)| ≤ C*|g(n)| whenever n > k.
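As a small worked example (a standard illustration added here, not taken from the original notes), take f(n) = n^2 + 2n + 1 and g(n) = n^2. Whenever n > 1 we have

n^2 + 2n + 1 ≤ n^2 + 2n^2 + n^2 = 4n^2,

so the constants C = 4 and k = 1 satisfy the definition, and f(n) is O(n^2).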


Worst, Best, and Average Case Complexity:
The worst-case complexity of an algorithm is defined as the maximum number of steps taken on any instance of size n.
The best-case complexity of an algorithm is defined as the minimum number of steps taken on any instance of size n.
The average-case complexity of an algorithm is defined as the average number of steps taken over all instances of size n; the linear-search sketch below illustrates all three cases.
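For instance (an illustrative sketch, not from the original notes), linear search over an array of n elements makes the three cases concrete:

#include <stddef.h>

/* Linear search: returns the index of key in a[0..n-1], or -1 if absent.
 * Best case:    key is a[0]                 -> 1 comparison.
 * Worst case:   key is last or not present  -> n comparisons.
 * Average case: key equally likely anywhere -> about n/2 comparisons. */
int linear_search(const int a[], size_t n, int key)
{
    for (size_t i = 0; i < n; i++)
        if (a[i] == key)
            return (int)i;
    return -1;
}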

Note: In the case of running time, the worst-case time complexity gives the longest running time an algorithm can take on any input of size n, and thus it guarantees that the algorithm will never take longer than that on such an input. Hence, algorithms are mostly analyzed for their efficiency in terms of their worst-case complexity.