Analysis of Algorithms

The theoretical study of computer-program performance and resource usage.

The objective is to make programs fast and to understand the factors that influence their speed, including the resources they use (RAM, SSDs).

But before analysing these factors, it is worth asking: what is more important than performance?

  1. User friendliness
  2. Security
  3. Correctness
  4. Simplicity
  5. Maintainability
  6. Scalability
  7. Precision
  8. Robustness
  9. Features
  10. Modularity

Then why study algorithms and performance?

  1. Performance is correlated with user friendliness; nothing is more frustrating than a slow-loading web page.
  2. Real-time constraints: some programs simply do not work if they do not respond fast enough.
  3. Performance draws the line between what is feasible and what is infeasible.
  4. Algorithms give us a language for talking about program behaviour.

Performance is like currency: we spend it to buy the other things on the list above.

If we want to analyse the running time of an algorithm such as insertion sort, we find that it depends on several things.

  • It depends on the input itself (e.g. on an already-sorted array, insertion sort has very little work to do; see the sketch after this list).
  • It depends on the input size (6 elements vs 6×10⁹ elements).
    • So we parameterise on the input size and express running time as a function of that size, to see how it behaves as the input grows.
  • We generally want upper bounds, so we can guarantee the user a maximum amount of time the algorithm will take.
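
Since insertion sort is the running example, here is a minimal sketch of it in Python (the language is my choice; the source gives no code). On an already-sorted array the inner while loop never executes, so the work per element is constant; on a reverse-sorted array every earlier element gets shifted, which is the expensive case.

    def insertion_sort(a):
        """Sort the list a in place and return it."""
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            # Shift elements larger than key one slot to the right
            # until key's correct position is found.
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key
        return a

    print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]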

Kinds of analysis:

There are usually three kinds of analysis we can do for an algorithm.

Worst-case (usually):

T(n) = maximum time of the algorithm on any input of size n

Average-case (sometimes):

T(n) = expected time of the algorithm over all inputs of size n (a weighted average)

To compute an expected time we need to know the probability of each input. Since we usually do not know this, we need an assumption about the statistical distribution of inputs; the most common assumption is that every input of size n is equally likely (the uniform distribution).
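
Written out, and under the equally-likely assumption above, the two definitions look like this (a standard formulation of the same idea, stated here in LaTeX for clarity):

    T_{\text{worst}}(n) = \max_{|x| = n} T(x)

    T_{\text{avg}}(n) = \sum_{|x| = n} \Pr(x)\, T(x)
                      = \frac{1}{\#\{x : |x| = n\}} \sum_{|x| = n} T(x)
                      \quad \text{(uniform distribution)}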

Best-case (never):

Best-case analysis is rarely useful, because it is easy to cheat: take a slow algorithm that happens to work fast on one particular input, and its best case looks good while saying nothing about the vast majority of inputs.
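
To make the three cases concrete, the small experiment below (my own illustration, not from the source) counts the key comparisons insertion sort performs on a sorted, a reverse-sorted, and a shuffled input of the same size. The sorted input shows the deceptively good best case, the reverse-sorted input the worst case, and the shuffled input something close to the average case.

    import random

    def insertion_sort_comparisons(a):
        """Insertion sort that returns the number of key comparisons made."""
        comparisons = 0
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            while j >= 0:
                comparisons += 1            # compare key with a[j]
                if a[j] <= key:
                    break
                a[j + 1] = a[j]             # shift the larger element right
                j -= 1
            a[j + 1] = key
        return comparisons

    n = 1000
    print(insertion_sort_comparisons(list(range(n))))              # best case: about n
    print(insertion_sort_comparisons(list(range(n, 0, -1))))       # worst case: about n^2/2
    print(insertion_sort_comparisons(random.sample(range(n), n)))  # roughly n^2/4 on average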

So what is the worst-case of an algorithm?

It depends on the computer: whether it is a supercomputer or a wristwatch.

We can compare relative speed: run two algorithms on the same machine and see which performs better.

We can also compare absolute speed: one algorithm always performs better than the other, no matter what machine we run it on.

This is where it gets confusing: how can we talk about the worst case of an algorithm by discussing only the software and not the hardware, when the software's performance depends on the hardware? The same algorithm will always run faster on a faster machine than on a slower one.

BIG IDEA! Asymptotic Analysis

Asymptotic analysis lets us analyse algorithms while sidestepping the machine question, in two steps:

  1. Ignore machine dependent constants.
  2. Look at growth of the running time instead of running time itself. T(n) as n -> ∞

Asymptotic notation:

Θ-notation (Theta notation):

  • Drop low order terms
  • Ignore leading constants

Ex: 3n³ + 90n² – 5n + 6046 = Θ(n³)
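
The two rules above ("drop low-order terms, ignore leading constants") are the working version of the formal definition of Θ. Stated in LaTeX, with illustrative constants that I have chosen (they are not from the source):

    f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2, n_0 > 0 \text{ such that }
        0 \le c_1\, g(n) \le f(n) \le c_2\, g(n) \text{ for all } n \ge n_0

    \text{For } f(n) = 3n^3 + 90n^2 - 5n + 6046:\quad
        3n^3 \le f(n) \le 100\, n^3 \text{ for all } n \ge 100,
        \text{ hence } f(n) = \Theta(n^3).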

As n -> ∞, a Θ(n²) algorithm eventually beats a Θ(n³) algorithm, even if we run the Θ(n²) algorithm on a slow computer and the Θ(n³) algorithm on a fast one.

So asymptotic notation helps us to resolve the issue of both relative and absolute speed.

On different machines we may get different machine-dependent constants, but as the input size grows, the asymptotic growth of the running time does not change.
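
A toy calculation makes the point. Suppose the Θ(n²) algorithm runs on a slow machine and costs 100·n² time units, while the Θ(n³) algorithm runs on a fast machine and costs only 1·n³; the constants are invented purely for illustration. Past the crossover point the better growth rate wins regardless of the constants:

    # Hypothetical cost models: the constants 100 and 1 are made up for illustration.
    def slow_machine_quadratic(n):
        return 100 * n * n      # Θ(n^2) algorithm on a slow computer

    def fast_machine_cubic(n):
        return n ** 3           # Θ(n^3) algorithm on a fast computer

    for n in [10, 50, 100, 101, 1000]:
        q, c = slow_machine_quadratic(n), fast_machine_cubic(n)
        winner = "Θ(n^2) wins" if q < c else "Θ(n^3) wins"
        print(f"n = {n:>5}: 100*n^2 = {q:>12,}  n^3 = {c:>12,}  -> {winner}")

    # The crossover is near n = 100; for every larger n the Θ(n^2)
    # algorithm is faster despite its bigger constant factor.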


References

  1. Introduction to Algorithms
  2. Introduction to Algorithms video lecture – Charles E. Leiserson