Time Complexity of Sorting Algorithms
We'll learn how various sorting algorithms perform as the input size grows. An algorithm's efficiency is measured by its time complexity, so let's compare the time complexities of common sorting algorithms.
What is Time Complexity?
In computer science, the time complexity of an algorithm signifies the total time the program takes to complete its execution as a function of the input size. The time complexity of an algorithm is commonly expressed in big O notation.
Common Time Complexities
O(1) - Constant Time
O(1) means an algorithm executes in constant time no matter how large the input is. Hash table lookups are a classic example of constant-time operations.
For example, you can find the first non-repeating character in a string using a hashmap, where each lookup takes O(1) time.
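A minimal Python sketch of this idea (the function name and approach are illustrative, not from the original article): each dictionary lookup is O(1) on average, so the whole scan is linear.

```python
def first_non_repeating(s):
    # Count occurrences in one pass; dict operations are O(1) on average.
    counts = {}
    for ch in s:
        counts[ch] = counts.get(ch, 0) + 1
    # A second pass finds the first character whose count is 1.
    for ch in s:
        if counts[ch] == 1:
            return ch
    return None

print(first_non_repeating("swiss"))  # prints "w"
```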
O(log n) - Logarithmic Time
O(log n) means the algorithm discards a constant fraction of the input at each step, so the running time grows only logarithmically with n. For example - binary search, or operations on a balanced binary search tree.
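As a sketch in Python (assumed here since the article shows no code), binary search halves the range it examines on every comparison, which is what makes it O(log n):

```python
def binary_search(arr, target):
    # arr must be sorted. Each comparison halves the search range: O(log n).
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1   # target is in the right half
        else:
            hi = mid - 1   # target is in the left half
    return -1              # not found

print(binary_search([1, 3, 5, 7, 9], 7))  # prints 3
```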
O(n) - Linear Time
O(n) means the algorithm's running time is directly proportional to the input size n, for example when traversing a string or an array. Linear search is a perfect example.
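A quick Python sketch of linear search (illustrative, assuming nothing beyond the standard library): in the worst case it touches every element once, hence O(n).

```python
def linear_search(arr, target):
    # Scans left to right; worst case checks all n elements: O(n).
    for i, value in enumerate(arr):
        if value == target:
            return i
    return -1

print(linear_search([4, 2, 7, 1], 7))  # prints 2
```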
O(n²) - Quadratic Time
O(n²) means the algorithm's running time is proportional to the square of the input size. Nested for loops over the same input are the classic example of this category.
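To make the nested-loop pattern concrete, here is a small illustrative Python sketch (not from the article): comparing every pair of elements takes roughly n·(n-1)/2 comparisons, which is O(n²).

```python
def has_duplicate(arr):
    # Nested loops compare every pair of elements: O(n^2) comparisons.
    for i in range(len(arr)):
        for j in range(i + 1, len(arr)):
            if arr[i] == arr[j]:
                return True
    return False

print(has_duplicate([1, 2, 3, 1]))  # prints True
```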
Time Complexity of Sorting Algorithms
Let's check the time complexity of the most commonly used sorting algorithms.
Time Complexity of Bubble Sort
Bubble Sort is a simple sorting algorithm. It works by repeatedly comparing each pair of adjacent elements and swapping them if they are in the wrong order. It repeats this process until the whole list is sorted.
Average case time complexity of bubble sort - O(n²)
Worst case time complexity of bubble sort - O(n²)
Best case time complexity of bubble sort - O(n)
The best case occurs when the list is already sorted. Bubble sort is not a good choice when n (the input size) is large.
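The description above can be sketched in Python (an illustrative implementation, not the article's own code); the `swapped` flag is what gives the O(n) best case on an already-sorted list:

```python
def bubble_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # After pass i, the largest unsorted element has bubbled to the end.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        # No swaps in a full pass means the list is sorted: O(n) best case.
        if not swapped:
            break
    return arr

print(bubble_sort([5, 1, 4, 2, 8]))  # prints [1, 2, 4, 5, 8]
```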
Time Complexity of Selection Sort
Selection sort is an in-place comparison sorting algorithm. In each pass, it picks the smallest element from the unsorted portion and moves it to its correct position. This process is repeated until all elements are sorted.
Average case time complexity of selection sort - O(n²)
Worst case time complexity of selection sort - O(n²)
Best case time complexity of selection sort - O(n²)
Like bubble sort, it is inefficient for large data sets. Selection sort is known for its simplicity, and it has performance advantages over more complicated algorithms in certain situations, particularly where auxiliary memory is limited.
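A minimal Python sketch of selection sort (illustrative, not the article's code). Note that the inner scan always runs to the end of the unsorted portion, which is why even the best case is O(n²):

```python
def selection_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        # Find the minimum of the unsorted portion; this scan always runs
        # in full, which is why even the best case is O(n^2).
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # One swap per pass keeps the total number of writes low.
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

print(selection_sort([64, 25, 12, 22, 11]))  # prints [11, 12, 22, 25, 64]
```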
Time Complexity of Insertion Sort
Insertion sort works by taking one element in each iteration and inserting it at its correct location within the already-sorted portion of the list. This process is repeated until no input element remains.
Best case time complexity of insertion sort - O(n)
Average and worst case time complexity of insertion sort - O(n²)
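An illustrative Python sketch of insertion sort (assumed implementation, not from the article). On an already-sorted list the inner while loop never executes, which produces the O(n) best case:

```python
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements right until key's correct slot is found.
        # On a sorted list this loop body never runs: the O(n) best case.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr

print(insertion_sort([12, 11, 13, 5, 6]))  # prints [5, 6, 11, 12, 13]
```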
Time Complexity of QuickSort
QuickSort is a divide and conquer algorithm. It partitions a large array into smaller sub-arrays around a pivot element and then recursively sorts the sub-arrays.
Best and Average time complexity of quick sort - O(n log n)
Worst case time complexity of quick sort - O(n²), which occurs when the chosen pivot is repeatedly the smallest or largest remaining element.
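A simple (non-in-place) Python sketch of the divide and conquer idea, shown for clarity rather than as the canonical implementation; production quicksorts usually partition in place:

```python
def quick_sort(arr):
    # Base case: lists of 0 or 1 elements are already sorted.
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    # Partition into sub-arrays around the pivot, then recurse on each.
    smaller = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    larger = [x for x in arr if x > pivot]
    return quick_sort(smaller) + equal + quick_sort(larger)

print(quick_sort([3, 6, 1, 8, 2, 9]))  # prints [1, 2, 3, 6, 8, 9]
```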
Time Complexity Of Merge Sort
MergeSort is a Divide and Conquer algorithm. It divides an input array in two halves, calls itself for the two halves and then merges the two sorted halves.
Best, average, and worst case time complexity of merge sort - O(n log n)
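The halving-and-merging described above can be sketched in Python (an illustrative implementation, not the article's own): merging two sorted halves takes O(n), and there are log n levels of halving, giving O(n log n) in every case.

```python
def merge_sort(arr):
    # Base case: lists of 0 or 1 elements are already sorted.
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves in O(n).
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # prints [3, 9, 10, 27, 38, 43, 82]
```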