diff --git a/Algorithms/Sorting/BubbleSort/readme.md b/Algorithms/Sorting/BubbleSort/readme.md index 8aa78a5e..0e7ac9c6 100644 --- a/Algorithms/Sorting/BubbleSort/readme.md +++ b/Algorithms/Sorting/BubbleSort/readme.md @@ -3,13 +3,13 @@ Bubble Sort is the simplest sorting algorithm that works by repeatedly swapping ## Complexity | Best | Average | Worst | Memory | Stable | -|------|---------|-------|--------|--------| -| n | n^2 | n^2 | 1 | Yes | +|:------:|:---------:|:-------:|:--------:|:--------:| +| $O(n)$ | $O(n^2)$ | $O(n^2)$ | $O(1)$ | Yes | -- The worst-case time complexity of bubble sort is O(n x n) = O(n^2) -- The best-case time complexity of bubble sort is O(n). -- The average case time complexity of bubble sort is O(n/2 x n) = O (n2). -- The space complexity of bubble sort is O(1). +- The worst-case time complexity of Bubble Sort is $O(n^2)$. +- The best-case time complexity of Bubble Sort is $O(n)$. +- The average case time complexity of Bubble Sort is $O(n^2)$. +- The space complexity of Bubble Sort is $O(1)$. ## Pseudo Code ``` diff --git a/Algorithms/Sorting/CountSort/readme.md b/Algorithms/Sorting/CountSort/readme.md index c7b51501..f9fba4a9 100644 --- a/Algorithms/Sorting/CountSort/readme.md +++ b/Algorithms/Sorting/CountSort/readme.md @@ -3,13 +3,13 @@ Counting sort is a sorting algorithm that sorts the elements of an array by coun ## Complexity | Best | Average | Worst | Memory | Stable | -|--------|---------|--------|--------|--------| -| O(n+k) | O(n+k) | O(n+k) | O(max) | Yes | +|:--------:|:---------:|:--------:|:--------:|:-------:| +| $O(n+k)$ | $O(n+k)$ | $O(n+k)$ | $O(n+k)$ | Yes | -- The worst-case time complexity of counting sort is O(n^k) -- The best-case time complexity of counting sort is O(n). -- The average case time complexity of counting sort is O(n+k) -- The space complexity of counting sort is O(k). +- The worst-case time complexity of Counting Sort is $O(n+k)$. +- The best-case time complexity of Counting Sort is $O(n+k)$. 
+- The average case time complexity of Counting Sort is $O(n+k)$. +- The space complexity of Counting Sort is $O(n+k)$. ## Pseudo Code ``` diff --git a/Algorithms/Sorting/HeapSort/readme.md b/Algorithms/Sorting/HeapSort/readme.md index 60fdd092..4b3f6c5c 100644 --- a/Algorithms/Sorting/HeapSort/readme.md +++ b/Algorithms/Sorting/HeapSort/readme.md @@ -3,13 +3,13 @@ Heap Sort is a comparison based sorting technique based on Binary Heap data stru ## Complexity | Best | Average | Worst | Memory | Stable | -|------|---------|-------|--------|--------| -| nlogn | nlogn | nlogn | 1 | No | +|:------:|:---------:|:-------:|:--------:|:--------:| +| $O(n*log(n))$ | $O(n*log(n))$ | $O(n*log(n))$ | $O(1)$ | No | -- The worst-case time complexity of heap sort is O(nlogn). -- The best-case time complexity of heap sort is O(nlogn). -- The average case time complexity of heap sort is O(nlogn). -- The space complexity of heap sort is O(1). +- The worst-case time complexity of Heap Sort is $O(n*log(n))$. +- The best-case time complexity of Heap Sort is $O(n*log(n))$. +- The average case time complexity of Heap Sort is $O(n*log(n))$. +- The space complexity of Heap Sort is $O(1)$.
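To make the heapify-then-extract behavior behind these bounds concrete, here is a minimal Python sketch of an in-place heap sort (an illustrative example, not code from this repository; function names are made up):

```python
# Minimal in-place heap sort sketch; names are illustrative.
def _sift_down(a, start, end):
    """Restore the max-heap property for the subtree rooted at `start`."""
    root = start
    while 2 * root + 1 <= end:
        child = 2 * root + 1
        # Pick the larger child, if a right child exists.
        if child + 1 <= end and a[child] < a[child + 1]:
            child += 1
        if a[root] < a[child]:
            a[root], a[child] = a[child], a[root]
            root = child
        else:
            return

def heap_sort(a):
    n = len(a)
    # Build a max-heap bottom-up.
    for start in range(n // 2 - 1, -1, -1):
        _sift_down(a, start, n - 1)
    # Repeatedly move the current maximum to the end and shrink the heap.
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        _sift_down(a, 0, end - 1)
    return a
```

Each of the $n$ extractions costs at most $O(log(n))$ sift-down steps, which is where the $O(n*log(n))$ bound comes from; only a constant number of temporaries are used, matching the $O(1)$ memory column.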
## Pseudo Code ``` diff --git a/Algorithms/Sorting/InsertionSort/readme.md b/Algorithms/Sorting/InsertionSort/readme.md index 82da0659..83b5bbdf 100644 --- a/Algorithms/Sorting/InsertionSort/readme.md +++ b/Algorithms/Sorting/InsertionSort/readme.md @@ -3,8 +3,8 @@ Insertion Sort is a simple sorting algorithm that works the way we sort playing ## Complexity | Best | Average | Worst | Memory | Stable | -|------|---------|-------|--------|--------| -| n | n^2 | n^2 | 1 | Yes | +|:------:|:---------:|:-------:|:--------:|:--------:| +| $O(n)$ | $O(n^2)$ | $O(n^2)$ | $O(1)$ | Yes | ## Pseudo Code ``` diff --git a/Algorithms/Sorting/MergeSort/readme.md b/Algorithms/Sorting/MergeSort/readme.md index 9d3081f7..a23861d1 100644 --- a/Algorithms/Sorting/MergeSort/readme.md +++ b/Algorithms/Sorting/MergeSort/readme.md @@ -1,33 +1,33 @@ # Merge Sort -Merge sort is a sorting algorithm based on the divide and conquer technique. With worst-case time complexity being Ο(n log n), it is one of the most important and commonly used algorithms. +Merge sort is a sorting algorithm based on the divide and conquer technique. With a worst-case time complexity of $O(n*log(n))$, it is one of the most important and commonly used algorithms. ## Time and Space Complexity | Best | Average | Worst | Memory | Stable | -|------|---------|-------|--------|--------| -| n log(n) | n log(n) | n log(n) | n | Yes | +|:------:|:---------:|:-------:|:--------:|:--------:| +| $O(n*log(n))$ | $O(n*log(n))$ | $O(n*log(n))$ | $O(n)$ | Yes | 1. Space Complexity -Auxiliary Space: O(n) Sorting In Place. +Auxiliary Space: $O(n)$; Merge Sort is not an in-place sort. 2. Time Complexity Merge Sort is a recursive algorithm and its time complexity can be expressed with the following recurrence relation for divide and conquer algorithms.
-`` - T(n) = aT(n/b) + f(n) - where, - n = size of input - a = 2, we divide our array in half and solve them first, so our number of subproblems is 2 - n/b = n/2, we divide our array in half each time we recurse - f(n) = n, we still have to iterate though the list to check it and then combine the two halves together + - $T(n) = aT(n/b) + f(n)$ + - where, + - $n$ = size of input + - $a = 2$, we divide our array in half and solve them first, so our number of subproblems is 2 + - $n/b = n/2$, we divide our array in half each time we recurse + - $f(n) = n$, we still have to iterate through the list to check it and then combine the two halves together - T(n) = 2T(n/2) + n -`` -The solution for the above recurrence is O(n log(n)). -The list of size n is divided into a max of log(n) parts, and the merging of all sublists into a single list takes O(n) time. Making the worst, best and average-case run time of this algorithm O(n log(n)). + + - $T(n) = 2T(n/2) + n$ + +The solution for the above recurrence is $O(n*log(n))$. +The list of size $n$ is divided into a max of $log(n)$ parts, and the merging of all sublists into a single list takes $O(n)$ time, making the worst-, best-, and average-case run time of this algorithm $O(n*log(n))$. ## Pseudo Code ``` diff --git a/Algorithms/Sorting/QuickSort/readme.md b/Algorithms/Sorting/QuickSort/readme.md index 18277444..340f3ea6 100644 --- a/Algorithms/Sorting/QuickSort/readme.md +++ b/Algorithms/Sorting/QuickSort/readme.md @@ -3,13 +3,13 @@ Quick Sort is a Divide and Conquer algorithm. It picks an element as pivot and p ## Complexity | Best | Average | Worst | Memory | Stable | -|------|---------|-------|--------|--------| -| nlog(n) | nlog(n) | n^2 | log(n) | No | +|:------:|:---------:|:-------:|:--------:|:--------:| +| $O(n*log(n))$ | $O(n*log(n))$ | $O(n^2)$ | $O(log(n))$ | No | -- The worst-case time complexity of heap sort is O(n^2). -- The best-case time complexity of heap sort is O(nlogn).
-- The average case time complexity of heap sort is O(nlogn). -- The space complexity of heap sort is O(logn). +- The worst-case time complexity of Quick Sort is $O(n^2)$. +- The best-case time complexity of Quick Sort is $O(n*log(n))$. +- The average case time complexity of Quick Sort is $O(n*log(n))$. +- The space complexity of Quick Sort is $O(log(n))$. ## Pseudo Code ``` diff --git a/Algorithms/Sorting/RadixSort/readme.md b/Algorithms/Sorting/RadixSort/readme.md index 734b124f..1f2f8f25 100644 --- a/Algorithms/Sorting/RadixSort/readme.md +++ b/Algorithms/Sorting/RadixSort/readme.md @@ -4,13 +4,13 @@ Radix sort is a sorting algorithm that sorts the elements by first grouping the Suppose, we have an array of 8 elements. First, we will sort elements based on the value of the unit place. Then, we will sort elements based on the value of the tenth place. This process goes on until the last significant place. ## Complexity | Best | Average | Worst | Memory | Stable | -|------|---------|-------|--------|--------| -| O(d*(n+b)) | O(d*(n+b)) | O(d*(n+b)) | O(b + n) | Yes | +|:------:|:---------:|:-------:|:--------:|:--------:| +| $O(d*(n+b))$ | $O(d*(n+b))$ | $O(d*(n+b))$ | $O(b + n)$ | Yes | Where: -* n = the number of elements to sort -* k = the maximum key length (number of digit places) of the elements to sort -* b = the base (for example, for the decimal system, b is 10) +* $n$ = the number of elements to sort +* $d$ = the number of digit places (the maximum key length) of the elements to sort +* $b$ = the base (for example, for the decimal system, $b$ is 10) ## Algorithm ``` diff --git a/Algorithms/Sorting/SelectionSort/readme.md b/Algorithms/Sorting/SelectionSort/readme.md index 0ef82c90..95242274 100644 --- a/Algorithms/Sorting/SelectionSort/readme.md +++ b/Algorithms/Sorting/SelectionSort/readme.md @@ -1,15 +1,15 @@ # Selection Sort -Selection Sort is a sorting algorithm, specifically an in-place comparison sort.
It has O(n2) time complexity, making it inefficient on large lists, and generally performs worse than the similar insertion sort. Selection sort is noted for its simplicity, and it has performance advantages over more complicated algorithms in certain situations, particularly where auxiliary memory is limited. +Selection Sort is a sorting algorithm, specifically an in-place comparison sort. It has $O(n^2)$ time complexity, making it inefficient on large lists, and generally performs worse than the similar insertion sort. Selection sort is noted for its simplicity, and it has performance advantages over more complicated algorithms in certain situations, particularly where auxiliary memory is limited. ## Complexity -| Best | Average | Worst | Memory | Stable | -|------|---------|-------|--------|--------| -| n^2 | n^2 | n^2 | 1 | No | - -- The worst-case time complexity of heap sort is O(n^2). -- The best-case time complexity of heap sort is O(n^2). -- The average case time complexity of heap sort is O(n^2). -- The space complexity of heap sort is O(1). +| Best | Average | Worst | Memory | Stable | +|:--------:|:---------:|:-------:|:--------:|:--------:| +| $O(n^2)$ | $O(n^2)$ | $O(n^2)$ | $O(1)$ | No | + +- The worst-case time complexity of Selection Sort is $O(n^2)$. +- The best-case time complexity of Selection Sort is $O(n^2)$. +- The average case time complexity of Selection Sort is $O(n^2)$. +- The space complexity of Selection Sort is $O(1)$. ## Pseudo Code ``` diff --git a/Algorithms/Sorting/ShellSort/readme.md b/Algorithms/Sorting/ShellSort/readme.md index a6e40501..44daa848 100644 --- a/Algorithms/Sorting/ShellSort/readme.md +++ b/Algorithms/Sorting/ShellSort/readme.md @@ -3,13 +3,13 @@ Shell sort is mainly a variation of Insertion Sort.
In insertion sort, we move e ## Complexity | Best | Average | Worst | Memory | Stable | -|-------------|----------|-------|--------|--------| -| Ω(n log(n)) |θ(nlog(n)2)|O(n2)|1| No | +|:-------------:|:----------:|:-------:|:--------:|:--------:| +| $Ω(n*log(n))$ | $θ(n*(log(n))^2)$ | $O(n^2)$ | $O(1)$ | No | -- The worst-case time complexity of heap sort is O(n^2). -- The best-case time complexity of heap sort is Ω(nlogn). -- The average case time complexity of heap sort is θ(nlogn). -- The space complexity of heap sort is O(1). +- The worst-case time complexity of Shell Sort is $O(n^2)$. +- The best-case time complexity of Shell Sort is $Ω(n*log(n))$. +- The average case time complexity of Shell Sort is $θ(n*(log(n))^2)$. +- The space complexity of Shell Sort is $O(1)$. ## Pseudo Code ```Calculate gap size ($gap) diff --git a/Algorithms/Sorting/SwapSort/readme.md b/Algorithms/Sorting/SwapSort/readme.md index c256cac7..ea32fa18 100644 --- a/Algorithms/Sorting/SwapSort/readme.md +++ b/Algorithms/Sorting/SwapSort/readme.md @@ -1,15 +1,15 @@ # Swap Sort -Swap Sort is also a sorting algorithm, which is not known by many.It has O(n) time complexity.It works only on numbers [1 to N] and must not contain duplicates.You might be thinking what is the use of Swap Sort as there is Count Sort which time complexity is O(N) and can also sort data with dupliactes, but Swap sort is useful and efficient in solving many problems namely, find duplicate and missing, find all duplicates and missing from Array. +Swap Sort is a lesser-known sorting algorithm with $O(n)$ time complexity. It works only on the numbers $[1, n]$, and the input must not contain duplicates. You might wonder what the use of Swap Sort is when Counting Sort also sorts in linear time and can handle duplicates, but Swap Sort is useful and efficient in solving many problems, namely finding the duplicate and the missing number, or finding all duplicates and missing numbers in an array.
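Since the readme does not pin down an implementation, here is a minimal Python sketch of the usual Swap Sort (often called cyclic sort) idea, assuming values $1..n$ with no duplicates (an illustrative example, not code from this repository):

```python
def swap_sort(a):
    """Sort a permutation of 1..n in O(n) by cyclic placement (sketch)."""
    i = 0
    while i < len(a):
        target = a[i] - 1  # value v belongs at index v - 1
        if a[i] != a[target]:
            # Swap the current value into its home slot, then re-check index i.
            a[i], a[target] = a[target], a[i]
        else:
            i += 1
    return a
```

Every swap places at least one value into its final slot, so at most $n$ swaps occur in total, which gives the $O(n)$ bound in the table.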
## Complexity | Best | Average | Worst | Memory | Stable | -|------|---------|-------|--------|--------| -| n | n | n | 1 | No | +|:------:|:---------:|:-------:|:--------:|:--------:| +| $O(n)$ | $O(n)$ | $O(n)$ | $O(1)$ | No | -- The worst-case time complexity of heap sort is O(n). -- The best-case time complexity of heap sort is O(n). -- The average case time complexity of heap sort is O(n). -- The space complexity of heap sort is O(1). +- The worst-case time complexity of Swap Sort is $O(n)$. +- The best-case time complexity of Swap Sort is $O(n)$. +- The average case time complexity of Swap Sort is $O(n)$. +- The space complexity of Swap Sort is $O(1)$. ## Pseudo Code diff --git a/Algorithms/Sorting/TimSort/readme.md b/Algorithms/Sorting/TimSort/readme.md index 26faa7dc..1eb9eb78 100644 --- a/Algorithms/Sorting/TimSort/readme.md +++ b/Algorithms/Sorting/TimSort/readme.md @@ -5,8 +5,8 @@ TimSort is a sorting algorithm based on Insertion Sort and Merge Sort. It is use ## Complexity | Best | Average | Worst | Memory | Stable | -| ---- | -------- | -------- | ------ | ------ | -| n | n log(n) | n log(n) | n | Yes | +|:----:|:--------:|:--------:|:------:|:------:| +| $O(n)$ | $O(n*log(n))$ | $O(n*log(n))$ | $O(n)$ | Yes | ## Algorithm
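Rather than re-implementing TimSort here, an easy way to observe the guarantees in the table is Python's built-in sort, which is a TimSort implementation:

```python
# Python's sorted()/list.sort() implements TimSort, so it exhibits the
# properties in the table: O(n*log(n)) worst case, fast on sorted runs, stable.
data = [("b", 2), ("a", 1), ("b", 1), ("a", 2)]

# Stability: records with equal keys keep their original relative order.
by_key = sorted(data, key=lambda t: t[1])
print(by_key)  # [('a', 1), ('b', 1), ('b', 2), ('a', 2)]
```

Note how `("b", 1)` stays after `("a", 1)` and `("a", 2)` stays after `("b", 2)`: equal keys are never reordered, which is the "Stable: Yes" column in practice.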