Updated Sorting Algorithm descriptions by formatting formulas in mark… #770

Merged: 1 commit, Sep 20, 2024
12 changes: 6 additions & 6 deletions Algorithms/Sorting/BubbleSort/readme.md
@@ -3,13 +3,13 @@ Bubble Sort is the simplest sorting algorithm that works by repeatedly swapping

## Complexity
| Best | Average | Worst | Memory | Stable |
|------|---------|-------|--------|--------|
| n | n^2 | n^2 | 1 | Yes |
|:------:|:---------:|:-------:|:--------:|:--------:|
| $O(n)$ | $O(n^2)$ | $O(n^2)$ | $O(1)$ | Yes |

- The worst-case time complexity of bubble sort is O(n x n) = O(n^2)
- The best-case time complexity of bubble sort is O(n).
- The average case time complexity of bubble sort is O(n/2 x n) = O (n2).
- The space complexity of bubble sort is O(1).
- The worst-case time complexity of Bubble Sort is $O(n^2)$.
- The best-case time complexity of Bubble Sort is $O(n)$.
- The average case time complexity of Bubble Sort is $O(n^2)$.
- The space complexity of Bubble Sort is $O(1)$.
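The early-exit behavior behind the $O(n)$ best case can be sketched in Python (an illustrative sketch, not code from this repository):

```python
def bubble_sort(a):
    """Bubble Sort: repeatedly swap adjacent out-of-order pairs.

    The `swapped` flag gives the O(n) best case: one full pass with
    no swaps proves the list is sorted and ends the loop early.
    """
    a = list(a)                      # sort a copy
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):   # last i elements are already in place
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:              # no swaps => already sorted
            break
    return a

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```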

## Pseudo Code
```
12 changes: 6 additions & 6 deletions Algorithms/Sorting/CountSort/readme.md
@@ -3,13 +3,13 @@ Counting sort is a sorting algorithm that sorts the elements of an array by coun

## Complexity
| Best | Average | Worst | Memory | Stable |
|--------|---------|--------|--------|--------|
| O(n+k) | O(n+k) | O(n+k) | O(max) | Yes |
|:--------:|:---------:|:--------:|:--------:|:-------:|
| $O(n+k)$ | $O(n+k)$ | $O(n+k)$ | $O(n+k)$ | Yes |

- The worst-case time complexity of counting sort is O(n^k)
- The best-case time complexity of counting sort is O(n).
- The average case time complexity of counting sort is O(n+k)
- The space complexity of counting sort is O(k).
- The worst-case time complexity of Counting Sort is $O(n+k)$.
- The best-case time complexity of Counting Sort is $O(n+k)$.
- The average case time complexity of Counting Sort is $O(n+k)$.
- The space complexity of Counting Sort is $O(n+k)$.
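The table's "Stable: Yes" comes from placing elements right-to-left using prefix sums. A minimal Python sketch (illustrative, not from this repository; the `key` parameter is an added convenience):

```python
def counting_sort(a, key=lambda x: x):
    """Stable Counting Sort for items with integer keys in [0, k)."""
    if not a:
        return []
    k = max(key(x) for x in a) + 1   # size of the key range
    count = [0] * k
    for x in a:                      # histogram of keys: O(n)
        count[key(x)] += 1
    for i in range(1, k):            # prefix sums -> end positions: O(k)
        count[i] += count[i - 1]
    out = [None] * len(a)
    for x in reversed(a):            # right-to-left keeps equal keys stable
        count[key(x)] -= 1
        out[count[key(x)]] = x
    return out
```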

## Pseudo Code
```
12 changes: 6 additions & 6 deletions Algorithms/Sorting/HeapSort/readme.md
@@ -3,13 +3,13 @@ Heap Sort is a comparison based sorting technique based on Binary Heap data stru

## Complexity
| Best | Average | Worst | Memory | Stable |
|------|---------|-------|--------|--------|
| nlogn | nlogn | nlogn | 1 | No |
|:------:|:---------:|:-------:|:--------:|:--------:|
| $O(n*log(n))$ | $O(n*log(n))$ | $O(n*log(n))$ | $O(1)$ | No |

- The worst-case time complexity of heap sort is O(nlogn).
- The best-case time complexity of heap sort is O(nlogn).
- The average case time complexity of heap sort is O(nlogn).
- The space complexity of heap sort is O(1).
- The worst-case time complexity of Heap Sort is $O(n*log(n))$.
- The best-case time complexity of Heap Sort is $O(n*log(n))$.
- The average case time complexity of Heap Sort is $O(n*log(n))$.
- The space complexity of Heap Sort is $O(1)$.
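The $O(1)$ space bound comes from heapifying and extracting in place. A minimal Python sketch (illustrative, not code from this repository):

```python
def heap_sort(a):
    """In-place Heap Sort on a max-heap built over the list itself."""
    a = list(a)
    n = len(a)

    def sift_down(i, end):
        # Push a[i] down until the max-heap property holds within a[:end].
        while 2 * i + 1 < end:
            child = 2 * i + 1
            if child + 1 < end and a[child + 1] > a[child]:
                child += 1                    # pick the larger child
            if a[i] >= a[child]:
                return
            a[i], a[child] = a[child], a[i]
            i = child

    for i in range(n // 2 - 1, -1, -1):       # heapify: O(n)
        sift_down(i, n)
    for end in range(n - 1, 0, -1):           # n-1 extractions, O(log n) each
        a[0], a[end] = a[end], a[0]           # move current max to the back
        sift_down(0, end)
    return a
```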

## Pseudo Code
```
4 changes: 2 additions & 2 deletions Algorithms/Sorting/InsertionSort/readme.md
@@ -3,8 +3,8 @@ Insertion Sort is a simple sorting algorithm that works the way we sort playing

## Complexity
| Best | Average | Worst | Memory | Stable |
|------|---------|-------|--------|--------|
| n | n^2 | n^2 | 1 | Yes |
|:------:|:---------:|:-------:|:--------:|:--------:|
| $O(n)$ | $O(n^2)$ | $O(n^2)$ | $O(1)$ | Yes |
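The playing-card intuition maps directly to code: take the next element and shift larger ones right until it fits. A short Python sketch (illustrative, not from this repository):

```python
def insertion_sort(a):
    """Insertion Sort: O(n) best case (sorted input), O(n^2) otherwise."""
    a = list(a)
    for i in range(1, len(a)):
        x = a[i]                    # next "card" to place
        j = i - 1
        while j >= 0 and a[j] > x:  # shift larger elements one slot right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x                # drop the card into its slot
    return a
```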

## Pseudo Code
```
30 changes: 15 additions & 15 deletions Algorithms/Sorting/MergeSort/readme.md
@@ -1,33 +1,33 @@
# Merge Sort
Merge sort is a sorting algorithm based on the divide and conquer technique. With worst-case time complexity being Ο(n log n), it is one of the most important and commonly used algorithms.
Merge sort is a sorting algorithm based on the divide and conquer technique. With worst-case time complexity being $O(n*log(n))$, it is one of the most important and commonly used algorithms.


## Time and Space Complexity

| Best | Average | Worst | Memory | Stable |
|------|---------|-------|--------|--------|
| n log(n) | n log(n) | n log(n) | n | Yes |
|:------:|:---------:|:-------:|:--------:|:--------:|
| $O(n*log(n))$ | $O(n*log(n))$ | $O(n*log(n))$ | $O(n)$ | Yes |


1. Space Complexity
Auxiliary Space: O(n) Sorting In Place.
Auxiliary Space: $O(n)$; Merge Sort is not an in-place sort.

2. Time Complexity
Merge Sort is a recursive algorithm and its time complexity can be expressed with the following recurrence relation for divide and conquer algorithms.
``
T(n) = aT(n/b) + f(n)

where,
n = size of input
a = 2, we divide our array in half and solve them first, so our number of subproblems is 2
n/b = n/2, we divide our array in half each time we recurse
f(n) = n, we still have to iterate though the list to check it and then combine the two halves together
- $T(n) = aT(n/b) + f(n)$

- where:
- $n$ = size of input
- $a = 2$: we divide the array in half and sort each half, so the number of subproblems is 2
- $n/b = n/2$: we halve the array on each recursive call
- $f(n) = n$: merging the two sorted halves takes a linear pass through the list

T(n) = 2T(n/2) + n
``
The solution for the above recurrence is O(n log(n)).
The list of size n is divided into a max of log(n) parts, and the merging of all sublists into a single list takes O(n) time. Making the worst, best and average-case run time of this algorithm O(n log(n)).

- $T(n) = 2T(n/2) + n$

The solution for the above recurrence is $O(n*log(n))$.
The list of size $n$ is divided into at most $log(n)$ levels of sublists, and merging all sublists at each level takes $O(n)$ time, making the worst-, best- and average-case run time of this algorithm $O(n*log(n))$.
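The recurrence $T(n) = 2T(n/2) + n$ corresponds directly to this sketch (illustrative Python, not code from this repository): two recursive calls on the halves plus one linear merge.

```python
def merge_sort(a):
    if len(a) <= 1:                 # base case of T(n) = 2T(n/2) + n
        return list(a)
    mid = len(a) // 2
    left = merge_sort(a[:mid])      # the 2T(n/2) term: sort each half
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # the +n term: linear merge
        if left[i] <= right[j]:     # <= keeps equal elements stable
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]     # append the leftover tail
```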

## Pseudo Code
```
12 changes: 6 additions & 6 deletions Algorithms/Sorting/QuickSort/readme.md
@@ -3,13 +3,13 @@ Quick Sort is a Divide and Conquer algorithm. It picks an element as pivot and p

## Complexity
| Best | Average | Worst | Memory | Stable |
|------|---------|-------|--------|--------|
| nlog(n) | nlog(n) | n^2 | log(n) | No |
|:------:|:---------:|:-------:|:--------:|:--------:|
| $O(n*log(n))$ | $O(n*log(n))$ | $O(n^2)$ | $O(log(n))$ | No |

- The worst-case time complexity of heap sort is O(n^2).
- The best-case time complexity of heap sort is O(nlogn).
- The average case time complexity of heap sort is O(nlogn).
- The space complexity of heap sort is O(logn).
- The worst-case time complexity of Quick Sort is $O(n^2)$.
- The best-case time complexity of Quick Sort is $O(n*log(n))$.
- The average case time complexity of Quick Sort is $O(n*log(n))$.
- The space complexity of Quick Sort is $O(log(n))$.
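The $O(log(n))$ memory is recursion-stack depth, not an auxiliary array. A minimal in-place Python sketch using Lomuto partitioning (illustrative, not code from this repository):

```python
def quick_sort(a, lo=0, hi=None):
    """In-place Quick Sort; O(n^2) worst case on already-sorted input
    with this last-element pivot choice."""
    if hi is None:
        a = list(a)                 # top-level call sorts a copy
        hi = len(a) - 1
    if lo < hi:
        pivot = a[hi]               # Lomuto partition: last element as pivot
        i = lo
        for j in range(lo, hi):
            if a[j] < pivot:        # grow the "< pivot" prefix
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]   # place pivot at its final index
        quick_sort(a, lo, i - 1)    # recurse on each side of the pivot
        quick_sort(a, i + 1, hi)
    return a
```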

## Pseudo Code
```
10 changes: 5 additions & 5 deletions Algorithms/Sorting/RadixSort/readme.md
@@ -4,13 +4,13 @@ Radix sort is a sorting algorithm that sorts the elements by first grouping the
Suppose, we have an array of 8 elements. First, we will sort elements based on the value of the unit place. Then, we will sort elements based on the value of the tenth place. This process goes on until the last significant place.
## Complexity
| Best | Average | Worst | Memory | Stable |
|------|---------|-------|--------|--------|
| O(d*(n+b)) | O(d*(n+b)) | O(d*(n+b)) | O(b + n) | Yes |
|:------:|:---------:|:-------:|:--------:|:--------:|
| $O(d*(n+b))$ | $O(d*(n+b))$ | $O(d*(n+b))$ | $O(b + n)$ | Yes |

Where:
* n = the number of elements to sort
* k = the maximum key length (number of digit places) of the elements to sort
* b = the base (for example, for the decimal system, b is 10)
* $n$ = the number of elements to sort
* $d$ = the maximum key length (number of digit places) of the elements to sort
* $b$ = the base (for example, for the decimal system, $b$ is 10)
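The $O(d*(n+b))$ bound is $d$ stable passes, each a base-$b$ distribution over the current digit. An LSD sketch for non-negative integers (illustrative Python, not code from this repository):

```python
def radix_sort(a, b=10):
    """LSD Radix Sort: one stable bucket pass per digit, base b."""
    a = list(a)
    if not a:
        return a
    place = 1
    while place <= max(a):          # one pass per digit place: d passes
        buckets = [[] for _ in range(b)]
        for x in a:                 # stable scatter by the current digit
            buckets[(x // place) % b].append(x)
        a = [x for bucket in buckets for x in bucket]  # gather in order
        place *= b
    return a
```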

## Algorithm
```
18 changes: 9 additions & 9 deletions Algorithms/Sorting/SelectionSort/readme.md
@@ -1,15 +1,15 @@
# Selection Sort
Selection Sort is a sorting algorithm, specifically an in-place comparison sort. It has O(n2) time complexity, making it inefficient on large lists, and generally performs worse than the similar insertion sort. Selection sort is noted for its simplicity, and it has performance advantages over more complicated algorithms in certain situations, particularly where auxiliary memory is limited.
Selection Sort is a sorting algorithm, specifically an in-place comparison sort. It has $O(n^2)$ time complexity, making it inefficient on large lists, and generally performs worse than the similar insertion sort. Selection sort is noted for its simplicity, and it has performance advantages over more complicated algorithms in certain situations, particularly where auxiliary memory is limited.

## Complexity
| Best | Average | Worst | Memory | Stable |
|------|---------|-------|--------|--------|
| n^2 | n^2 | n^2 | 1 | No |

- The worst-case time complexity of heap sort is O(n^2).
- The best-case time complexity of heap sort is O(n^2).
- The average case time complexity of heap sort is O(n^2).
- The space complexity of heap sort is O(1).
| Best | Average | Worst | Memory | Stable |
|:--------:|:---------:|:-------:|:--------:|:--------:|
| $O(n^2)$ | $O(n^2)$ | $O(n^2)$ | $O(1)$ | No |

- The worst-case time complexity of Selection Sort is $O(n^2)$.
- The best-case time complexity of Selection Sort is $O(n^2)$.
- The average case time complexity of Selection Sort is $O(n^2)$.
- The space complexity of Selection Sort is $O(1)$.
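The best case is still $O(n^2)$ because every pass scans the whole unsorted suffix regardless of input order; the payoff is at most $n-1$ swaps. A short Python sketch (illustrative, not code from this repository):

```python
def selection_sort(a):
    """Selection Sort: always n(n-1)/2 comparisons, at most n-1 swaps."""
    a = list(a)
    n = len(a)
    for i in range(n - 1):
        m = i                       # index of the minimum of a[i:]
        for j in range(i + 1, n):   # full scan even on sorted input
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]     # one swap per pass
    return a
```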

## Pseudo Code
```
12 changes: 6 additions & 6 deletions Algorithms/Sorting/ShellSort/readme.md
@@ -3,13 +3,13 @@ Shell sort is mainly a variation of Insertion Sort. In insertion sort, we move e

## Complexity
| Best | Average | Worst | Memory | Stable |
|-------------|----------|-------|--------|--------|
| Ω(n log(n)) |θ(nlog(n)2)|O(n2)|1| No |
|:-------------:|:----------:|:-------:|:--------:|:--------:|
| $Ω(n*log(n))$ | $θ(n*log(n)^2)$ | $O(n^2)$ | $O(1)$ | No |

- The worst-case time complexity of heap sort is O(n^2).
- The best-case time complexity of heap sort is Ω(nlogn).
- The average case time complexity of heap sort is θ(nlogn).
- The space complexity of heap sort is O(1).
- The worst-case time complexity of Shell Sort is $O(n^2)$.
- The best-case time complexity of Shell Sort is $Ω(n*log(n))$.
- The average case time complexity of Shell Sort is $θ(n*log(n)^2)$.
- The space complexity of Shell Sort is $O(1)$.
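As a gapped Insertion Sort, the code is the insertion loop with a shrinking stride. A sketch using the simple halving gap sequence, which gives the $O(n^2)$ worst case quoted above (illustrative Python, not code from this repository):

```python
def shell_sort(a):
    """Shell Sort: insertion sort over elements `gap` apart, shrinking gap."""
    a = list(a)
    gap = len(a) // 2               # simple halving gap sequence
    while gap > 0:
        for i in range(gap, len(a)):        # gapped insertion sort
            x = a[i]
            j = i
            while j >= gap and a[j - gap] > x:
                a[j] = a[j - gap]           # shift gap-apart elements right
                j -= gap
            a[j] = x
        gap //= 2
    return a
```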

## Pseudo Code
```Calculate gap size ($gap)
14 changes: 7 additions & 7 deletions Algorithms/Sorting/SwapSort/readme.md
@@ -1,15 +1,15 @@
# Swap Sort
Swap Sort is also a sorting algorithm, which is not known by many.It has O(n) time complexity.It works only on numbers [1 to N] and must not contain duplicates.You might be thinking what is the use of Swap Sort as there is Count Sort which time complexity is O(N) and can also sort data with dupliactes, but Swap sort is useful and efficient in solving many problems namely, find duplicate and missing, find all duplicates and missing from Array.
Swap Sort is a lesser-known sorting algorithm with $O(n)$ time complexity. It works only on the numbers $[1, n]$, which must not contain duplicates. You might wonder what the use of Swap Sort is when Counting Sort also has $O(n)$ time complexity and can sort data with duplicates, but Swap Sort is useful and efficient in solving many problems, namely: find the duplicate and missing number, and find all duplicates and missing numbers in an array.

## Complexity
| Best | Average | Worst | Memory | Stable |
|------|---------|-------|--------|--------|
| n | n | n | 1 | No |
|:------:|:---------:|:-------:|:--------:|:--------:|
| $O(n)$ | $O(n)$ | $O(n)$ | $O(1)$ | No |

- The worst-case time complexity of heap sort is O(n).
- The best-case time complexity of heap sort is O(n).
- The average case time complexity of heap sort is O(n).
- The space complexity of heap sort is O(1).
- The worst-case time complexity of Swap Sort is $O(n)$.
- The best-case time complexity of Swap Sort is $O(n)$.
- The average case time complexity of Swap Sort is $O(n)$.
- The space complexity of Swap Sort is $O(1)$.
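The $O(n)$ bound holds because each value $v$ belongs exactly at index $v-1$, so every element is swapped into place at most once. A Python sketch of this cyclic-placement idea (illustrative, not code from this repository; it assumes the stated precondition of distinct values $1..n$):

```python
def swap_sort(a):
    """Swap (cyclic) Sort: input must be a permutation of 1..n."""
    a = list(a)
    i = 0
    while i < len(a):
        target = a[i] - 1           # correct index for value a[i]
        if a[i] != a[target]:
            a[i], a[target] = a[target], a[i]   # place a[i] where it belongs
        else:
            i += 1                  # a[i] already in place; move on
    return a
```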


## Pseudo Code
4 changes: 2 additions & 2 deletions Algorithms/Sorting/TimSort/readme.md
@@ -5,8 +5,8 @@ TimSort is a sorting algorithm based on Insertion Sort and Merge Sort. It is use
## Complexity

| Best | Average | Worst | Memory | Stable |
| ---- | -------- | -------- | ------ | ------ |
| n | n log(n) | n log(n) | n | Yes |
|:----:|:--------:|:--------:|:------:|:------:|
| $O(n)$ | $O(n*log(n))$ | $O(n*log(n))$ | $O(n)$ | Yes |
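TimSort is easy to try out: CPython's built-in `sorted` and `list.sort` are Timsort implementations, which is where the $O(n)$ best case on already-ordered "runs" comes from.

```python
# CPython's sorted()/list.sort() implement Timsort: they detect
# already-ordered runs, extend short runs with insertion sort,
# and merge runs as in Merge Sort.
data = [5, 21, 7, 23, 19, 1]
print(sorted(data))  # [1, 5, 7, 19, 21, 23]
```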

## Algorithm
