
Go Algorithms Knowledge Base

GoAlgorithms is a curated collection of fundamental algorithms and data structures, all implemented in Golang. Whether you’re learning Golang or brushing up on computer science concepts, this repository serves as a comprehensive guide to common algorithms like sorting, searching, and more.

  • Understanding Complexity
  • Sorting Algorithms
  • Searching Algorithms

Algorithms are the backbone of efficient programming, and knowing how to implement them is key to becoming a skilled developer. With GoAlgorithms, you’ll not only see how these algorithms are coded but also understand their principles and complexity in a clean and readable way. The repository includes detailed step-by-step explanations for every algorithm, making it beginner-friendly yet valuable for seasoned developers looking for Golang implementations.

Key Features of GoAlgorithms:

  • Sorting Algorithms: Explore Bubble Sort, Merge Sort, Quick Sort, and more.
  • Searching Algorithms: Dive into Linear and Binary Search.
  • Clear Explanations: Each algorithm is paired with comments and detailed explanations of its logic and complexity.
  • Real-world Examples: Practical examples to see each algorithm in action.

Click the title of each algorithm to visit its own page with more information.

Understanding Complexity

Understanding time and space complexity is crucial for evaluating the efficiency of algorithms. Here’s a breakdown of both concepts to help you grasp them better:

  1. Time Complexity

Time complexity refers to the amount of time an algorithm takes to complete as a function of the size of the input. It’s essential because it helps us predict how an algorithm will perform as the input size grows.

Big-O Notation (O): This is the standard way to describe time complexity. It expresses an upper bound on running time and is most often used for the worst-case scenario, i.e., the longest an algorithm can take to run.

Common Time Complexities:

  • O(1) - Constant Time: The algorithm’s runtime is independent of the input size. Example: Accessing an array element by index.
  • O(log n) - Logarithmic Time: The algorithm reduces the problem size with each step. Example: Binary Search.
  • O(n) - Linear Time: The runtime increases proportionally with the input size. Example: Linear Search.
  • O(n log n) - Linearithmic Time: More efficient than quadratic time; typical of efficient sorting algorithms. Example: Merge Sort, Quick Sort.
  • O(n²) - Quadratic Time: Runtime increases by the square of the input size. Example: Bubble Sort, Selection Sort.
  • O(2^n) - Exponential Time: Runtime doubles with each additional input element. Example: solving the Traveling Salesman Problem by brute force.

Example: In a Bubble Sort, you have two nested loops that iterate through the array, so the time complexity is O(n²) because the number of comparisons grows quadratically with the input size.
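
To make these classes concrete, here is a minimal sketch in Go (the function names are ours, invented purely for illustration), with one function per common class:

// ConstantTime is O(1): a single array access, independent of input size.
// (Illustrative only; assumes a non-empty slice.)
func ConstantTime(array []int) int {
	return array[0]
}

// LinearTime is O(n): the loop body runs once per element.
func LinearTime(array []int) int {
	sum := 0
	for _, v := range array {
		sum += v
	}
	return sum
}

// QuadraticTime is O(n²): two nested loops over the same input.
func QuadraticTime(array []int) int {
	pairs := 0
	for range array {
		for range array {
			pairs++
		}
	}
	return pairs
}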

  2. Space Complexity

Space complexity measures the amount of memory or storage an algorithm uses as a function of the input size. Efficient algorithms minimize memory usage.

Common Space Complexities:

  • O(1) - Constant Space: The algorithm uses a fixed amount of memory, regardless of input size. Example: Swapping two variables.
  • O(n) - Linear Space: Memory usage grows proportionally with input size. Example: Storing an additional array of size n.
  • O(log n) - Logarithmic Space: Common in algorithms that shrink the problem size logarithmically. Example: the recursion stack of a recursive Binary Search.

Example: A Merge Sort requires additional memory to store subarrays, so its space complexity is O(n), where n is the size of the input.
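
For contrast, here is a minimal sketch (function names are ours) of constant versus linear extra space:

// SwapInPlace uses O(1) extra space: no allocation, only two slots change.
func SwapInPlace(array []int, i int, j int) {
	array[i], array[j] = array[j], array[i]
}

// ReverseCopy uses O(n) extra space: it allocates a second slice
// whose length grows with the input.
func ReverseCopy(array []int) []int {
	result := make([]int, len(array))
	for i, v := range array {
		result[len(array)-1-i] = v
	}
	return result
}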

How to Analyze an Algorithm’s Complexity:

To determine time and space complexity, follow these steps:

  1. Count Loops and Recursive Calls: The number of iterations in loops or recursive calls often gives you the time complexity.
  • A single loop that runs n times = O(n).
  • Two nested loops each running n times = O(n²).
  2. Consider Function Calls:
  • If your algorithm calls another function repeatedly, account for that function’s complexity. For example, Merge Sort halves the array at each level (about log n levels) and does O(n) merging work per level, which multiplies out to O(n log n) time complexity.
  3. Look at Memory Usage:
  • Analyze how much extra memory (variables, arrays, etc.) is needed. If the algorithm uses a fixed amount of memory regardless of input size, it’s O(1); if it uses memory proportional to the input size, it’s O(n).
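
Putting these steps together, here is a short worked example on an invented function (the name and task are hypothetical), with each part annotated with the complexity it contributes:

// CountPairsAbove counts pairs (x, y), x < y, whose sum exceeds limit.
func CountPairsAbove(array []int, limit int) int {
	count := 0 // O(1) space: one counter, regardless of input size
	for x := 0; x < len(array); x++ { // outer loop: n iterations
		for y := x + 1; y < len(array); y++ { // inner loop: up to n iterations
			if array[x]+array[y] > limit {
				count++
			}
		}
	}
	// Two nested loops over n elements => O(n²) time, O(1) extra space.
	return count
}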

Selection Sort

Selection Sort is a simple and intuitive sorting algorithm. Imagine you’re organizing books on a shelf. Each time, you go through the remaining unsorted books to find the smallest (or largest, if sorting in descending order) and place it in its correct position. This process repeats until all the books are sorted.

  • Time complexity of O(n²)
  • Space complexity of O(1)

Algorithm example:

func SelectionSort(array []int) []int {
	for x := 0; x < len(array)-1; x++ {
		// Find the index of the smallest element in the unsorted part.
		min := x
		for y := x + 1; y < len(array); y++ {
			if array[y] < array[min] {
				min = y
			}
		}
		// Move it into its final position with a single swap per pass.
		array[x], array[min] = array[min], array[x]
	}

	return array
}

Insertion Sort

Insertion Sort is a straightforward sorting algorithm. The basic idea is simple: it builds a sorted list one item at a time. It’s not as fast as algorithms like Quicksort or Merge Sort when dealing with large datasets, but it shines when working with small or mostly sorted lists. Here’s a more intuitive way to think about it: imagine you’re sorting a deck of cards. You pick up one card at a time and place it in its correct position relative to the ones you’ve already sorted. This is exactly how Insertion Sort works—one element at a time, placing it where it belongs.

  • Time complexity of O(n²)
  • Space complexity of O(1)

Algorithm example:

func InsertionSort(array []int) []int {
	for x := 1; x < len(array); x++ {
		// Take the next unsorted element and shift larger sorted
		// elements right until its correct position is found.
		key := array[x]
		y := x - 1
		for y >= 0 && array[y] > key {
			array[y+1] = array[y]
			y--
		}
		array[y+1] = key
	}

	return array
}

Bubble Sort

Bubble Sort is one of the simplest sorting algorithms out there. It works by repeatedly comparing adjacent elements in the array and swapping them if they are in the wrong order. This way, the largest unsorted element “bubbles up” to its correct position after each pass through the array. It’s easy to understand but not very efficient for large arrays. It has a time complexity of O(n²), meaning it becomes slow when sorting big lists. But for small lists or teaching purposes, Bubble Sort is great.

  • Time complexity of O(n²)
  • Space complexity of O(1)

Algorithm example:

func BubbleSort(array []int) []int {
	for x := 0; x < len(array)-1; x++ {
		// After each pass the largest unsorted element has bubbled up
		// to the end, so the inner loop can stop x elements earlier.
		for y := 0; y < len(array)-1-x; y++ {
			if array[y] > array[y+1] {
				// Swap adjacent elements that are out of order.
				array[y], array[y+1] = array[y+1], array[y]
			}
		}
	}

	return array
}

Merge Sort

Merge Sort is a famous divide-and-conquer algorithm used for sorting arrays. It works by splitting the array into smaller subarrays, sorting each subarray, and then merging them back together. It was invented by John von Neumann in 1945 and is efficient with a time complexity of O(n log n), making it much faster than simpler algorithms like Bubble Sort and Selection Sort for large datasets.

  • Time complexity of O(n log n)
  • Space complexity of O(n)

Algorithm example:

func MergeSort(array []int) []int {
	// A slice of length 0 or 1 is already sorted.
	if len(array) <= 1 {
		return array
	}

	// Sort each half recursively, then merge the sorted halves.
	middle := len(array) / 2
	left := MergeSort(array[:middle])
	right := MergeSort(array[middle:])

	return MergeHalves(left, right)
}

func MergeHalves(left []int, right []int) []int {
	sortedArray := make([]int, len(left)+len(right))
	x := 0 // next unconsumed element of left
	y := 0 // next unconsumed element of right

	for k := 0; k < len(sortedArray); k++ {
		switch {
		// Left half exhausted: take from the right.
		case x >= len(left):
			sortedArray[k] = right[y]
			y++

		// Right half exhausted: take from the left.
		case y >= len(right):
			sortedArray[k] = left[x]
			x++

		// Otherwise take the smaller of the two front elements.
		case left[x] < right[y]:
			sortedArray[k] = left[x]
			x++

		default:
			sortedArray[k] = right[y]
			y++
		}
	}

	return sortedArray
}
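
Note that, unlike the in-place sorts above, this implementation returns a newly allocated slice and leaves its input unchanged. A minimal usage sketch (sample values invented; assumes fmt is imported):

nums := []int{5, 2, 9, 1}
sorted := MergeSort(nums)
fmt.Println(sorted) // [1 2 5 9]
fmt.Println(nums)   // [5 2 9 1] - the input slice is not modified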

Quick Sort

Quick Sort is a highly efficient sorting algorithm, known for its divide-and-conquer approach. Invented by Tony Hoare in 1959, the algorithm works by selecting a “pivot” element from the array and partitioning the other elements into two sub-arrays based on whether they are less than or greater than the pivot. It recursively applies this process until the array is fully sorted.

  • Time complexity of O(n log n) on average, O(n²) in the worst case
  • Space complexity of O(log n)

Algorithm example:

func QuickSort(array []int, low int, high int) []int {
	if low >= high {
		return array
	}

	// Partition around a pivot, then sort the two sides recursively.
	p := QuickPartition(array, low, high)
	QuickSort(array, low, p-1)
	QuickSort(array, p+1, high)

	return array
}

func QuickPartition(array []int, low int, high int) int {
	// Lomuto partition scheme: the last element is the pivot.
	pivot := array[high]
	x := low - 1

	for y := low; y < high; y++ {
		// Move every element smaller than the pivot to the left side.
		if array[y] < pivot {
			x++
			array[x], array[y] = array[y], array[x]
		}
	}

	// Place the pivot between the smaller and larger elements.
	array[x+1], array[high] = array[high], array[x+1]

	return x + 1
}
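
Because QuickSort takes explicit bounds, the initial call passes the first and last indices of the slice. A minimal usage sketch (sample values invented; assumes fmt is imported):

nums := []int{5, 2, 9, 1}
QuickSort(nums, 0, len(nums)-1) // sort the whole slice in place
fmt.Println(nums)               // [1 2 5 9]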

Linear Search

Linear Search (or sequential search) is the simplest search algorithm that works by checking each element in a list sequentially until the desired element is found, or the list is fully traversed. It works equally well on both sorted and unsorted arrays but is inefficient for large datasets due to its linear time complexity.

  • Time complexity of O(n)
  • Space complexity of O(1)

Algorithm example:

func LinearSearch(array []int, toSearch int) (bool, int, int) {
	// Check every element in order until a match is found.
	// Note: the loop must stop at len(array)-1 to avoid an
	// out-of-range index.
	for x := 0; x < len(array); x++ {
		if array[x] == toSearch {
			// Found: report success, the index, and the value.
			return true, x, array[x]
		}
	}

	return false, -1, -1
}
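
The three return values report whether the target was found, at which index, and the value itself. A minimal usage sketch (sample values invented; assumes fmt is imported):

nums := []int{7, 3, 5}
found, index, value := LinearSearch(nums, 5)
fmt.Println(found, index, value) // true 2 5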

Binary Search

Binary Search is an efficient algorithm for finding a target value within a sorted array. It repeatedly divides the search interval in half: if the target value is less than the middle element, the search continues in the left half; otherwise, it continues in the right half. This approach dramatically reduces the number of comparisons compared to a linear search.

  • Time complexity of O(log n)
  • Space complexity of O(1)

Algorithm example:

func BinarySearch(array []int, low int, high int, toSearch int) (bool, int, int) {
	for low <= high {
		// Overflow-safe midpoint of the current search interval.
		mid := low + (high-low)/2

		switch {
		case array[mid] == toSearch:
			// Found: report success, the index, and the value.
			return true, mid, array[mid]

		case array[mid] < toSearch:
			// Target is larger: discard the left half.
			low = mid + 1

		default:
			// Target is smaller: discard the right half.
			high = mid - 1
		}
	}

	return false, -1, -1
}
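
BinarySearch requires the input to be sorted and, like QuickSort, takes explicit bounds on the initial call. A minimal usage sketch (sample values invented; assumes fmt is imported):

nums := []int{1, 3, 5, 7, 9} // must already be sorted
found, index, value := BinarySearch(nums, 0, len(nums)-1, 7)
fmt.Println(found, index, value) // true 3 7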

License

MIT