From a00340bc962804dd2bfae06772385810736b2ebd Mon Sep 17 00:00:00 2001 From: aheaton22 <117932053+aheaton22@users.noreply.github.com> Date: Sat, 25 Nov 2023 08:53:30 -0700 Subject: [PATCH] Algorithms page update (#727) * checked all links, stubbed out new sections * added all algorithms to main page, started to fix and verify links * working on links * fixed graph links * finished cleaning up links * updated section explanations * fixed links in algorithm subpages * header and spelling fixes --- Algorithms/Backtracking/README.md | 19 +- Algorithms/Backtracking/Word Boggle/README.md | 2 +- Algorithms/Branch and Bound/README.md | 10 +- Algorithms/Dynamic Programming/README.md | 15 +- Algorithms/Graphs/readme.md | 24 ++- Algorithms/Greedy Algorithm/readme.md | 8 +- Algorithms/Searching/readme.md | 5 +- Algorithms/Sliding Window Algorithm/readme.md | 2 +- Algorithms/Sorting/readme.md | 9 +- Algorithms/readme.md | 202 ++++++++++++------ 10 files changed, 194 insertions(+), 102 deletions(-) diff --git a/Algorithms/Backtracking/README.md b/Algorithms/Backtracking/README.md index ade9ea7c..a70948a0 100644 --- a/Algorithms/Backtracking/README.md +++ b/Algorithms/Backtracking/README.md @@ -4,17 +4,18 @@ Backtracking is an algorithmic-technique for solving problems recursively by try ## Some Popular Backtracking Algorithms - +* [Sudoku](Sudoku/readme.md) +* [M Coloring](M%20Colouring%20Problem/readme.md) +* [Sum of Subset](Subset%20Sum/README.md) +* [Knight's Tour](The%20Knight’s%20tour%20problem/README.md) +* [Word Boggle](Word%20Boggle/README.md) * [Rat in a Maze](Rat%20in%20a%20Maze/README.md) -* [The Knight’s tour problem](The%20Knight’s%20tour%20problem/README.md) * [N Queen Problem](N%20Queen%20Problem/README.md) -* [M Coloring Problem](m%20colouring/README.md) -* [Sudoku](Sudoku/readme.md) * [Cryptarithmetic Puzzles](Cryptarithmetic%20Puzzles/README.md) -* [Subset Sum Algorithm](Subset%20Sum/README.md) -* [Tug of War](Tug%20of%20War/README.md) -* [Word Boggle](Word%20Boggle/README.md) -* [Permutation of a Given String](Permutation%20Of%20a%20Given%20String/README.md) -* [Generating IP address](Generate%20IP%20Adresses/README.md) +* [All Possible Paths](Find%20All%20Possible%20Path/README.md) +* [Generate IP Addresses](Generate%20IP%20Adresses/README.md) * [Remove Invalid Parentheses](Remove%20Invalid%20Parentheses/README.md) * [Longest Possible Route in a Matrix with Hurdles](Longest%20Possible%20Route%20in%20a%20Matrix%20with%20Hurdles/README.md) +* [Permutation of a Given String](Permutation%20Of%20a%20Given%20String/README.md) +* [Magnet Puzzle](Magnet%20Puzzle/README.md) +* [Tug of War](Tug%20of%20War/README.md) diff --git a/Algorithms/Backtracking/Word Boggle/README.md b/Algorithms/Backtracking/Word Boggle/README.md index db680c4f..ddedca67 100644 --- a/Algorithms/Backtracking/Word Boggle/README.md +++ b/Algorithms/Backtracking/Word Boggle/README.md @@ -1,4 +1,4 @@ -## Largest number in K swaps +## Word Boggle Given a dictionary of distinct words and an M x N board where every cell has one character. Find all possible words from the dictionary that can be formed by a sequence of adjacent characters on the board. 
We can move to any of 8 adjacent characters diff --git a/Algorithms/Branch and Bound/README.md b/Algorithms/Branch and Bound/README.md index 515b300a..85bd9b62 100644 --- a/Algorithms/Branch and Bound/README.md +++ b/Algorithms/Branch and Bound/README.md @@ -10,14 +10,14 @@ The general idea of Branch and Bound algorithm is a BFS-like search for the opti - 0/1 knapsack problem - Job Assignment Problem -- Travelling Salesman Problem -- Nearest Neighbour search +- Traveling Salesman Problem +- Nearest Neighbor search - N-Queens Problem ## Popular Dynamic Programming Algorithms * [0/1 knapsack problem](Knapsack%20Problem/README.md) * [Job Assignment Problem](Job%20Assignment%20Problem/README.md) -* [Travelling Salesman Problem](Not-Added) -* [Nearest Neighbour search](Not-Added) -* [N-Queens](Not-Added) +* [Traveling Salesman Problem](../Traveling%20Salesman%20Problem/readme.md) +* Nearest Neighbor search +* [N-Queens](../Backtracking/N%20Queen%20Problem/README.md) diff --git a/Algorithms/Dynamic Programming/README.md b/Algorithms/Dynamic Programming/README.md index 03edcf8b..7e1685b4 100644 --- a/Algorithms/Dynamic Programming/README.md +++ b/Algorithms/Dynamic Programming/README.md @@ -22,14 +22,21 @@ Dynamic Programming algorithm is designed using the following four steps − ### Popular Dynamic Programming Algorithms * [Fibonacci Sequence](Fibonacci%20Sequence/README.md) +* [Nth Fibonacci](Nth%20Fibonnaci/README.md) +* [Nth Catalan Number/Sequence](Nth%20Catalan%20Number/README.md) * [Longest Common Subsequence](Longest%20Common%20Subsequence/README.md) * [Longest Increasing Subsequence](Longest%20Increasing%20Subsequence/README.md) +* [Longest Common Substring](Longest%20Common%20Substring/readme.md) +* [Longest Palindromic Substring](Longest%20Palindromic%20Substring/README.md) * [Knapsack Problem](Knapsack%20Problem/README.md) * [Edit Distance](Edit%20Distance/README.md) -* [Coin Change](Coin%20Change/README.md) +* Coin Change * [Matrix Chain Multiplication](Matrix%20Chain%20Multiplication/README.md) +* [Balanced Tree Count](Count%20Balanced%20Binary%20Trees%20of%20Height%20h/readme.md) +* [Counting Hops](Count%20Number%20Of%20Hops/README.md) * [Floyd Warshall Algorithm](Floyd%20Warshall%20Algorithm/readme.md) -* [Subset Sum Problem](Subset%20Sum%20Problem/readme.md) +* [Gold Mine Problem](Gold%20Mine%20Problem/README.md) +* [Least Common Multiple (LCM)](LCM/LCM.md) * [Painting Fence Algorithm](Painting%20Fence%20Algorithm/readme.md) -* [Coin Change](Coin%20Change/readme.md) -* [Longest Common Substring](Longest%20Common%20Substring/readme.md) +* [Staircase](Staircase/README.md) +* [Subset Sum Problem](Subset%20Sum%20Problem/readme.md) \ No newline at end of file diff --git a/Algorithms/Graphs/readme.md b/Algorithms/Graphs/readme.md index 1c188720..e8c8ef0c 100644 --- a/Algorithms/Graphs/readme.md +++ b/Algorithms/Graphs/readme.md @@ -3,7 +3,7 @@ A Graph is a non-linear data structure consisting of vertices and edges. The ver ## Components of a Graph -- **[Vertices](#verticies)** - A vertex is a node of the graph. It can be denoted by any symbol such as V, U, X, Y, etc. A vertex may also be referred to as a node or a point. +- **[Vertices](#vertices)** - A vertex is a node of the graph. It can be denoted by any symbol such as V, U, X, Y, etc. A vertex may also be referred to as a node or a point. - **[Edges](#edges)** - An edge is a connection between two nodes. It can be denoted by any symbol such as E, F, G, H, etc. An edge may also be referred to as a link or a line. 
- **[Weight](#weight)** - A weight is a value associated with an edge. It can be denoted by any symbol such as W, X, Y, Z, etc. A weight may also be referred to as a cost. @@ -102,18 +102,22 @@ Pic Will be added ## Graph Topics 1. [Graph Traversal](Traversal%20Algorithms/readme.md) - [Breadth First Search](Traversal%20Algorithms/BreadthFirstSearch/readme.md) - - [Depth First Search](Traversal%20Algorithms/DepthFirstSearch/readme.md)] -2. [Cycle Detection](Cycle%20Detection/readme.md) - - [Undirected Graph](Cycle%20Detection/Undirected%20Graph/readme.md) + - [A* Search](Traversal%20Algorithms/AstarSearch/readme.md) + - [Depth First Search](Traversal%20Algorithms/DepthFirstSearch/readme.md) +2. [Topological Sorts](Topological%20Sort/readme.md) +3. [Cycle Detection](Cycle%20Detection/readme.md) + - Undirected Graph - [DFS](Cycle%20Detection/Undirected%20Graph/DFS/readme.md) - [BFS](Cycle%20Detection/Undirected%20Graph/BFS/readme.md) - [Directed Graph](Cycle%20Detection/Directed%20Graph/readme.md) - - [DFS](notadded) - - [BFS](notadded) -3. [Shortest Path](Shortest%20Path/readme.md) +4. [Shortest Path](Shortest%20Path/readme.md) - [Dijkstra's Algorithm](Traversal%20Algorithms/Dijkstra'sAlgorithm/readme.md) - [Bellman Ford Algorithm](Shortest%20Path/BellmanFordAlgorithm/readme.md) + - [Bellman-Ford Algorithm](Bellman-Ford%20Algorithm/readme.md) - [Floyd Warshall Algorithm](Shortest%20Path/FloydWarshallAlgorithm/readme.md) -4. [Spanning Tree Algorithm](Spanning%20Tree%20Algorithm/readme.md) - - [KruskalsAlgorithm](Spanning%20Tree%20Algorithm\KruskalsAlgorithm/readme.md) - - [PrimsAlgorithm](Spanning%20Tree%20Algorithm\PrimsAlgorithm/readme.md)] \ No newline at end of file +5. [Spanning Tree Algorithm](Spanning%20Tree%20Algorithm/readme.md) + - [KruskalsAlgorithm](Spanning%20Tree%20Algorithm/KruskalsAlgorithm/readme.md) + - [PrimsAlgorithm](Spanning%20Tree%20Algorithm/PrimsAlgorithm/readme.md) + - [Disjoint Set Union / Union find](DSU/readme.md) +6. Strongly Connected Components + - [Kosaraju's Algorithm](Kosaraju%20Algorithm/readme.md) \ No newline at end of file diff --git a/Algorithms/Greedy Algorithm/readme.md b/Algorithms/Greedy Algorithm/readme.md index 1f7bc47c..03cb80b9 100644 --- a/Algorithms/Greedy Algorithm/readme.md +++ b/Algorithms/Greedy Algorithm/readme.md @@ -2,11 +2,11 @@ Greedy algorithms are a simple, intuitive class of algorithms that can be used to find the optimal solution to some optimization problems. They are called greedy because at each step they make the choice that seems best at that moment. This means that greedy algorithms do not guarantee to return the globally optimal solution, but instead make locally optimal choices in the hope of finding a global optimum. Greedy algorithms are used for optimization problems. An optimization problem can be solved using Greedy if the problem has the following property: at every step, we can make a choice that looks best at the moment, and we get the optimal solution to the complete problem. 
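To make the greedy choice property concrete before the list below, here is a minimal sketch using activity selection (listed below but not yet linked). The function name and the `(start, finish)` pair representation are illustrative assumptions, not code taken from this repository. Sorting by finish time and always taking the earliest-finishing compatible activity is the locally best choice, and for this particular problem it also yields a globally optimal answer.

```python
# Minimal greedy sketch: activity selection (illustrative only; names are not
# taken from this repository's solutions).
def select_activities(activities):
    """Pick a maximum-size set of non-overlapping (start, finish) activities."""
    chosen = []
    last_finish = float("-inf")
    # Greedy choice: always consider the activity that finishes earliest.
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:  # compatible with everything chosen so far
            chosen.append((start, finish))
            last_finish = finish
    return chosen

print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (8, 9), (5, 9)]))
# [(1, 4), (5, 7), (8, 9)]
```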
## Popular Greedy Algorithms -* [Activity Selection](Not-added) +* Activity Selection * [Huffman Coding](Huffman%20Coding%20Algorithm/readme.md) * [Prim's Algorithm](Prim%27s%20Algorithm/readme.md) * [Kruskal's Algorithm](Krushkal%27s%20Algorithm/readme.md) * [Dijkstra's Algorithm](Dijkstra%27s%20Algorithm/readme.md) -* [Job Sequencing](not-added) -* [Fractional Knapsack](not-added) -* [Huffman Decoding](not-added) +* Job Sequencing +* Fractional Knapsack +* [Ford-Fulkerson Algorithm](Ford-Fulkerson/readme.md) diff --git a/Algorithms/Searching/readme.md b/Algorithms/Searching/readme.md index 1ae70748..d2d92f6a 100644 --- a/Algorithms/Searching/readme.md +++ b/Algorithms/Searching/readme.md @@ -3,6 +3,9 @@ A search algorithm is the step-by-step procedure used to locate specific data am * [Sequential / Linear Search](./SequentialSearch/readme.md) +* [Linear Search](./LinearSearch/readme.md) * [Binary Search](./BinarySearch/readme.md) * [Hashing](./Hashing/readme.md) - +* Jump Search +* Interpolation Search +* Exponential Search diff --git a/Algorithms/Sliding Window Algorithm/readme.md b/Algorithms/Sliding Window Algorithm/readme.md index 0098543a..f28ea6cb 100644 --- a/Algorithms/Sliding Window Algorithm/readme.md +++ b/Algorithms/Sliding Window Algorithm/readme.md @@ -19,7 +19,7 @@ Sliding window can be used on fixed-length windows or variable-length windows. Eg. Find the largest subarray in a given array with sum equal to k. -You will understand more cleary when we will solve the questions by both the methods. +You will understand more clearly when we will solve the questions by both the methods. ### Popular Sliding Window Algorithms diff --git a/Algorithms/Sorting/readme.md b/Algorithms/Sorting/readme.md index e5de2570..6e26eb6d 100644 --- a/Algorithms/Sorting/readme.md +++ b/Algorithms/Sorting/readme.md @@ -8,6 +8,7 @@ the study of algorithms. There are several reasons: Sometimes an application inherently needs to sort information. For example, in order to prepare customer statements, banks need to sort checks by check number. + Algorithms often use sorting as a key subroutine. For example, a program that renders graphical objects which are layered on top of each other might have to sort the objects according to an “above” relation so that it can draw these @@ -18,16 +19,15 @@ ploy a rich set of techniques. In fact, many important techniques used through- out algorithm design appear in the body of sorting algorithms that have been developed over the years. In this way, sorting is also a problem of historical interest. + We can prove a nontrivial lower bound for sorting (as we shall do in Chapter 8). Our best upper bounds match the lower bound asymptotically, and so we know that our sorting algorithms are asymptotically optimal. Moreover, we can use the lower bound for sorting to prove lower bounds for certain other problems. -Many engineering issues come to the fore when implementing sorting algo- -rithms. The fastest sorting program for a particular situation may depend on +Many engineering issues come to the fore when implementing sorting algorithms. The fastest sorting program for a particular situation may depend on many factors, such as prior knowledge about the keys and satellite data, the memory hierarchy (caches and virtual memory) of the host computer, and the -software environment. Many of these issues are best dealt with at the algorith- -mic level, rather than by “tweaking” the code. +software environment. 
Many of these issues are best dealt with at the algorithmic level, rather than by “tweaking” the code. ### Popular Sorting Algorithms - [Bubble Sort](BubbleSort/readme.md) @@ -41,4 +41,3 @@ mic level, rather than by “tweaking” the code. - [Shell Sort](ShellSort/readme.md) - [Count Sort](CountSort/readme.md) - [Tim Sort](TimSort/readme.md) - diff --git a/Algorithms/readme.md b/Algorithms/readme.md index b94c349e..0064fee6 100644 --- a/Algorithms/readme.md +++ b/Algorithms/readme.md @@ -1,7 +1,17 @@ # Algorithms -Algorithms are the sets of steps necessary to complete computation - they are at the heart of what our devices actually do. And this isn’t a new concept. Since the development of math itself algorithms have been needed to help us complete tasks more efficiently, but today we’re going to take a look a couple modern computing problems like sorting and graph search, and show how we’ve made them more efficient so you can more easily find cheap airfare or map directions to Winterfell... or like a restaurant or something. +Algorithms are the sets of steps necessary to complete computation - they are at the heart of what our devices do. And this isn’t a new concept. Since the development of math itself algorithms have been needed to help us complete tasks more efficiently, but today we’re going to look a couple modern computing problems like sorting and graph search and show how we’ve made them more efficient so you can more easily find cheap airfare or map directions to Winterfell... or like a restaurant or something. -Put simply, Algorithm is a step-by-step procedure, which defines a set of instructions to be executed in a certain order to get the desired output. +Basically, an Algorithm is a step-by-step procedure, which defines a set of instructions to be executed in a certain order to get the desired output. + +## [Time Complexity](Time%20Complexity/readme.md) +The time complexity of an algorithm is an estimate of how much time it will take for an algorithm to run for a selected input. In other words, it describes how the run time of an algorithm grows as the input size grows. By calculating the time complexity, we can find out whether the algorithm is fast enough without implementing it. Normally written as O Notation but Ω and Θ Notation are also used. An algorithm's time complexity can range from constant to factorial. + +## [Space Complexity](Space%20Complexity/readme.md) +The space complexity of an algorithm refers to the total amount of memory space used by the algorithm. It’s the space of the input values and the space used while it is executed. Normally written as O Notation but Ω and Θ Notation are also used. An algorithm's space complexity can range from constant to factorial but is normally closer to the input size. + +## Algorithm Design Techniques +When creating algorithms there are a few techniques that can be used to reduce the time complexity of an algorithm. This allows for larger inputs to be calculated at faster times. +* [Sliding Window](Sliding%20Window%20Algorithm/readme.md) ## [Sorting](Sorting/readme.md) Sorting is the process of arranging a list of items in a particular order. For example, if you had a list of names, you might want to sort them alphabetically. Or if you had a list of numbers, you might want to put them in order from smallest to largest. Sorting is a common task, and it’s one that we can do in many different ways. @@ -16,84 +26,152 @@ Sorting is the process of arranging a list of items in a particular order. 
For e
* [Radix Sort](Sorting/RadixSort/readme.md)
* [Shell Sort](Sorting/ShellSort/readme.md)
* [Count Sort](Sorting/CountSort/readme.md)
+* [Tim Sort](Sorting/TimSort/readme.md)
+
-### Popular Searching Algorithms
-* [Sequential/Linear Search](Searching/SequentialSearch/readme.md)
+## [Searching](Searching/readme.md)
+Searching is the process of finding a certain target element inside a container. Searching Algorithms are designed to check for an element or retrieve an element from any data structure where it is stored.
+### Popular Searching Algorithms
+* [Sequential](Searching/SequentialSearch/readme.md)
+* [Linear Search](Searching/LinearSearch/readme.md)
* [Binary Search](Searching/BinarySearch/readme.md)
+* [Hashing](Searching/Hashing/readme.md)
+* Jump Search
+* Interpolation Search
+* Exponential Search
+
+
+## [Graph Algorithms](Graphs/readme.md)
+A graph is a data structure that consists of a finite (and possibly changeable) set of vertices, which are also called nodes or points, e.g. V = (A, B, C, ..). To represent relations between vertices we have a set of unordered pairs called edges, e.g. E = ((A,B), (A,C), ..). These edges are also referred to as arcs, lines, or arrows. More formally, a Graph is defined by a set of vertices (V) and a set of edges (E), and is denoted G(V, E).
+Graphs are used to model pairwise relations between objects and are among the most useful data structures for many real-world applications. For example, the airline route network is a graph in which the cities are the vertices, and the flight routes are the edges. Graphs are also used to represent networks. The Internet can be modeled as a graph in which the computers are the vertices and the links between computers are the edges. Graphs are also used in social networks like LinkedIn and Facebook. In fact, graphs are used to represent circuit design, aeronautical scheduling, and many other things.
+
+
+### Graph Components
+- Edges - An edge is a connection between two nodes.
+- Weight - A weight is a value associated with an edge.
+- Vertices - A vertex is a node of the graph. Each vertex has a degree, the number of edges connected to it. A vertex's in-degree is the number of edges that point to it, and its out-degree is the number of edges that point away from it. There can be many different types of vertices, such as:
+  - Isolated vertex - Has no incoming and no outgoing edges. Its in-degree and out-degree are zero.
+  - Source vertex - Has no edges pointing to it; its in-degree is zero.
+  - Sink vertex - Has no outgoing edges; its out-degree is zero.
+  - [Articulation Points](Graphs/Articulation%20Points/readme.md) - A vertex of an undirected graph whose removal disconnects the graph (increases its number of connected components).
+
+### Types of Graphs
+- Undirected Graph - A graph where edges of the graph are two-way paths or relations.
+- Directed Graph - A graph where edges of the graph go only one way, usually marked with arrows.
+- Weighted Graph - A graph where edges of the graph have costs or weights associated with them.
+- [Tree Graphs](Graphs/Tree%20Based%20Algorithms/readme.md) - A graph with n vertices and n-1 edges where there is exactly one path between any two vertices.
+
+### Graph Topics and Algorithms
+1.
[Graph Traversal](Graphs/Traversal%20Algorithms/readme.md) + - [Breadth First Search (BFS)](Graphs/Traversal%20Algorithms/BreadthFirstSearch/readme.md) + - [A* Search](Graphs/Traversal%20Algorithms/AstarSearch/readme.md) + - [Depth First Search (DFS)](Graphs/Traversal%20Algorithms/DepthFirstSearch/readme.md) +2. [Topological Sorts](Graphs/Topological%20Sort/readme.md) +3. [Cycle Detection](Graphs/Cycle%20Detection/readme.md) + - Undirected Graph + - [Using DFS](Graphs/Cycle%20Detection/Undirected%20Graph/DFS/readme.md) + - [Using BFS](Graphs/Cycle%20Detection/Undirected%20Graph/BFS/readme.md) + - [Directed Graph](Graphs/Cycle%20Detection/Directed%20Graph/readme.md) +4. [Shortest Path](Graphs/Shortest%20Path/readme.md) + - [Dijkstra's Algorithm](Graphs/Traversal%20Algorithms/Dijkstra'sAlgorithm/readme.md) + - [Bellman Ford Algorithm](Graphs/Shortest%20Path/BellmanFordAlgorithm/readme.md) + - [Bellman-Ford Algorithm](Graphs/Bellman-Ford%20Algorithm/readme.md) + - [Floyd Warshall Algorithm](Graphs/Shortest%20Path/FloydWarshallAlgorithm/readme.md) +5. [Spanning Tree Algorithm](Graphs/Spanning%20Tree%20Algorithm/readme.md) + - [Kruskals Algorithm](Graphs/Spanning%20Tree%20Algorithm/KruskalsAlgorithm/readme.md) + - [Prims Algorithm](Graphs/Spanning%20Tree%20Algorithm/PrimsAlgorithm/readme.md) + - [Disjoint Set Union / Union find](Graphs/DSU/readme.md) +6. Strongly Connected Components + - [Kosaraju's Algorithm](Graphs/Kosaraju%20Algorithm/readme.md) + + +## [Greedy Algorithms](Greedy%20Algorithm/readme.md) +Greedy algorithms are a simple, intuitive class of algorithms that can be used to find the optimal solution to some optimization problems. An optimization problem can be solved using Greedy if the problem has the following property: at every step, we can make a choice that looks best at the moment, and we get the optimal solution of the complete problem. They are called greedy because at each step they make the choice that seems best at that moment. This means that greedy algorithms do not guarantee to return the globally optimal solution, but instead make locally optimal choices in the hope of finding a global optimum. + +### Greedy Algorithms +* Activity Selection +* [Huffman Coding](Greedy%20Algorithm/Huffman%20Coding%20Algorithm/readme.md) +* [Prim's Algorithm](Greedy%20Algorithm/Prim%27s%20Algorithm/readme.md) +* [Kruskal's Algorithm](Greedy%20Algorithm/Krushkal%27s%20Algorithm/readme.md) +* [Dijkstra's Algorithm](Greedy%20Algorithm/Dijkstra's%20Algorithm/readme.md) +* Job Sequencing +* Fractional Knapsack +* [Ford-Fulkerson Algorithm](Greedy%20Algorithm/Ford-Fulkerson/readme.md) + +## [String Based Algorithms](String%20Based%20Algorithms/readme.md) +String-based algorithms are algorithms limited to strings. They are used for operations like searching for a specific substring, pattern matching, and other text processing. + +### Popular String Algorithms +* [Knuth-Morris-Pratt(KMP)](String%20Based%20Algorithms/KMP/readme.md) +* [Rabin Karp](String%20Based%20Algorithms/RabinKarp/readme.md) +* [Suffix Trie](String%20Based%20Algorithms/SuffixTrie/readme.md) +* Boyer-Moore Algorithm +* String Hashing -## [Graph Search](Graph%20Search/readme.md) -Graph search is the process of searching through a graph to find a particular node. A graph is a data structure that consists of a finite (and possibly mutable) set of vertices or nodes or points, together with a set of unordered pairs of these vertices for an undirected graph or a set of ordered pairs for a directed graph. 
These pairs are known as edges, arcs, or lines for an undirected graph and as arrows, directed edges, directed arcs, or directed lines for a directed graph. The vertices may be part of the graph structure, or may be external entities represented by integer indices or references. Graphs are one of the most useful data structures for many real-world applications. Graphs are used to model pairwise relations between objects. For example, the airline route network is a graph in which the cities are the vertices and the flight routes are the edges. Graphs are also used to represent networks. The Internet can be modeled as a graph in which the computers are the vertices and the links between computers are the edges. Graphs are also used in social networks like linkedIn, Facebook. In fact, graphs are used to represent many real-world applications: computer networks, circuit design, and aeronautical scheduling to name just a few. -### Popular Graph Search Algorithms -* [Breadth First Search](Graph%20Search/BreadthFirstSearch/readme.md) -* [Depth First Search](Graph%20Search/DepthFirstSearch/readme.md) -* [Dijkstra's Algorithm](Graph%20Search/Dijkstra'sAlgorithm/readme.md) -* [A* Search](Graph%20Search/A*Search/readme.md) ## [Dynamic Programming](Dynamic%20Programming/README.md) -Dynamic programming is both a mathematical optimization method and a computer programming method. The method was developed by Richard Bellman in the 1950s and has found applications in numerous fields, from aerospace engineering to economics. In both contexts it refers to simplifying a complicated problem by breaking it down into simpler sub-problems in a recursive manner. While some decision problems cannot be taken apart this way, decisions that span several points in time do often break apart recursively. Likewise, in computer science, if a problem can be solved optimally by breaking it into sub-problems and then recursively finding the optimal solutions to the sub-problems, then it is said to have optimal substructure. Dynamic programming is one way to solve problems with these properties. The process of breaking a complicated problem down into simpler sub-problems is called "divide and conquer". -### Popular Dynamic Programming Algorithms +Dynamic programming is both a mathematical optimization method and a computer programming method. It was developed by Richard Bellman in the 1950s and has found applications in numerous fields, from aerospace engineering to economics. In both contexts it refers to simplifying a complicated problem by breaking it down into simpler sub-problems in a recursive manner. While some decision problems cannot be taken apart this way, decisions that span several points in time do often break apart recursively. Likewise, in computer science, if a problem can be solved optimally by breaking it into sub-problems and then recursively finding the optimal solutions to the sub-problems, then it is said to have optimal substructure. Dynamic programming is one way to solve problems with these properties. The process of breaking a complicated problem down into simpler sub-problems is called "divide and conquer". 
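The defining feature in practice is that the sub-problems overlap, so each one is solved once and its answer is reused. As a minimal, illustrative sketch (using the Fibonacci sequence from the list below; the helper names are assumptions, not this repository's code), the same recurrence can be memoized top-down or tabulated bottom-up:

```python
# Minimal dynamic-programming sketch: nth Fibonacci number (illustrative only).
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n: int) -> int:
    """Top-down DP: each sub-problem fib(k) is computed once and cached."""
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

def fib_bottom_up(n: int) -> int:
    """Bottom-up DP: build the answers to smaller sub-problems iteratively."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fib(40), fib_bottom_up(40))  # 102334155 102334155
```

Without the cache, the top-down version would recompute the same sub-problems exponentially many times; memoization or tabulation is what turns the recurrence into a dynamic programming solution.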
+ +### Dynamic Programming Algorithms * [Fibonacci Sequence](Dynamic%20Programming/Fibonacci%20Sequence/README.md) +* [Nth Fibonacci](Dynamic%20Programming/Nth%20Fibonnaci/README.md) +* [Nth Catalan Number/Sequence](Dynamic%20Programming/Nth%20Catalan%20Number/README.md) * [Longest Common Subsequence](Dynamic%20Programming/Longest%20Common%20Subsequence/README.md) * [Longest Increasing Subsequence](Dynamic%20Programming/Longest%20Increasing%20Subsequence/README.md) +* [Longest Common Substring](Dynamic%20Programming/Longest%20Common%20Substring/readme.md) +* [Longest Palindromic Substring](Dynamic%20Programming/Longest%20Palindromic%20Substring/README.md) * [Knapsack Problem](Dynamic%20Programming/Knapsack%20Problem/README.md) -* [Edit Distance](Not-Added) -* [Coin Change](Dynamic%20Programming/Coin%20Change/README.md) +* [Edit Distance](Dynamic%20Programming/Edit%20Distance/README.md) +* Coin Change * [Matrix Chain Multiplication](Dynamic%20Programming/Matrix%20Chain%20Multiplication/README.md) +* [Balanced Tree Count](Dynamic%20Programming/Count%20Balanced%20Binary%20Trees%20of%20Height%20h/readme.md) +* [Counting Hops](Dynamic%20Programming/Count%20Number%20Of%20Hops/README.md) +* [Floyd Warshall Algorithm](Dynamic%20Programming/Floyd%20Warshall%20Algorithm/readme.md) +* [Gold Mine Problem](Dynamic%20Programming/Gold%20Mine%20Problem/README.md) +* [Least Common Multiple (LCM)](Dynamic%20Programming/LCM/LCM.md) +* [Painting Fence Algorithm](Dynamic%20Programming/Painting%20Fence%20Algorithm/readme.md) +* [Staircase](Dynamic%20Programming/Staircase/README.md) +* [Subset Sum Problem](Dynamic%20Programming/Subset%20Sum%20Problem/readme.md) -## [Greedy Algorithms](Greedy%20Algorithm) -Greedy algorithms are a simple, intuitive class of algorithms that can be used to find the optimal solution to some optimization problems. They are called greedy because at each step they make the choice that seems best at that moment. This means that greedy algorithms do not guarantee to return the globally optimal solution, but instead make locally optimal choices in the hope of finding a global optimum. Greedy algorithms are used for optimization problems. An optimization problem can be solved using Greedy if the problem has the following property: at every step, we can make a choice that looks best at the moment, and we get the optimal solution of the complete problem. -### Popular Greedy Algorithms -* [Activity Selection](notadded) -* [Huffman Coding](Greedy%20Algorithm/Huffman%20Coding%20Algorithm/readme.md) -* [Prim's Algorithm](Greedy%20Algorithm/Prim%27s%20Algorithm/readme.md) -* [Kruskal's Algorithm](Greedy%20Algorithm/Krushkal%27s%20Algorithm/readme.md) -* [Dijkstra's Algorithm](notadded) -* [Job Sequencing](notadded) -* [Fractional Knapsack](notadded) -* [Huffman Decoding](notadded) -* [Ford-Fulkerson Algorithm](Greedy%20Algorithm/Ford-Fulkerson/readme.md) +## [Divide and Conquer](Divide%20and%20Conquer/readme.md) +Divide and conquer is an algorithmic paradigm in which the problem is recursively solved using the Divide, Conquer, and Combine strategy. +A problem is broken down into two or more similar sub-problems until they can be easily solved. Those solutions are then combined to solve the larger sub-problems until the original problem is solved. Divide and Conquer algorithms differ from Dynamic Programming algorithms in that Divide and Conquer algorithms do not have overlapping sub-problems. 
In other words, both paradigms break a problem into smaller sub-problems, but Dynamic Programming is used when those sub-problems overlap and their solutions can be reused, while Divide and Conquer is used when they do not.
-## Backtracking
-Backtracking is an algorithmic-technique for solving problems recursively by trying to build a solution incrementally, one piece at a time, removing those solutions that fail to satisfy the constraints of the problem at any point of time (by time, here, is referred to the time elapsed till reaching any level of the search tree).
-### Popular Backtracking Algorithms
-* [N-Queens](Backtracking/N-Queens/readme.md)
-* [Sudoku](Backtracking/Sudoku/readme.md)
-* [M Coloring](Backtracking/M%20Colouring%20Problem/readme.md)
-* [Hamiltonian Cycle](Backtracking/HamiltonianCycle/readme.md)
-* [Word Break](Backtracking/WordBreak/readme.md)
-* [Rat in a Maze](Backtracking/RatinMaze/readme.md)
-* [N Queen Problem](Backtracking/NQueenProblem/readme.md)
-* [Sum of Subset](Backtracking/Subset%20Sum/README.md)
-* [Solve Sudoku](Backtracking/SolveSudoku/readme.md)
-* [Knight's Tour](Backtracking/The%20Knight’s%20tour%20problem/README.md)
-
-## Branch and Bound
-Branch and bound is a general technique for solving combinatorial optimization problems. It is a systematic enumeration technique that reduces the number of candidate solutions by using the problem's structure to eliminate candidate solutions that cannot possibly be optimal. It is a divide and conquer algorithm that is used to solve optimization problems. It is a systematic enumeration technique that reduces the number of candidate solutions by using the problem's structure to eliminate candidate solutions that cannot possibly be optimal. It is a divide and conquer algorithm that is used to solve optimization problems. It is a systematic enumeration technique that reduces the number of candidate solutions by using the problem's structure to eliminate candidate solutions that cannot possibly be optimal. It is a divide and conquer algorithm that is used to solve optimization problems.
-### Popular Branch and Bound Algorithms
-* [0-1 Knapsack](BranchandBound/0-1Knapsack/readme.md)
-* [Travelling Salesman Problem](BranchandBound/TravellingSalesmanProblem/readme.md)
+### Popular Divide and Conquer Algorithms
+* [Convex Hull Problem](Divide%20and%20Conquer/Convex%20Hull%20PRoblem/Readme.md)
+* [The Inversion Problem](Divide%20and%20Conquer/Inversion%20Problem/readme.md)
+* [Maximum and minimum of an array](Divide%20and%20Conquer/Maximum%20and%20minium%20of%20an%20array/readme.md)
+* [Strassen's Matrix Multiplication](Divide%20and%20Conquer/Strassen's%20Algorithm/readme.md)
-## [Searching](Searching/README.md)
-Searching is algorithm for finding a certain target element inside a container. Searching Algorithms are designed to check for an element or retrieve an element from any data structure where it is stored.
-### Popular Searching Algorithms revert-67-add-Vovka1759
-* [Linear Search](Searching/LinearSearch/readme.md)
-* [Binary Search](Searching/BinarySearch/readme.md)
-* [Jump Search](Not-Added)
-* [Interpolation Search](Not-Added)
-* [Exponential Search](Not-Added)
-## [Spanning Tree](Algorithms/Spanning%20Tree%20Algorithm/readme.md)
+## [Branch and Bound](Branch%20and%20Bound/README.md)
+Branch and bound is an algorithmic technique for solving optimization problems. The problem is broken down by exploring potential solutions. Solutions that cannot lead to the optimal solution are eliminated. When the entire tree of potential solutions has been explored, the optimal solution is found.
The branching of the solutions and the eliminating of non-optimal solutions is where the name branch and bound comes from. -## [Spanning Tree](Algorithms/Spanning%20Tree%20Algorithm/readme.md) +### Popular Branch and Bound Algorithms +* [Knapsack Problem](Branch%20and%20Bound/Knapsack%20Problem/README.md) +* [Traveling Salesman Problem](Traveling%20Salesman%20Problem/readme.md) +* [Job Assignment Problem](Branch%20and%20Bound/Job%20Assignment%20Problem/README.md) -A spanning tree is a sub-graph of an undirected connected graph, which includes all the vertices of the graph with a minimum possible number of edges. If a vertex is missed, then it is not a spanning tree. -The edges may or may not have weights assigned to them. +## [Backtracking Algorithms](Backtracking/README.md) +Backtracking is defined as searching every possible combination in order to solve a computational problem. This technique solves problems by building up solution candidates and when the algorithm determines that the candidate cannot possibly be part of a valid solution it abandons the candidate or “backtracks" to try another candidate. -The total number of spanning trees with n vertices that can be created from a complete graph is equal to n(n-2). +### Backtracking Algorithms +* [Sudoku](Backtracking/Sudoku/readme.md) +* [M Coloring](Backtracking/M%20Colouring%20Problem/readme.md) +* [Sum of Subset](Backtracking/Subset%20Sum/README.md) +* [Knight's Tour](Backtracking/The%20Knight’s%20tour%20problem/README.md) +* [Word Boggle](Backtracking/Word%20Boggle/README.md) +* [Rat in a Maze](Backtracking/Rat%20in%20a%20Maze/README.md) +* [N Queen Problem](Backtracking/N%20Queen%20Problem/README.md) +* [Cryptarithmetic Puzzles](Backtracking/Cryptarithmetic%20Puzzles/README.md) +* [All Possible Paths](Backtracking/Find%20All%20Possible%20Path/README.md) +* [Generate IP Addresses](Backtracking/Generate%20IP%20Adresses/README.md) +* [Remove Invalid Parentheses](Backtracking/Remove%20Invalid%20Parentheses/README.md) +* [Longest Possible Route in a Matrix with Hurdles](Backtracking/Longest%20Possible%20Route%20in%20a%20Matrix%20with%20Hurdles/README.md) +* [Permutation of a Given String](Backtracking/Permutation%20Of%20a%20Given%20String/README.md) +* [Magnet Puzzle](Backtracking/Magnet%20Puzzle/README.md) +* [Tug of War](Backtracking/Tug%20of%20War/README.md) -If we have n = 4, the maximum number of possible spanning trees is equal to 44-2 = 16. Thus, 16 spanning trees can be formed from a complete graph with 4 vertices. -### Popular Minimum Spanning tree Algorithms -* [KruskalsAlgorithm](Algorithms/Spanning%20Tree%20Algorithm\KruskalsAlgorithm/readme.md) -* [PrimsAlgorithm](Algorithms/Spanning%20Tree%20Algorithm\PrimsAlgorithm/readme.md)]
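To make the backtracking technique described above concrete, here is a minimal sketch of the N Queen Problem from the list; the bookkeeping sets and function names are illustrative assumptions rather than code from this repository. A partial placement is extended one row at a time and abandoned ("backtracked") as soon as a column or diagonal constraint is violated:

```python
# Minimal backtracking sketch: the N Queen Problem (illustrative only; the
# helper names below are not taken from this repository's solutions).
def solve_n_queens(n: int):
    """Return one valid placement as a list of column indices, or None."""
    cols, diag1, diag2 = set(), set(), set()
    placement = []

    def place(row: int) -> bool:
        if row == n:  # every row filled: a complete solution
            return True
        for col in range(n):
            if col in cols or row - col in diag1 or row + col in diag2:
                continue  # constraint violated: prune this candidate
            cols.add(col)
            diag1.add(row - col)
            diag2.add(row + col)
            placement.append(col)
            if place(row + 1):
                return True
            # backtrack: undo the choice and try the next column
            cols.remove(col)
            diag1.remove(row - col)
            diag2.remove(row + col)
            placement.pop()
        return False

    return placement if place(0) else None

print(solve_n_queens(6))  # one valid placement, e.g. [1, 3, 5, 0, 2, 4]
```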