A commonly used [divide and conquer](#divide-and-conquer) sorting algorithm, and probably one of the best examples of divide and conquer there is. The fundamental procedure of merge sort is to split the input apart (dividing), solve each of those parts individually (conquering), and then build the solution back together. This is a key difference between divide and conquer and [decrease and conquer](#decrease-and-conquer): the input is *divided* into sub-problems that are **all** solved, instead of the problem being *decreased* by discarding parts that don't need to be solved.
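A minimal sketch of that split/solve/combine structure, purely as an illustration (the function names are my own, not something defined elsewhere in these notes):

```python
def merge_sort(items):
    """Sort a list by dividing it in half, sorting each half, and merging the results."""
    if len(items) <= 1:               # a list of 0 or 1 elements is already sorted
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # divide: solve the left half on its own
    right = merge_sort(items[mid:])   # divide: solve the right half on its own
    return merge(left, right)         # combine: build the solution back together


def merge(left, right):
    """Merge two already-sorted lists into one sorted list."""
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # one of these is empty; the other holds the leftovers
    merged.extend(right[j:])
    return merged


print(merge_sort([38, 27, 43, 3, 9, 82, 10]))   # [3, 9, 10, 27, 38, 43, 82]
```

Every element is split down to a single-item list and every sub-problem is merged back in, which is the "**all** sub-problems are solved" behaviour that separates this from decrease and conquer.
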
- Use [Divide and Conquer](#divide-and-conquer) when you **don't have overlapping sub problems**
- Use [Decrease and Conquer](#decrease-and-conquer) when you can **discard/ignore parts of the input** entirely
- Use [Dynamic Programming](#dynamic-programming) when you **do have overlapping sub problems**
- Use [Backtracking](#backtracking) when you need to backtrack 😎
#### Brute Force
Checking every possible combination to see if it is valid as you go along.
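A tiny sketch of that idea, using a made-up subset-sum style check as the example (the problem and function name are my own illustration, not from these notes):

```python
from itertools import combinations


def has_subset_with_sum(numbers, target):
    """Brute force: generate every possible combination and check each one for validity."""
    for size in range(len(numbers) + 1):
        for combo in combinations(numbers, size):   # every combination of this size
            if sum(combo) == target:                # the validity check
                return True
    return False


print(has_subset_with_sum([3, 9, 8, 4], 12))   # True  (8 + 4)
print(has_subset_with_sum([3, 9, 8, 4], 2))    # False
```

Checking every combination is exponential in the input size, which is why brute force is usually just a baseline to improve on.
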
#### Decrease and Conquer
!!! summary

    Remember decrease and conquer as "**decreasing** the problem by discarding things I don't need, and conquering the original problem with what I have left"

**Decrease** and conquer decreases, and does not divide. Hence the time complexity recurrence of:

$$T(n) = T(n-b)$$ not $$T(n) = T\left(\frac{n}{b}\right)$$

See the [Master theorem](#master-theorem).
Decrease and Conquer algorithms make the problem smaller by reducing the problem at each iteration. They can reduce the problem by a:

- Constant amount
- Variable amount
Efficient when the problem can be systematically decreased to a solution.
**Steps for Decrease & Conquer:**

- Reduce original problem to a smaller instance of the same problem
- Solve the smaller instance
- Extend the solution of the smaller instance to get the solution to the overall problem
**Examples**:

- [BFS Breadth First Search](graphs.md#bfs-breadth-first-search)
- [DFS Depth First Search](graphs.md#dfs-depth-first-search)
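To make the three steps concrete, here is a decrease-by-one sketch. Insertion sort is my choice of illustration here; it is not one of the examples listed in these notes:

```python
def insertion_sort(items):
    """Decrease and conquer: the problem shrinks by one element at each step."""
    if len(items) <= 1:                    # base case: nothing left to decrease
        return list(items)
    smaller = insertion_sort(items[:-1])   # 1. reduce to a smaller instance  2. solve it
    last = items[-1]
    # 3. extend: insert the leftover element into the already-sorted smaller solution
    pos = len(smaller)
    while pos > 0 and smaller[pos - 1] > last:
        pos -= 1
    smaller.insert(pos, last)
    return smaller


print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]
```

Note that each call only ever solves one smaller instance, rather than dividing the input and solving every piece.
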
#### Divide and Conquer

When the input size is divided into smaller parts and each smaller part is solved in isolation to build up a final solution.

!!! note

    A good explanation of how divide and conquer is different to [decrease and conquer](#decrease-and-conquer) is available in the [merge sort](#merge-sort) section

**Divide** and conquer divides, and does not statically decrease. Hence the time complexity recurrence of:

$$T(n) = T\left(\frac{n}{b}\right)$$ not $$T(n) = T(n-b)$$

See the [Master theorem](#master-theorem).
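A quick worked contrast of the two recurrence shapes, using merge sort and the insertion sort sketch above (the worked recurrences are standard results, not taken from these notes). Merge sort divides the input into two halves and does linear work to merge them, so

$$T(n) = 2T\left(\frac{n}{2}\right) + O(n) \implies T(n) = O(n \log n)$$

by the [Master theorem](#master-theorem), while a decrease-by-one sort peels off a single element each step, giving

$$T(n) = T(n-1) + O(n) \implies T(n) = O(n^2)$$
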
#### Dynamic Programming