old notes

algorithms

#devstudy #algorithms #datastructures #problemsolving

Read Before Starting

STOP SKIPPING STEPS EVEN ON EASY QUESTIONS!!!!!!!!!! YOU WILL REGRET IT EVERY TIME!!!!!!!!!!!!!

Most Common Mistakes

EFFY PLEASE VISUALIZE/WRITE MORE TEST CASES

  • misinterpreting the problem bc going too fast
  • misunderstanding the problem bc going too fast
  • implementation issues with scoping

If Stuck

  1. Can you reverse the interpretation?
  2. Can you start the approach with another input and/or direction?
  3. Can you use a set or a map to store something you're manually reaching for at a later point?
  4. Can you break down the problem more?
  5. Did you make a table of cases to fully visualize the pattern or did you trust your assumptions? Don't trust your assumptions.

Interpretation Patterns

If input array is sorted then – Binary search – Two pointers
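A quick sketch of the two-pointer pattern on a sorted array – finding a pair that sums to a target (`pair_with_sum` is a made-up name, not from any library):

```ruby
# Two pointers on a sorted array: walk inward from both ends.
# Sortedness tells us which pointer to move, so one pass suffices: O(n).
def pair_with_sum(sorted, target)
  lo, hi = 0, sorted.length - 1
  while lo < hi
    sum = sorted[lo] + sorted[hi]
    return [sorted[lo], sorted[hi]] if sum == target
    sum < target ? lo += 1 : hi -= 1  # too small: move left up; too big: move right down
  end
  nil
end

pair_with_sum([1, 3, 4, 6, 9], 10)  # => [1, 9]
```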

If asked for all permutations/subsets then – Backtracking

If given a tree then – DFS – BFS

If given a graph then – DFS – BFS

If given a linked list then – Two pointers
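One common flavor of two pointers on a linked list is fast/slow – e.g. finding the middle node (a sketch; `Node` and `middle` are illustrative names):

```ruby
# Fast/slow pointers: slow moves 1 node, fast moves 2,
# so when fast runs off the end, slow is at the middle. O(n) time, O(1) space.
Node = Struct.new(:val, :next_node)

def middle(head)
  slow = fast = head
  while fast && fast.next_node
    slow = slow.next_node
    fast = fast.next_node.next_node
  end
  slow
end

# Build 1 -> 2 -> 3 -> 4 -> 5 by prepending
head = [5, 4, 3, 2, 1].reduce(nil) { |rest, v| Node.new(v, rest) }
middle(head).val  # => 3
```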

If recursion is banned then – Stack

If must solve in-place then – Swap corresponding values – Store one or more different values in the same pointer
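The swap-corresponding-values idea in miniature – reversing an array in place (illustrative function name):

```ruby
# In-place reverse by swapping mirrored indices: O(1) extra space,
# no second array allocated.
def reverse_in_place!(arr)
  i, j = 0, arr.length - 1
  while i < j
    arr[i], arr[j] = arr[j], arr[i]
    i += 1
    j -= 1
  end
  arr
end

reverse_in_place!([1, 2, 3, 4])  # => [4, 3, 2, 1]
```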

If asked for maximum/minimum subarray/subset/options then – Dynamic programming
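One DP instance of the maximum-subarray case is Kadane's algorithm (a sketch; assumes a non-empty array, and `max_subarray` is a made-up name):

```ruby
# Kadane's algorithm: max-sum contiguous subarray in one pass, O(n).
# best_ending = best sum of a subarray that ends at the current element.
def max_subarray(nums)
  best = best_ending = nums.first        # assumes nums is non-empty
  nums.drop(1).each do |n|
    best_ending = [n, best_ending + n].max  # extend the run, or restart at n
    best = [best, best_ending].max
  end
  best
end

max_subarray([-2, 1, -3, 4, -1, 2, 1, -5, 4])  # => 6  (subarray [4, -1, 2, 1])
```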

If asked for top/least K items then – Heap
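Ruby's standard library has no priority queue, so here is a minimal min-heap sketch for top-k (`MinHeap` and `top_k` are illustrative names): keep a heap of size k, and the root is always the smallest of the k largest seen so far.

```ruby
# Minimal binary min-heap (array-backed).
class MinHeap
  def initialize
    @a = []
  end

  def size
    @a.size
  end

  def push(x)
    @a << x
    i = @a.size - 1
    while i > 0 && @a[(i - 1) / 2] > @a[i]   # sift up while smaller than parent
      p = (i - 1) / 2
      @a[p], @a[i] = @a[i], @a[p]
      i = p
    end
  end

  def pop
    top = @a[0]
    last = @a.pop
    unless @a.empty?
      @a[0] = last
      i = 0
      loop do                                # sift down to restore heap order
        l, r = 2 * i + 1, 2 * i + 2
        s = i
        s = l if l < @a.size && @a[l] < @a[s]
        s = r if r < @a.size && @a[r] < @a[s]
        break if s == i
        @a[s], @a[i] = @a[i], @a[s]
        i = s
      end
    end
    top
  end
end

# Top k items: anything bigger than the heap's root evicts it.
# O(n log k) instead of sorting everything at O(n log n).
def top_k(nums, k)
  heap = MinHeap.new
  nums.each do |n|
    heap.push(n)
    heap.pop if heap.size > k
  end
  out = []
  out << heap.pop while heap.size > 0
  out
end

top_k([5, 1, 9, 3, 7], 2)  # => [7, 9]
```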

If asked for common strings then – Map – Tree

Else – Map/Set for O(1) time & O(n) space – Sort input for O(n log n) time and O(1) space
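The Map/Set trade (O(n) space for O(1) lookups) in its simplest form – spotting the first repeated item without a nested loop (`first_duplicate` is a made-up name):

```ruby
require "set"

# Instead of comparing every pair (O(n^2)), remember what we've seen
# in a Set: each include? check is O(1), so the whole scan is O(n).
def first_duplicate(items)
  seen = Set.new
  items.each do |item|
    return item if seen.include?(item)
    seen << item
  end
  nil
end

first_duplicate([3, 1, 4, 1, 5])  # => 1
```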

#datastructures #algorithms #devstudy #BigONotation #timecomplexity #spacecomplexity

jarednielsen-big-o.png

Time Complexity: Time taken to run as a function of the length of the input

  • number of operations it takes to complete for an input of length n
  • find the WORST CASE scenario, then drop the non-dominant terms

Not the most useful chart but that pastel-tone charcoal = yes

timecomplexity.png

Better chart but less aesthetic:

big-o-cheatsheet.png

From Fastest to Slowest:

O(1) time: One Operation

  • takes a constant amount of time even with more input
  • ex: the closed-form sum s = n(n+1)/2 is O(1), as opposed to the iterative calculation, which is O(n) linear bc for every additional input value the running time grows proportionally
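The two sums side by side (illustrative function names):

```ruby
# Sum 1..n two ways: the closed form does a fixed amount of work
# no matter how big n is (O(1)); the loop does n additions (O(n)).
def sum_constant(n)
  n * (n + 1) / 2          # one multiply, one add, one divide
end

def sum_linear(n)
  (1..n).reduce(0, :+)     # work grows with n
end

sum_constant(100)  # => 5050
```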

Examples

  • accessing Array Index
  • Inserting a node in Linked List
  • Pushing and Popping on Stack
  • Insertion and Removal from Queue
  • Finding out the parent or left/right child of a node in a tree stored in Array
  • Jumping to Next/Previous element in Doubly Linked List

O(log n) time: Halve (or otherwise shrink) the problem each step – often via recursion

Examples

  • Binary Search
  • Finding largest/smallest number in a balanced binary search tree
  • Certain Divide and Conquer Algorithms based on Linear functionality
  • Calculating Fibonacci Numbers – Best Method premise = NOT using the complete data, and reducing the problem size with every iteration
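Binary search is the canonical one – each comparison throws away half of what's left, so ~log2(n) steps total (a sketch; `binary_search` returns the index, or nil):

```ruby
# Binary search on a sorted array: the window [lo, hi] halves each
# iteration, so at most ~log2(n) iterations run.
def binary_search(sorted, target)
  lo, hi = 0, sorted.length - 1
  while lo <= hi
    mid = (lo + hi) / 2
    return mid if sorted[mid] == target
    sorted[mid] < target ? lo = mid + 1 : hi = mid - 1
  end
  nil
end

binary_search([2, 5, 8, 12, 16, 23], 16)  # => 4
```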

O(n) time: Linear, i.e. Brute Force, Noob ones that I write bc my brain is stuck on iteration

Examples

  • Traversing an array
  • Traversing a linked list
  • Linear Search
  • Deletion of a specific element in a Linked List (Not sorted)
  • Comparing two strings
  • Checking for Palindrome
  • Counting/Bucket Sort – and you can find a million more such examples....

O(n*log n) time: linear time complexity multiplied by log n

  • the factor of log n is introduced by divide and conquer: the problem is split O(log n) times, with O(n) work done at each level
  • some of the most optimized and frequently used

Examples

  • Merge Sort (recursive)
  • Heap Sort
  • Quick Sort (average case – worst case is O(n^2))
  • Certain Divide and Conquer Algorithms based on optimizing O(n^2) algorithms

O(n^2) time: Quadratic.

  • nested loops: each loop performs n iterations, so n*n
  • generally the less efficient option when an O(n*logn) counterpart exists
  • brute force approaches often land here

Examples

  • Bubble Sort :'(
  • Insertion Sort
  • Selection Sort
  • Traversing a simple 2D array

Space Complexity: The space an algo needs to run

  • sorting algorithms need at least O(n) space to hold the list of length n they have to sort, but many can work in place, meaning they need only O(1) additional space
  • a number n can be saved either (1) in binary (or any other base ≥ 2), which needs O(log n) space bc n = 2^(log n); or (2) in unary (as a sum of n ones), which needs O(n) space bc you save one mark per unit
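The two representations, in plain Ruby:

```ruby
# Writing n down: positional notation needs ~log2(n) digits (O(log n)),
# unary needs one mark per unit (O(n)).
n = 1024
binary_digits = n.to_s(2).length  # 11 digits, since 1024 = 2**10
unary_marks   = "1" * n           # 1024 characters: O(n) space
```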

O(1) space: In-place

  • a function that counts the elements of an array: no new data needs to be allocated or copied even if the array is large; the same counter variable is reused

O(log n) space:

Examples

  • Binary Search (the recursive version – the call stack is O(log n) deep)
  • Finding largest/smallest number in a binary search tree
  • Certain Divide and Conquer Algorithms based on Linear functionality
  • Calculating Fibonacci Numbers – Best Method premise = NOT using the complete data, and reducing the problem size with every iteration

O(n) space:

  • every time we make a new function call = new stack frame
  • recursion: if the recursion goes n levels deep (e.g. n = 100), the call stack uses O(n) space
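A concrete case – a recursive sum builds one stack frame per level, n levels deep, so O(n) stack space (illustrative function name):

```ruby
# Each call waits on recursive_sum(n - 1) before it can add,
# so n frames pile up on the stack before the base case unwinds them.
def recursive_sum(n)
  return 0 if n.zero?          # base case: no deeper frames
  n + recursive_sum(n - 1)
end

recursive_sum(100)  # => 5050
```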

#datastructures #recursion #devstudy #algorithms

Characteristics of a Recursive Problem

  • Can the problem be broken down into a smaller version of itself?
  • Can I re-use the solution to solve the problem?
  • Do I have an abstracted solution that already exists??

THE PROCESS: DO NOT SKIP STEPS – you will regret it

1. What is the simplest input? This is your base case. There might be more than one.

2. Visualize the Problem Without Words. Find the smallest subproblem and focus ONLY on the sub-problem. – do not think about the implementation beyond the subproblem

3. What is the relationship between the Second-To-Last and the Last Case? If I am given the answer for the Second-To-Last case, can I find the Last Case?

4. Generalize the pattern: This is your recursive function.

5. Trust the abstraction. If the logic is sound, solving the sub problem will solve the problem.
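The five steps above, applied to a small problem – reversing a string (`reverse_string` is just an example name):

```ruby
# Step 1: simplest inputs are "" and one-char strings – they reverse to themselves.
# Step 3: if I'm handed the reversed rest of the string, the answer is
#         just that plus the first character.
# Step 5: trust that the recursive call returns the reversed rest.
def reverse_string(s)
  return s if s.length <= 1        # base case
  reverse_string(s[1..]) + s[0]    # generalized pattern
end

reverse_string("stack")  # => "kcats"
```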


MISTAKES I KEEP MAKING

  1. Breaking down the problem using syntax shortcuts to express my thoughts, and letting this draw me into assumption patterns without consciously realizing the pull.

What to do: Visualize the problem WITHOUT WORDS. Just do it.

  2. Thinking too much about the implementation of the rest of the problem instead of trusting that the recursion will work if the logic for the subproblem is sound.

What to do: Set a timer and spend 80% of the time on thinking.

  3. Not spending enough time visualizing different examples.

What to do: See 1.

  4. Not trusting that my logic IS sound, and treating all errors as evidence of a fundamental misunderstanding of the problem rather than realizing that sometimes it IS just an implementation error.

What to do: Build more confidence by listening to positive feedback without immediately dismissing it. Spend more time coding so I know the idiosyncrasies of the language better.

  5. Implementation errors: going too fast to notice incorrect assumptions about what I am asking CREATOR to do.

What to do: More puts, debug line by line. Go slower....