# 🧠 Algorithms
Core Philosophy: Algorithms are the recipes for efficient computation. A senior engineer doesn’t just memorize them but understands the underlying paradigms to choose the right tool for the specific constraints of the system.
## 🌪️ Sorting
Sorting is the foundation for many other optimizations (like binary search).
- Quick Sort: A divide-and-conquer approach. Excellent average-case performance $O(n \log n)$, though the worst case is $O(n^2)$ (typically mitigated with randomized pivot selection).
- Merge Sort: Stable and guarantees $O(n \log n)$ even in worst-case. Preferred for linked lists and when stability matters.
- Heap Sort: $O(n \log n)$ worst case and sorts in place, making it memory efficient, though it is not stable.
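To make the divide-and-conquer idea concrete, here is a minimal (non-in-place) Quick Sort sketch; the function name and the list-comprehension partitioning are illustrative, not a production implementation:

```python
import random

def quick_sort(items):
    """Return a sorted copy of items using divide-and-conquer.

    A random pivot keeps the expected running time at O(n log n)
    even on already-sorted inputs.
    """
    if len(items) <= 1:
        return items[:]
    pivot = random.choice(items)
    less = [x for x in items if x < pivot]       # strictly smaller than pivot
    equal = [x for x in items if x == pivot]     # pivot duplicates
    greater = [x for x in items if x > pivot]    # strictly larger than pivot
    return quick_sort(less) + equal + quick_sort(greater)
```

This copy-based variant trades the in-place memory behavior of the textbook version for readability.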
## 🔍 Searching
Finding data efficiently is one of the most common tasks in software engineering.
- Binary Search: The gold standard for sorted data ($O(\log n)$).
- Breadth-First Search (BFS): Explores neighbors first. Ideal for finding the shortest path in unweighted graphs.
- Depth-First Search (DFS): Explores branches deeply. Ideal for exhaustive searches and topological sorting.
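The two bullets above can be sketched in a few lines; the function names and the adjacency-dict graph representation are illustrative choices:

```python
from collections import deque

def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if absent. O(log n)."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1   # target is in the upper half
        else:
            hi = mid - 1   # target is in the lower half
    return -1

def bfs_shortest_path(graph, start, goal):
    """Fewest-edges distance in an unweighted graph (adjacency dict), or -1."""
    queue = deque([(start, 0)])
    visited = {start}
    while queue:
        node, dist = queue.popleft()
        if node == goal:
            return dist
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append((neighbor, dist + 1))
    return -1
```

Because BFS visits nodes in order of increasing distance, the first time it reaches the goal is guaranteed to be via a shortest path.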
## 💰 Greedy Algorithms
Greedy algorithms make the locally optimal choice at each step with the hope of finding a global optimum.
- Concept: They are fast and simple but don’t always yield the best solution.
- Use Cases:
  - Dijkstra’s Algorithm: Finding the shortest path in a weighted graph with non-negative edge weights.
  - Huffman Coding: Data compression.
  - Fractional Knapsack: When items can be broken into smaller pieces.
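Fractional Knapsack is the cleanest demonstration of a greedy choice that is actually optimal: always take the item with the best value-to-weight ratio first. A minimal sketch (function name and the `(value, weight)` tuple convention are assumptions):

```python
def fractional_knapsack(items, capacity):
    """Greedy fractional knapsack.

    items: list of (value, weight) pairs; items may be split.
    Returns the maximum total value that fits in capacity.
    Unlike the 0/1 variant, the greedy choice is provably optimal here.
    """
    total = 0.0
    # Best value-per-unit-weight first.
    for value, weight in sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)          # take the whole item, or a fraction
        total += value * (take / weight)
        capacity -= take
    return total
```

The same greedy ordering fails for 0/1 Knapsack, which is why that problem appears under Dynamic Programming below.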
## 🧩 Dynamic Programming (DP)
DP solves complex problems by breaking them down into simpler, overlapping subproblems. It is the “senior” way to handle recursion: cache or tabulate results to avoid redundant calculations.
### Key Strategies
- Memoization (Top-Down): Cache the results of expensive function calls and return the cached result when the same inputs occur again.
- Tabulation (Bottom-Up): Solve the smallest subproblems first and store their results in a table (usually an array or matrix).
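Both strategies, side by side, on the classic Fibonacci example (function names are illustrative):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    """Top-down: plain recursion plus a cache of already-computed results."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

def fib_tab(n):
    """Bottom-up: fill a table from the smallest subproblem upward."""
    if n < 2:
        return n
    table = [0, 1]
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]
```

Either version turns the naive $O(2^n)$ recursion into $O(n)$; tabulation additionally avoids recursion depth limits and makes the memory layout explicit.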
### Classic Problems
- 0/1 Knapsack: Making optimal choices with limited capacity.
- Longest Common Subsequence: Used in `diff` tools and bioinformatics.
- Fibonacci/Path Finding: The entry point for understanding state transitions.
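Longest Common Subsequence via tabulation, as a minimal sketch (function name and variable names are illustrative):

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of strings a and b.

    dp[i][j] holds the LCS length of a[:i] and b[:j];
    O(len(a) * len(b)) time and space.
    """
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                # Characters match: extend the LCS of both prefixes.
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                # Otherwise, drop a character from one string or the other.
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[len(a)][len(b)]
```

The `dp[i][j]` recurrence is the "state transition" the Fibonacci bullet refers to, just in two dimensions.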
## ⚖️ Trade-offs & Decisions (The Senior Perspective)
| Strategy | When to Use | Trade-off |
|---|---|---|
| Greedy | When a “good enough” or local optimum is sufficient. | Might miss the global optimal solution. |
| Dynamic Programming | When subproblems overlap (e.g., calculating the same thing twice). | Higher memory usage (Space Complexity). |
| Recursion vs. Iteration | When the problem is naturally recursive (Trees). | Risk of stack overflow in languages without Tail Call Optimization. |
💡 Seniority Note: Before implementing a complex DP solution, ask: “Is the dataset small enough that a simpler $O(n^2)$ approach is more maintainable?” Often, code readability and maintenance cost are more important than shaving off a few milliseconds if $n$ is consistently small.
## 🔗 Related Links
- [[Complexidade-Big-O]]
- [[Estruturas-de-Dados]]
- [[Design-Patterns]]