Top 25 DSA Problems Every Placement-Season Student Should Solve (With Patterns, Not Just Links)


By FinishDSA
Most "top DSA problems" lists are useless.
Not because the problems are wrong. Because they hand you a link and nothing else. "Here are 50 must-do LeetCode problems." Great. You go to problem 1, read it, get stuck, watch a solution video, understand the solution in the context of the video, move on to problem 2, and repeat.
Six weeks later you've done all 50 and learned almost nothing transferable because nobody told you why these problems matter. Nobody told you what pattern each problem represents. Nobody told you what to look for in the next problem that will tell you "this is the same shape."
This list is different.
For each problem, you'll get: what it is, what pattern it represents, the key insight that unlocks it, and a signal. The signal is what you should look for in other problems to know this is the same pattern.
Learn the pattern. Then use it on everything.
Before the list: one thing that will make this 3x more valuable
Don't look at the solution until you've spent at least 30 minutes genuinely trying.
Not 5 minutes. Not 10. Thirty.
The first 20 minutes of struggling with a problem you can't solve are not wasted time. They are the exact time your brain is forming the neural connections that will allow you to recognize this pattern the next time.
When you skip the struggle, you skip the learning. The solution video that follows is not a replacement for the struggle. It's just information without context.
Struggle first. Always.
The 25 problems
1. Two Sum
Pattern: Hashmap for complement lookup
You have an array and a target. Find two numbers that add up to the target.
The naive solution loops through every pair. O(n^2). Fine for small arrays, too slow for large inputs.
The insight: for every number x you see, you need target-x. If you could look up whether target-x exists in O(1) time, you'd solve this in one pass.
Hashmap gives you O(1) lookup. Store each number as you see it. For each new number, check if its complement already exists.
Signal for future problems: any time you're looking for a "complement" or "pair that satisfies a condition," reach for a hashmap first.
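The one-pass hashmap idea, as a minimal Python sketch (function name is mine):

```python
def two_sum(nums, target):
    """Return indices of two numbers that add up to target."""
    seen = {}  # value -> index, for values processed so far
    for i, x in enumerate(nums):
        complement = target - x
        if complement in seen:  # O(1) lookup instead of an inner loop
            return [seen[complement], i]
        seen[x] = i
    return []  # no pair found
```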
2. Best Time to Buy and Sell Stock
Pattern: Single pass with running minimum
One array of prices. Find the maximum profit from one buy-sell transaction.
The insight: the maximum profit on any day is that day's price minus the lowest price you've seen so far. Track the minimum as you walk through the array. Update the max profit at each step.
You're doing two things simultaneously in one loop: updating your best buying opportunity so far, and checking what today would give you if you sold.
Signal: when you need to track a "best so far" as you move through an array, this single-pass running tracking is almost always the answer.
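Both updates in one loop, sketched in Python (names are mine):

```python
def max_profit(prices):
    """Max profit from a single buy-sell transaction."""
    min_so_far = float('inf')  # best buying price seen so far
    best = 0
    for p in prices:
        min_so_far = min(min_so_far, p)   # update best buying opportunity
        best = max(best, p - min_so_far)  # profit if we sold today
    return best
```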
3. Contains Duplicate
Pattern: Hashset for O(1) existence check
Given an array, return true if any value appears more than once.
This is the simplest possible hashset problem and it exists on this list not because it's hard, but because the pattern is everywhere.
Whenever you need to know "have I seen this before?" in O(1) time, the answer is a hashset.
Signal: anywhere "have I seen this" or "does this exist in what I've processed" appears as the core question.
4. Maximum Subarray (Kadane's Algorithm)
Pattern: Dynamic programming with O(1) space
Find the contiguous subarray with the maximum sum.
The insight: at each position, you have exactly two choices. You either extend the existing subarray (add the current element to your running sum), or you start fresh from the current element (because the previous running sum was negative and would only hurt you).
current_max = max(nums[i], current_max + nums[i])
That one line is Kadane's algorithm. The logic: if your accumulated sum so far is negative, you're better off starting over.
Signal: any problem where you're making a greedy choice at each step about whether to extend or reset a running calculation.
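That one line expands into a full pass like this (a sketch; names are mine):

```python
def max_subarray(nums):
    """Kadane's algorithm: maximum sum of a contiguous subarray."""
    current = best = nums[0]
    for x in nums[1:]:
        current = max(x, current + x)  # extend, or start fresh here
        best = max(best, current)
    return best
```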
5. Product of Array Except Self
Pattern: Prefix and suffix products
Return an array where each element is the product of all other elements in the original array. The constraint: do it without division, in O(n).
The insight: for any position i, the answer is (product of everything to the left of i) times (product of everything to the right of i).
Build a prefix product array. Then build a suffix product array. Multiply them together.
Signal: any time you need "some aggregate of everything except this element" for each element. Usually prefix/suffix sums or products.
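A sketch of the two passes, folding the suffix products directly into the output array so no second array is needed (function name is mine):

```python
def product_except_self(nums):
    n = len(nums)
    result = [1] * n
    prefix = 1
    for i in range(n):          # result[i] = product of everything left of i
        result[i] = prefix
        prefix *= nums[i]
    suffix = 1
    for i in range(n - 1, -1, -1):  # multiply in everything right of i
        result[i] *= suffix
        suffix *= nums[i]
    return result
```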
6. Valid Anagram
Pattern: Frequency counting with a hashmap or array
Two strings. Are they anagrams?
Count the frequency of every character in both strings. Compare.
This is almost too simple, but the frequency counting pattern is the foundation of problems 10x harder than this.
Signal: whenever you're comparing the composition of two strings or arrays rather than their specific order.
7. Group Anagrams
Pattern: Hashmap with sorted string as key
Group a list of strings into anagram clusters.
The insight: all anagrams of a word produce the same sorted string. Use sorted(word) as the key in a hashmap. Group all words under their canonical sorted key.
Signal: any time you need to group things by a common signature. The "signature" can be a sorted string, a tuple of character counts, or something else. The structure is always the same.
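The grouping structure in Python, using the sorted string as the signature (a sketch):

```python
from collections import defaultdict

def group_anagrams(words):
    groups = defaultdict(list)
    for w in words:
        key = ''.join(sorted(w))  # canonical key shared by all anagrams of w
        groups[key].append(w)
    return list(groups.values())
```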
8. Valid Parentheses
Pattern: Stack for matching pairs
Given a string of brackets, are they valid?
The insight: when you see an opening bracket, push it. When you see a closing bracket, check if the top of the stack is its matching opener. If yes, pop. If no, invalid.
At the end, if the stack is empty, the string is valid.
Signal: any time you're matching or pairing things, and the most recently seen unmatched thing is the most relevant. Stack.
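The push/pop logic as a sketch (function name is mine):

```python
def is_valid(s):
    pairs = {')': '(', ']': '[', '}': '{'}  # closer -> matching opener
    stack = []
    for ch in s:
        if ch in '([{':
            stack.append(ch)
        elif not stack or stack.pop() != pairs[ch]:
            return False  # closer with no matching opener on top
    return not stack  # valid only if every opener was matched
```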
9. Binary Search
Pattern: Eliminate half the search space each step
Classic binary search on a sorted array.
left = 0, right = len(arr) - 1, while left <= right, mid = (left + right) // 2.
If arr[mid] is target: return mid. If too small: left = mid + 1. If too large: right = mid - 1.
This template will solve not just this problem but dozens of "search on a sorted space" variants.
Signal: whenever the search space is sorted or monotonic (always increasing or always decreasing with respect to your condition), binary search.
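The template as a runnable sketch:

```python
def binary_search(arr, target):
    left, right = 0, len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            left = mid + 1   # target can only be in the right half
        else:
            right = mid - 1  # target can only be in the left half
    return -1  # not found
```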
10. Search in Rotated Sorted Array
Pattern: Modified binary search on a partially sorted array
The same binary search template, but the array has been rotated. You need to figure out which half is sorted and use that to decide where to search.
At each mid, one of the two halves is always sorted. Figure out which one. Check if the target lies in the sorted half. If yes, search there. If no, search the other half.
Signal: any time binary search breaks because the array isn't cleanly sorted. Ask "which half is sorted?" and proceed from there.
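The "which half is sorted?" decision, sketched (function name is mine):

```python
def search_rotated(nums, target):
    left, right = 0, len(nums) - 1
    while left <= right:
        mid = (left + right) // 2
        if nums[mid] == target:
            return mid
        if nums[left] <= nums[mid]:                 # left half is sorted
            if nums[left] <= target < nums[mid]:    # target inside it?
                right = mid - 1
            else:
                left = mid + 1
        else:                                       # right half is sorted
            if nums[mid] < target <= nums[right]:
                left = mid + 1
            else:
                right = mid - 1
    return -1
```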
11. 3Sum
Pattern: Sort + two pointers for triplet search
Find all unique triplets in an array that sum to zero.
The insight: sort the array. Fix one element by iterating through the array. For the remaining two elements, use two pointers (one starting just after the fixed element, one at the end) to find pairs that sum to zero minus the fixed element.
Skip duplicates carefully to avoid repeating triplets.
Signal: any k-sum problem (3Sum, 4Sum) can be reduced to 2Sum using the "fix one element and recurse" pattern.
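A sketch of "fix one element, then 2Sum with two pointers," including the duplicate skipping (function name is mine):

```python
def three_sum(nums):
    nums = sorted(nums)
    result = []
    for i in range(len(nums) - 2):
        if i > 0 and nums[i] == nums[i - 1]:
            continue  # skip duplicate fixed elements
        left, right = i + 1, len(nums) - 1
        while left < right:
            total = nums[i] + nums[left] + nums[right]
            if total < 0:
                left += 1
            elif total > 0:
                right -= 1
            else:
                result.append([nums[i], nums[left], nums[right]])
                left += 1
                while left < right and nums[left] == nums[left - 1]:
                    left += 1  # skip duplicate second elements
    return result
```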
12. Container With Most Water
Pattern: Two pointers, greedy choice based on height
Given an array of heights, find the maximum area of water a container formed by two bars can hold.
The insight: start with the widest container (pointer at both ends). The area is min(height[left], height[right]) times width. Now: the bottleneck is the shorter bar. If you move the pointer pointing to the shorter bar inward, you might find a taller bar. If you move the taller bar, you definitely won't improve it.
Always move the pointer with the smaller height.
Signal: any two-pointer problem where you're maximizing some value and need a greedy choice about which pointer to move.
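The greedy pointer movement, sketched (function name is mine):

```python
def max_area(height):
    left, right = 0, len(height) - 1
    best = 0
    while left < right:
        area = min(height[left], height[right]) * (right - left)
        best = max(best, area)
        if height[left] < height[right]:
            left += 1    # moving the shorter bar is the only way to improve
        else:
            right -= 1
    return best
```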
13. Sliding Window Maximum
Pattern: Monotonic deque for window maximum in O(n)
Given an array and a window size k, return the maximum element in each window as it slides across the array.
The naive solution is O(nk). Too slow.
The insight: use a deque that maintains elements in decreasing order. When a new element comes in, pop from the back all elements smaller than it (they can never be the maximum while this new element is in the window). When the element at the front leaves the window, pop it.
Signal: any time you need the maximum or minimum of a sliding window efficiently. Monotonic deque.
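The monotonic deque in Python, storing indices so we can tell when the front element leaves the window (a sketch; names are mine):

```python
from collections import deque

def max_sliding_window(nums, k):
    dq = deque()  # indices; their values decrease from front to back
    out = []
    for i, x in enumerate(nums):
        while dq and nums[dq[-1]] <= x:
            dq.pop()          # smaller elements can never be the max again
        dq.append(i)
        if dq[0] <= i - k:
            dq.popleft()      # front index has slid out of the window
        if i >= k - 1:
            out.append(nums[dq[0]])  # front always holds the window max
    return out
```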
14. Longest Substring Without Repeating Characters
Pattern: Sliding window with hashset
Find the length of the longest substring with all unique characters.
Two pointers. Expand right. When you hit a duplicate, shrink from the left until the duplicate is removed. Track the maximum window size.
This is the foundational sliding window problem. Master this one and you have the template for most sliding window variants.
Signal: "longest/shortest subarray/substring satisfying some property" is almost always a sliding window.
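The expand/shrink template with a hashset (a sketch; function name is mine):

```python
def length_of_longest_substring(s):
    seen = set()
    left = best = 0
    for right, ch in enumerate(s):
        while ch in seen:        # shrink from the left until the duplicate is gone
            seen.remove(s[left])
            left += 1
        seen.add(ch)
        best = max(best, right - left + 1)
    return best
```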
15. Climbing Stairs
Pattern: 1D dynamic programming
You can climb 1 or 2 steps. How many ways to reach step n?
This is Fibonacci in disguise. The number of ways to reach step n is the sum of ways to reach step n-1 and step n-2.
dp[n] = dp[n-1] + dp[n-2]
This is the simplest DP problem that exists. Do not underestimate it. The recursive relationship here is the exact mental model that powers every harder DP problem.
Signal: any time the answer for state n depends on smaller states, define the recursion, trust it, and add memoization.
16. House Robber
Pattern: 1D DP with a constraint between adjacent choices
Array of house values. You can't rob adjacent houses. Maximize loot.
At each house i, you have two choices: rob it (get nums[i] + dp[i-2]) or skip it (get dp[i-1]).
dp[i] = max(nums[i] + dp[i-2], dp[i-1])
Signal: any time you're making choices along a sequence where taking something at position i prevents you from taking something at position i+1.
17. Coin Change
Pattern: DP with unbounded choices, minimize steps
Given coin denominations and a target amount, find the minimum number of coins to make the amount.
For each amount from 1 to target, try every coin. If the coin is no larger than the amount, dp[amount] = min(dp[amount], dp[amount - coin] + 1).
Signal: any "minimum number of steps to reach a target" where you can use items multiple times. This unbounded knapsack pattern appears in many forms.
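The unbounded-knapsack loop in full (a sketch; names are mine):

```python
def coin_change(coins, amount):
    INF = float('inf')
    dp = [0] + [INF] * amount  # dp[a] = min coins to make amount a
    for a in range(1, amount + 1):
        for coin in coins:
            if coin <= a and dp[a - coin] + 1 < dp[a]:
                dp[a] = dp[a - coin] + 1
    return dp[amount] if dp[amount] != INF else -1  # -1 if unreachable
```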
18. Number of Islands
Pattern: DFS/BFS on a grid
A 2D grid of 1s (land) and 0s (water). Count the number of islands (connected components of 1s).
For every unvisited 1, run DFS or BFS. Mark all reachable 1s as visited. Increment count by 1. Each DFS/BFS call discovers one complete island.
Signal: any time you need to count connected components in a grid or graph. DFS/BFS and mark-as-visited.
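A DFS sketch that marks visited cells by "sinking" them in place (assumes the grid holds '1'/'0' strings and mutates it; names are mine):

```python
def num_islands(grid):
    rows, cols = len(grid), len(grid[0])

    def sink(r, c):
        # Stop at the grid edge or at water.
        if r < 0 or r >= rows or c < 0 or c >= cols or grid[r][c] != '1':
            return
        grid[r][c] = '0'  # mark visited by turning land into water
        sink(r + 1, c); sink(r - 1, c); sink(r, c + 1); sink(r, c - 1)

    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == '1':
                count += 1  # each DFS discovers one complete island
                sink(r, c)
    return count
```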
19. Clone Graph
Pattern: DFS/BFS with a hashmap to track cloned nodes
Deep clone an undirected graph.
The insight: maintain a hashmap from original node to cloned node. When you encounter a node you've already cloned, return the clone from the map instead of cloning again. This handles cycles.
Signal: any graph traversal where you need to track visited nodes to avoid infinite loops. Also the standard pattern for deep-copying any structure with shared references.
20. Course Schedule
Pattern: Cycle detection with DFS on directed graph
Given courses and prerequisites, determine if you can finish all courses (i.e., is there a cycle in the dependency graph?).
Build a directed graph. Run DFS. Track nodes currently in the recursion stack (visiting state). If you revisit a node that's in the current path, you've found a cycle.
Signal: any "can we order/schedule these tasks given dependencies?" question is topological sort or cycle detection.
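The three-state DFS (unvisited / on the current path / done), sketched with prerequisites given as [course, prereq] pairs (names are mine):

```python
def can_finish(num_courses, prerequisites):
    graph = {i: [] for i in range(num_courses)}
    for course, prereq in prerequisites:
        graph[prereq].append(course)

    # 0 = unvisited, 1 = in the current DFS path, 2 = fully processed
    state = [0] * num_courses

    def has_cycle(node):
        if state[node] == 1:
            return True   # back edge: node is already on the current path
        if state[node] == 2:
            return False  # already checked, no cycle through here
        state[node] = 1
        if any(has_cycle(nxt) for nxt in graph[node]):
            return True
        state[node] = 2
        return False

    return not any(has_cycle(n) for n in range(num_courses))
```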
21. Binary Tree Level Order Traversal
Pattern: BFS on a binary tree using a queue
Return all node values level by level.
Use a queue. Start with root. Process all nodes at the current level, adding their children to the queue for the next level.
This is the template for every "process level by level" tree problem.
Signal: any time the answer depends on the depth or level of nodes in a tree.
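The level-by-level template, using a minimal node class I'm assuming here:

```python
from collections import deque

class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def level_order(root):
    if not root:
        return []
    out, queue = [], deque([root])
    while queue:
        level = []
        for _ in range(len(queue)):  # process exactly the current level
            node = queue.popleft()
            level.append(node.val)
            if node.left:
                queue.append(node.left)
            if node.right:
                queue.append(node.right)
        out.append(level)
    return out
```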
22. Maximum Depth of Binary Tree
Pattern: Recursive DFS, bottom-up
The maximum depth of a tree is 1 + max(depth of left subtree, depth of right subtree).
This is the simplest recursive tree problem. The pattern: trust that your recursive call returns the correct answer for the subtree, and build the answer for the current node from there.
Signal: any tree problem where the answer at a node depends on answers from its children. Trust the recursion.
23. Invert Binary Tree
Pattern: Recursive DFS, swap children
Swap the left and right child of every node.
Recursive: at each node, swap its left and right children. Then recursively invert both subtrees.
The insight is in the simplicity. This problem exists on this list because it forces you to trust that recursion works at each level. You don't need to think about the entire tree. Just this node and its immediate children.
Signal: any transformation that needs to happen at every node in a tree can be written this way.
24. Validate Binary Search Tree
Pattern: DFS with min/max range propagation
Check if a tree is a valid BST.
The naive check (each node's left child is smaller, right child is larger) fails. A node can be locally valid but globally invalid.
The correct insight: as you traverse down, propagate a valid range. Every node must be within its valid range. For a left child, the max valid value tightens. For a right child, the min valid value tightens.
Signal: any tree problem where a property depends on the node's ancestors as well as its local context.
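The range propagation in a sketch, again with a minimal assumed node class:

```python
class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def is_valid_bst(root, low=float('-inf'), high=float('inf')):
    if root is None:
        return True
    if not (low < root.val < high):
        return False  # violates a bound inherited from an ancestor
    # Going left tightens the upper bound; going right tightens the lower.
    return (is_valid_bst(root.left, low, root.val)
            and is_valid_bst(root.right, root.val, high))
```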
25. Word Break
Pattern: DP on strings
Given a string and a dictionary of words, can the string be segmented into dictionary words?
For each position i in the string, check if any word in the dictionary ends at i and if the string before that word can also be segmented (which you've already computed).
dp[i] = any(dp[j] and s[j:i] in wordSet for j in range(i))
Signal: any "can this string/sequence be decomposed into valid parts" problem. Check each possible last piece and rely on already-computed answers for everything before it.
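The full DP, where dp[i] answers "can the prefix s[:i] be segmented?" (a sketch; names are mine):

```python
def word_break(s, word_list):
    word_set = set(word_list)  # O(1) membership checks
    n = len(s)
    dp = [True] + [False] * n  # dp[0]: the empty prefix is trivially segmentable
    for i in range(1, n + 1):
        # Try every possible last word s[j:i]; the rest is already computed.
        dp[i] = any(dp[j] and s[j:i] in word_set for j in range(i))
    return dp[n]
```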
What to do with these 25 problems
Do them in this order. Not random.
The list is sequenced: hashmap basics first (1-7), then stack and binary search (8-10), then two pointers and sliding window (11-14), then DP (15-17), then graphs and trees (18-25).
Each group builds on the mental models established by the previous group.
For each problem:
- Try it for 30 minutes before looking at anything
- After solving or after 30 minutes of full effort, write one sentence: "The key insight was ___"
- One week after solving it, come back and try to solve it again without looking at your previous solution
That last step is the one most people skip. It's also the most important one. The first solve is information. The second solve, a week later, is the moment you find out whether it's actually in your head.
After these 25, what?
These 25 problems are not the endpoint. They're the foundation.
Once you've done these properly, you have a working mental model for 9 of the 10 core patterns. From here, every new problem you solve is either reinforcing a pattern you already have or introducing a slight variation that sharpens it.
That's when problem solving starts to feel like recognition instead of mystery.
That's what you're working toward.
Stop grinding. Start finishing.
On FinishDSA, these 25 problems are part of a 250-problem adaptive roadmap organized by exactly these patterns. The AI tracks which patterns you're genuinely understanding and adjusts what you see next. Not based on what you've solved. Based on what you actually know.

