Conversation
anselrognlie
left a comment
✨💫 Nice job, Lux. I left some comments on your implementation below.
🟢
Time Complexity: O(log n)
Space Complexity: O(log n) due to recursive stack calls
✨ Great. You're exactly right that it's due to the recursive call in heap_up that the space complexity is O(log n). If heap_up were implemented iteratively, this would only require O(1) space complexity since the stack size wouldn't depend on the heap depth.
Space Complexity: O(log n) due to recursive stack calls
"""
pass
if self.empty():
✨ Nice use of your own helper method!
Time Complexity: O(log n)
Space Complexity: O(log n) due to recursive stack calls
✨ Nice. Just as for add, you're right that remove's O(log n) space complexity is due to the recursive heap_down implementation. We could achieve O(1) space complexity if we used an iterative approach.
Time complexity: O(1)
Space complexity: O(1)
"""
pass
return len(self.store) == 0
Remember that an empty list is falsy
return not self.store
Time complexity: O(1)
Space complexity: O(1)
👀 This is where the O(log n) time and space complexity in add comes from. Since heap_up calls itself recursively, the worst case will be when the new value needs to be moved all the way up the heap, which will have a height of log n. So both the time and space complexity (due to the stack growth) are O(log n). If we implemented this instead with an iterative approach, the space complexity would be O(1).
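For illustration, a minimal sketch of an iterative heap_up, assuming the nodes in self.store compare by a key attribute and reusing the same two-index swap helper seen elsewhere in this PR (the node details are assumptions, not the exact code here):

def heap_up(self, index):
    # Walk the value up toward the root with a loop instead of recursion,
    # so the call stack never grows: O(log n) time, O(1) space.
    while index > 0:
        parent_index = (index - 1) // 2
        if self.store[parent_index].key <= self.store[index].key:
            break  # min-heap property already holds
        self.swap(parent_index, index)
        index = parent_index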
self.swap(parent_index, index)
self.heap_up(parent_index)

def heap_down(self, index):
✨ Nice set of conditionals to narrow in on where to swap. Notice there's a little duplication since we call swap, heap_down in two places. We could try to fully determine which child we're going to swap with first, and then have a single code flow to swap and re-heapify.
Though not prompted, like heap_up, heap_down is also O(log n) in both time and space complexity. The worst case for re-heapifying is when the new root needs to move all the way back down to a leaf, so the stack growth will be the height of the heap, which is log n. If we implemented this iteratively instead, the space complexity would be O(1).
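As an example of that single code flow, here's a rough sketch (the 2*index+1 / 2*index+2 child math and the key-based comparison are assumptions about the array layout and node class, which may differ from yours):

def heap_down(self, index):
    # Determine which child (if any) should move up first, then use one
    # shared swap/re-heapify path instead of duplicating it per branch.
    left = 2 * index + 1
    right = 2 * index + 2
    smallest = index

    if left < len(self.store) and self.store[left].key < self.store[smallest].key:
        smallest = left
    if right < len(self.store) and self.store[right].key < self.store[smallest].key:
        smallest = right

    if smallest != index:
        self.swap(index, smallest)
        self.heap_down(smallest)

Swapping that final recursive call for a loop (set index = smallest and repeat) would also bring the space complexity down to O(1), as noted above.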
Time Complexity: O(n log n) where n is the number of items in the list
Space Complexity: O(n) where n is the number of items in the list
✨ Great. Since sorting with a heap reduces to building up a heap of n items one-by-one (each add taking O(log n)), then pulling them back out again (each of the n removes also taking O(log n)), we end up with a time complexity of O(2n log n) → O(n log n). As for space, we do need to account for the O(log n) stack space consumed during each add and remove, but it isn't cumulative (it's only in use during that particular call). However, the internal store for the MinHeap does grow with the size of the input list. So the maximum space would be O(n + log n) → O(n), since n is the larger term.
Note that a fully in-place solution (O(1) space complexity) would require both avoiding the recursive calls and working directly with the originally provided list (no internal store).
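Putting that together, a rough sketch of the add-then-drain shape (assuming add takes a single value, which may not match the exact signature in this implementation; the drain loop matches the empty-based suggestion further down):

def heap_sort(list):
    # Keeping the parameter name from the PR, though it shadows the builtin list.
    # Build the heap one item at a time: n adds, each O(log n).
    heap = MinHeap()
    for item in list:
        heap.add(item)

    # Pull items back out in sorted order into a new result list:
    # the heap's O(n) internal store dominates the O(log n) stack use.
    result = []
    while not heap.empty():
        result.append(heap.remove())
    return result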
for i in range(len(list)):
    list[i] = heap.remove()
return list
Note that since this isn't a fully in-place solution (the MinHeap has an O(n) internal store), we don't necessarily need to modify the passed-in list. The tests are written to check the return value, so we could unpack the heap into a new result list to avoid mutating the input.
Also, in this situation, since we built the heap, we also "know" the number of items in the heap. So it's OK to iterate a fixed number of times. But if we were pulling things out of a heap more generally, we would want to make use of the empty helper as follows:
result = []
while not heap.empty():
    result.append(heap.remove())
return result
Heaps Practice
Congratulations! You're submitting your assignment!
Comprehension Questions
Were the heap_up & heap_down methods useful? Why?