[MUSIC] Now that we understand the basic operations of a heap, let's understand why we want to use a heap over other data structures, even though this looks remarkably similar to a binary tree. The main reason we want to use a heap is that we can be really, really efficient about actually building a heap from the heap operations we have. So let's consider an example where we want to build a heap out of a string. If we look at this example, I have the string BUILDHEAPNOW. This string is simply an unordered array, just like we've seen before. The characters in BUILDHEAPNOW are not ordered whatsoever, and I have BUILDHEAPNOW also listed out in our tree format right here. Given all of this, we can think of three different ways that we might be able to build a heap. The first way is we could simply sort the array. If we have a sorted array where the smallest element's at the beginning and the largest element's at the end of our array, then we have a heap. By definition, if the smallest element's at index one, then it's going to be smaller than everything else later in the array. So this is good, but it runs in O(n log n) time. We don't have time to do a sort if we really want to make an efficient algorithm. The second approach we can think about is: what if we just called insert a number of times? If we called insert for every single element, we're going to have to perform a whole bunch of heapifyUp operations. In fact, we're going to do n insert operations that each take O(log n) time to do the tree operation. So multiple calls to insert are again going to cause an O(n log n) runtime, again an algorithm we just don't have time for. The last thing we can ask is: can we do better, with an O(n) operation? Thinking about these things, we can come to a third approach that involves combining two different ideas to create a heap out of a string. 
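As a rough sketch of that second approach (this is my own illustrative code, not the lecture's; the function and helper names are made up), here is what building a min-heap by repeated insertion looks like, using a 1-indexed array so the root sits at index 1 as in the lecture:

```python
def heapify_up(heap, i):
    """Swap the node at index i up toward the root while it is
    smaller than its parent (the parent of index i lives at i // 2)."""
    while i > 1 and heap[i] < heap[i // 2]:
        heap[i], heap[i // 2] = heap[i // 2], heap[i]
        i //= 2

def build_heap_by_insert(items):
    heap = [None]                        # index 0 unused; root is index 1
    for x in items:                      # n inserts ...
        heap.append(x)                   # place at the end of the array
        heapify_up(heap, len(heap) - 1)  # ... each up to O(log n) swaps
    return heap                          # total: O(n log n)

heap = build_heap_by_insert("BUILDHEAPNOW")
heap[1]  # 'A', the minimum character
```

Each insert may bubble a node all the way to the root, which is where the O(n log n) bound comes from.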
Or out of any unsorted list that we're given. So if we take a look at this example, we can see that the first option is we can sort it. The second thing we can do is a buildHeap using heapifyUp. The third approach is really, really clever. Let's think about only using the heapifyDown operation. By only using the heapifyDown operation, what we're saying is that it doesn't actually matter what's in the very last level of the tree. Everything in the very last level is a single node with no children, so each of those one-node subtrees already has the min-heap property. And in fact, it's not just the very last level, but the very last level plus any nodes that don't have children. So this includes the node E here in the tree as well. This is everything past the parent of the very last node. So only the first half of the nodes in the array, only things in the first half of the array, need to actually have their heap property restored. Because remember what heapifyDown does: heapifyDown takes a node and makes sure that that node is placed down inside of our tree, in the proper location. In doing so, the first node we have to look at is H. Looking at the node H means that we need to ensure that the subtree rooted at H is a heap. In this case, it is indeed a heap already. Then we can simply look at D. At D, we ask: is D smaller than both N and O? Only if D is smaller than both N and O is the heap property maintained. Here it is also maintained. We can go back to the previous slide and keep looking at this. We can compare L with A and P. Here we can go ahead and swap A with L. Notice we've maintained the heap property, doing this in the array as well. We swap A here; L goes down here. And at this point, we know that everything in the bottom two levels, everything here, is a correct heap. So what we were able to do is just three operations, three heapifyDown operations. 
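The walkthrough above can be sketched as code (again my own sketch, not the lecture's exact implementation): start at the parent of the very last node, at index n // 2 in a 1-indexed array, and call heapifyDown on each position walking backward to the root, skipping the leaves entirely:

```python
def heapify_down(heap, i, n):
    """Push the node at index i down until both children are larger."""
    while 2 * i <= n:                      # while index i has a child
        child = 2 * i                      # left child
        if child + 1 <= n and heap[child + 1] < heap[child]:
            child += 1                     # pick the smaller child
        if heap[i] <= heap[child]:
            break                          # heap property already holds
        heap[i], heap[child] = heap[child], heap[i]
        i = child                          # keep sinking the node

def build_heap(items):
    heap = [None] + list(items)            # 1-indexed array, root at 1
    n = len(heap) - 1
    for i in range(n // 2, 0, -1):         # leaves are already heaps
        heapify_down(heap, i, n)
    return heap

heap = build_heap("BUILDHEAPNOW")
heap[1]  # 'A', the minimum character
```

Note that the loop runs only n // 2 times; for the 12-character string BUILDHEAPNOW, that is just six heapifyDown calls.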
To ensure that, at this point in the tree, we have a correct heap. Next, we only need to go and finish up the very last three elements. In finishing up those last three elements, we're able to perform a series of just n/2 heapifyDown operations in total. And we know that after heapifying down, everything underneath our current node is already a valid heap. Because we're doing just n/2 operations, and the amount of work that we're doing per operation is capped, so that on average we're not doing more than about constant-time work per operation, we know that we can actually build a heap in O(n) time. So the key reason that we actually care about building heaps, and that we care about the buildHeap algorithm, is the fact that, given any array representation of data, we can build a heap representation of it in just O(n) time. Far faster than sorting the array, far faster than inserting into a binary tree one element at a time. So that means, as we're thinking about algorithms, if we need to know the minimum elements of a list, we can set that up with just an O(n) operation. We'll dive into the analysis of this, how each of these operations performs, and why we care about the heap in the next video. So I'll see you there.
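This same O(n) buildHeap idea is what Python's standard library exposes as heapq.heapify, which the documentation says transforms a list into a heap in place, in linear time, so we can sanity-check the result without writing any heap code ourselves (heapq uses 0-indexed arrays, so the root is at index 0):

```python
import heapq

data = list("BUILDHEAPNOW")
heapq.heapify(data)   # O(n) buildHeap, not O(n log n) repeated inserts
data[0]               # 'A' -- the minimum character sits at the root
```

Exactly as the lecture argues: one linear pass sets up constant-time access to the minimum.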