A Binary Search Tree (BST) is a binary tree in which each vertex has up to 2 children and that satisfies the BST property: all vertices in the left subtree of a vertex must hold a value smaller than its own, and all vertices in the right subtree of a vertex must hold a value larger than its own (we assume that all values are distinct integers in this visualization; a small tweak is needed to cater for duplicates/non-integers). Try clicking Search(7) for a sample animation of searching for a random value ∈ [1..99] in the random BST above.


An Adelson-Velskii Landis (AVL) tree is a self-balancing BST that maintains its height at O(log N) when it has N vertices.


Remarks: By default, we show e-Lecture Mode for first-time (or non-logged-in) visitors.
If you are an NUS student and a repeat visitor, please login.

🕑

To toggle between the standard Binary Search Tree and the AVL Tree (only different behavior during Insertion and Removal of an Integer), select the respective header.


We also have URL shortcut to quickly access the AVL Tree mode, which is https://visualgo.net/en/avl (you can change the 'en' to your two characters preferred language - if available).


Pro-tip 1: Since you are not logged-in, you may be a first-time visitor (or not an NUS student) who is not aware of the following keyboard shortcuts to navigate this e-Lecture mode: [PageDown]/[PageUp] to go to the next/previous slide, respectively (and if the drop-down box is highlighted, you can also use [→ or ↓/← or ↑] to do the same), and [Esc] to toggle between this e-Lecture mode and the exploration mode.

🕑

BST (and especially balanced BST like AVL Tree) is an efficient data structure to implement a certain kind of Table (or Map) Abstract Data Type (ADT).


A Table ADT must support at least the following three operations as efficiently as possible:

  1. Search(v) — determine if v exists in the ADT or not,
  2. Insert(v) — insert v into the ADT,
  3. Remove(v) — remove v from the ADT.

Reference: See similar slide in Hash Table e-Lecture.


Pro-tip 2: We designed this visualization and this e-Lecture mode to look good on 1366x768 resolution or larger (typical modern laptop resolution in 2021). We recommend using Google Chrome to access VisuAlgo. Go to full screen mode (F11) to enjoy this setup. However, you can use zoom-in (Ctrl +) or zoom-out (Ctrl -) to calibrate this.

🕑

We are referring to the Table ADT where the keys need to be ordered (as opposed to the Table ADT where the keys do not need to be ordered).


This special requirement of Table ADT will be made clearer in the next few slides.


Pro-tip 3: Other than using the typical media UI at the bottom of the page, you can also control the animation playback using keyboard shortcuts (in Exploration Mode): Spacebar to play/pause/replay the animation, [←]/[→] to step the animation backwards/forwards, respectively, and -/+ to decrease/increase the animation speed, respectively.

🕑

If we use unsorted array/vector to implement Table ADT, it can be inefficient:

  1. Search(v) runs in O(N), as we may end up exploring all N elements of the ADT if v actually does not exist,
  2. Insert(v) can be implemented in O(1), just put v at the back of the array,
  3. Remove(v) runs in O(N) too as we have to first search for v which is already O(N) and later close the resulting gap after deletion — also in O(N).
🕑

If we use a sorted array/vector to implement the Table ADT, we can improve the Search(v) performance but weaken the Insert(v) performance:

  1. Search(v) can now be implemented in O(log N), as we can now use binary search on the sorted array,
  2. Insert(v) now runs in O(N) as we need to implement an insertion-sort-like strategy to keep the array sorted,
  3. Remove(v) runs in O(N) because even if Search(v) runs in O(log N), we still need to close the gap after deletion — which is in O(N).
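
For concreteness, here is a minimal C++ sketch of these sorted array/vector trade-offs (the function names are illustrative and are not part of VisuAlgo's BSTDemo.cpp):

#include <algorithm>
#include <vector>

// Search(v): O(log N) binary search on a sorted vector.
bool tableSearch(const std::vector<int>& a, int v) {
  return std::binary_search(a.begin(), a.end(), v);
}

// Insert(v): O(log N) to find the position, but O(N) to shift the larger elements.
void tableInsert(std::vector<int>& a, int v) {
  a.insert(std::lower_bound(a.begin(), a.end(), v), v);
}

// Remove(v): O(log N) to find v, but O(N) to close the resulting gap.
void tableRemove(std::vector<int>& a, int v) {
  auto it = std::lower_bound(a.begin(), a.end(), v);
  if (it != a.end() && *it == v) a.erase(it);
}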
🕑

The goal for this e-Lecture is to introduce BST and then balanced BST (AVL Tree) data structure so that we can implement the basic Table ADT operations: Search(v), Insert(v), Remove(v), and a few other Table ADT operations — see the next slide — in O(log N) time — which is much smaller than N.


PS: Some of the more experienced readers may notice that ∃ another data structure that can implement the three basic Table ADT operations in faster time, but read on...


N        ≈ 1 000      ≈ 1 000 000      ≈ 1 000 000 000
log N    10           Only 20 :O       Only 30 :O:O
🕑

On top of the basic three, there are a few other possible Table ADT operations:

  1. Find the Min()/Max() element,
  2. Find the Successor(v) — 'next larger'/Predecessor(v) — 'previous smaller' element,
  3. List elements in sorted order,
  4. Operation X & Y - hidden for pedagogical purpose in an NUS module,
  5. There are other possible operations.

Discussion: What are the best possible implementations for the first three additional operations if we are limited to using a [sorted|unsorted] array/vector?

🕑

The content of this interesting slide (the answer to the usually intriguing discussion point from the earlier slide) is hidden and only available to legitimate CS lecturers worldwide. This mechanism is used in the various flipped classrooms in NUS.


If you are really a CS lecturer (or an IT teacher) (outside of NUS) and are interested to know the answers, please drop an email to stevenhalim at gmail dot com (show your University staff profile/relevant proof to Steven) for Steven to manually activate this CS lecturer-only feature for you.


FAQ: This feature will NOT be given to anyone else who is not a CS lecturer.

🕑

A simpler data structure that can be used to implement the Table ADT is the Linked List.


Quiz: Can we perform all three basic Table ADT operations: Search(v)/Insert(v)/Remove(v) efficiently (read: faster than O(N)) using a Linked List?

Yes
No


Discussion: Why?

🕑

The content of this interesting slide (the answer to the usually intriguing discussion point from the earlier slide) is hidden and only available to legitimate CS lecturers worldwide. This mechanism is used in the various flipped classrooms in NUS.


If you are really a CS lecturer (or an IT teacher) (outside of NUS) and are interested to know the answers, please drop an email to stevenhalim at gmail dot com (show your University staff profile/relevant proof to Steven) for Steven to manually activate this CS lecturer-only feature for you.


FAQ: This feature will NOT be given to anyone else who is not a CS lecturer.

🕑

Another data structure that can be used to implement Table ADT is Hash Table. It has very fast Search(v), Insert(v), and Remove(v) performance (all in expected O(1) time).


Quiz: So what is the point of learning this BST module if Hash Table can do the crucial Table ADT operations in unlikely-to-be-beaten expected O(1) time?

There are valid reasons, which are ____
There is no point, so this BST module can be ignored


Discuss the answer above! Hint: Go back 4 slides.

🕑

The content of this interesting slide (the answer to the usually intriguing discussion point from the earlier slide) is hidden and only available to legitimate CS lecturers worldwide. This mechanism is used in the various flipped classrooms in NUS.


If you are really a CS lecturer (or an IT teacher) (outside of NUS) and are interested to know the answers, please drop an email to stevenhalim at gmail dot com (show your University staff profile/relevant proof to Steven) for Steven to manually activate this CS lecturer-only feature for you.


FAQ: This feature will NOT be given to anyone else who is not a CS lecturer.

🕑

We will now introduce BST data structure. See the visualization of an example BST above!


The root vertex does not have a parent. There can only be one root vertex in a BST. A leaf vertex does not have any child. There can be more than one leaf vertex in a BST. Vertices that are not leaves are called internal vertices. Sometimes the root vertex is not included in the definition of internal vertex, as the root of a BST with only one vertex can actually fit the definition of a leaf too.


In the example above, vertex 15 is the root vertex, vertices {5, 7, 50} are the leaves, and vertices {4, 6, 15 (also the root), 23, 71} are the internal vertices.

🕑

Each vertex has at least 4 attributes: parent, left, right, key/value/data (there are potentially other attributes). Not all attributes will be used for all vertices, e.g., the root vertex will have its parent attribute = NULL. Some other implementations separate the key (used for ordering the vertices in the BST) from the actual satellite data associated with that key.


The left/right child of a vertex (except a leaf) is drawn on the left/right and below of that vertex, respectively. The parent of a vertex (except the root) is drawn above that vertex. The (integer) key of each vertex is drawn inside the circle that represents that vertex. In the example above, (key) 15 has 6 as its left child and 23 as its right child. Thus the parent of 6 (and of 23) is 15.
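
As an illustration of these attributes, here is a minimal C++ sketch of one possible vertex representation (the field names are illustrative; the actual BSTDemo.cpp may organise them differently):

struct BSTVertex {
  int key;            // the (distinct) integer stored in this vertex
  BSTVertex* parent;  // nullptr for the root vertex
  BSTVertex* left;    // nullptr if there is no left child
  BSTVertex* right;   // nullptr if there is no right child
  int height;         // used later by the AVL Tree augmentation
  BSTVertex(int k) : key(k), parent(nullptr), left(nullptr), right(nullptr), height(0) {}
};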

🕑

As we do not allow duplicate integers in this visualization, the BST property is as follows: For every vertex X, all vertices in the left subtree of X are strictly smaller than X and all vertices in the right subtree of X are strictly greater than X.


In the example above, the vertices on the left subtree of the root 15: {4, 5, 6, 7} are all smaller than 15 and the vertices on the right subtree of the root 15: {23, 50, 71} are all greater than 15. You can recursively check BST property on other vertices too.


For a more complete implementation, we should consider duplicate integers too. The easiest way to support this is to add one more attribute to each vertex: the frequency of occurrence of X (this visualization will be upgraded with this feature soon).

🕑

We provide visualization for the following common BST/AVL Tree operations:

  1. Query operations (the BST structure remains unchanged):
    1. Search(v),
    2. Predecessor(v) (and similarly Successor(v)), and
    3. Inorder/Preorder Traversal (we will add Postorder Traversal soon),
  2. Update operations (the BST structure may likely change):
    1. Insert(v),
    2. Remove(v), and
    3. Create BST (several criteria).
🕑

There are a few other BST (Query) operations that have not been visualized in VisuAlgo:

  1. Rank(v): Given a key v, determine what is its rank (1-based index) in the sorted order of the BST elements. That is, Rank(FindMin()) = 1 and Rank(FindMax()) = N. If v does not exist, we can report -1.
  2. Select(k): Given a rank k, 1 ≤ k ≤ N, determine the key v that has rank k in the BST. In other words, find the k-th smallest element in the BST. That is, Select(1) = FindMin() and Select(N) = FindMax().

The details of these two operations are currently hidden for pedagogical purpose in a certain NUS module.

🕑

A data structure that is only efficient if there are no (or rare) updates, especially the insert and/or remove operation(s), is called a static data structure.


A data structure that remains efficient even when there are many update operations is called a dynamic data structure. BST and especially balanced BST (e.g., AVL Tree) are in this category.

🕑

Because of the way data (distinct integers for this visualization) is organised inside a BST, we can binary search for an integer v efficiently (hence the name of Binary Search Tree).


First, we set the current vertex = root and then check whether the current vertex is smaller than/equal to/larger than the integer v that we are searching for. We then go to the right subtree/stop/go to the left subtree, respectively. We keep doing this until we either find the required vertex or we reach an empty subtree (v is not found).


On the example BST above, try clicking Search(23) (found after 2 comparisons), Search(7) (found after 3 comparisons), Search(21) (not found after 2 comparisons — at this point we will realize that we cannot find 21).
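
In code, this search could look like the following C++ sketch (an iterative version that reuses the BSTVertex struct sketched earlier; the function name is illustrative):

// Search(v): returns the vertex holding v, or nullptr if v is not in the BST.
// Runs in O(h) where h is the height of the BST.
BSTVertex* search(BSTVertex* root, int v) {
  BSTVertex* cur = root;
  while (cur != nullptr) {
    if (v == cur->key) return cur;                   // found, stop here
    cur = (v < cur->key) ? cur->left : cur->right;   // go left or right
  }
  return nullptr;                                    // fell off the tree: not found
}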

🕑

Similarly, because of the way data is organised inside a BST, we can find the minimum/maximum element (an integer in this visualization) by starting from root and keep going to the left/right subtree, respectively.


Try clicking FindMin() and FindMax() on the example BST shown above. The answers should be 4 and 71 (both after comparing against 3 integers from root to leftmost vertex/rightmost vertex, respectively).
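
A possible C++ sketch of these two operations (assuming a non-empty BST and the BSTVertex struct from earlier):

// FindMin()/FindMax(): keep going left/right from the root, respectively. Both run in O(h).
int findMin(BSTVertex* root) {
  while (root->left != nullptr) root = root->left;    // the leftmost vertex holds the minimum
  return root->key;
}
int findMax(BSTVertex* root) {
  while (root->right != nullptr) root = root->right;  // the rightmost vertex holds the maximum
  return root->key;
}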

🕑

Search(v)/FindMin()/FindMax() operations run in O(h) where h is the height of the BST.


But note that this h can be as tall as O(N) in a normal BST, as shown in the random 'skewed right' example above. Try Search(100) (this value should not exist as we only use random integers between [1..99] to generate this random BST), and the Search routine will check all the way from the root to the only leaf in O(N) time — not efficient.

🕑

Because of the BST properties, we can find the Successor of an integer v (assume that we already know where integer v is located from earlier call of Search(v)) as follows:

  1. If v has a right subtree, the minimum integer in the right subtree of v must be the successor of v. Try Successor(23) (should be 50).
  2. If v does not have a right subtree, we need to traverse the ancestor(s) of v until we find 'a right turn' to vertex w (or alternatively, until we find the first vertex w that is greater than vertex v). Once we find vertex w, we will see that vertex v is the maximum element in the left subtree of w. Try Successor(7) (should be 15).
  3. If v is the maximum integer in the BST, v does not have a successor. Try Successor(71) (should be none).
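
These three cases could be translated into the following C++ sketch (illustrative; it assumes the BSTVertex struct from earlier and that parent pointers are maintained):

// Successor(v): x is the vertex holding v (found by an earlier Search(v)). Runs in O(h).
BSTVertex* successor(BSTVertex* x) {
  if (x->right != nullptr) {                            // case 1: take the minimum of the right subtree
    BSTVertex* cur = x->right;
    while (cur->left != nullptr) cur = cur->left;
    return cur;
  }
  BSTVertex* p = x->parent;                             // case 2: walk up until we make 'a right turn'
  while (p != nullptr && x == p->right) { x = p; p = p->parent; }
  return p;                                             // case 3: nullptr if v is the maximum (no successor)
}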
🕑

The operations for Predecessor of an integer v are defined similarly (just the mirror of Successor operations).


Try the same three corner cases (but mirrored): Predecessor(6) (should be 5), Predecessor(50) (should be 23), Predecessor(4) (should be none).


At this point, stop and ponder these three Successor(v)/Predecessor(v) cases to ensure that you understand these concepts.

🕑

Predecessor(v) and Successor(v) operations run in O(h) where h is the height of the BST.


But recall that this h can be as tall as O(N) in a normal BST as shown in the random 'skewed right' example above. If we call Successor(FindMax()), we will go up from that last leaf back to the root in O(N) time — not efficient.

🕑

We can perform an Inorder Traversal of this BST to obtain a list of sorted integers inside this BST (in fact, if we 'flatten' the BST into one line, we will see that the vertices are ordered from smallest/leftmost to largest/rightmost).


Inorder Traversal is a recursive method whereby we visit the left subtree first (exhausting all items in the left subtree), then visit the current root, before finally exploring the right subtree and all items in it. Without further ado, let's try Inorder Traversal to see it in action on the example BST above.
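
The recursion is short enough to show in full (a C++ sketch, reusing the BSTVertex struct from earlier):

#include <cstdio>

// Inorder Traversal: left subtree, current vertex, right subtree.
// Prints the keys in sorted (ascending) order and runs in O(N).
void inorder(BSTVertex* x) {
  if (x == nullptr) return;   // base case: empty subtree
  inorder(x->left);
  printf("%d ", x->key);
  inorder(x->right);
}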

🕑

Inorder Traversal runs in O(N), regardless of the height of the BST.


Discussion: Why?


PS: Some people call insertion of N unordered integers into a BST in O(N log N) and then performing the O(N) Inorder Traversal as 'BST sort'. It is rarely used though as there are several easier-to-use (comparison-based) sorting algorithms than this.

🕑

The content of this interesting slide (the answer to the usually intriguing discussion point from the earlier slide) is hidden and only available to legitimate CS lecturers worldwide. This mechanism is used in the various flipped classrooms in NUS.


If you are really a CS lecturer (or an IT teacher) (outside of NUS) and are interested to know the answers, please drop an email to stevenhalim at gmail dot com (show your University staff profile/relevant proof to Steven) for Steven to manually activate this CS lecturer-only feature for you.


FAQ: This feature will NOT be given to anyone else who is not a CS lecturer.

🕑

We have included the animation for Preorder Traversal but we have not done the same for the Postorder tree traversal method.


Basically, in Preorder Traversal, we visit the current root before going to left subtree and then right subtree. For the example BST shown in the background, we have: {{15}, {6, 4, 5, 7}, {23, 71, 50}}. PS: Do you notice the recursive pattern? root, members of left subtree of root, members of right subtree of root.


In Postorder Traversal, we visit the left subtree and right subtree first, before visiting the current root. For the example BST shown in the background, we have: {{5, 4, 7, 6}, {50, 71, 23}, {15}}.

🕑

We can insert a new integer into a BST by performing a similar operation to Search(v). But this time, instead of reporting that the new integer is not found, we create a new vertex at the insertion point and put the new integer there. Try Insert(60) on the example above.
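
A recursive C++ sketch of this plain BST insertion (no AVL rebalancing yet; it reuses the BSTVertex struct from earlier and silently ignores duplicates):

// Insert(v): returns the (possibly new) root of this subtree. Runs in O(h).
BSTVertex* insert(BSTVertex* x, int v) {
  if (x == nullptr) return new BSTVertex(v);             // insertion point found
  if (v < x->key)      { x->left  = insert(x->left, v);  x->left->parent  = x; }
  else if (v > x->key) { x->right = insert(x->right, v); x->right->parent = x; }
  return x;                                              // v == x->key: duplicate, do nothing
}

Usage: root = insert(root, 60);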

🕑

Insert(v) runs in O(h) where h is the height of the BST.


By now you should be aware that this h can be as tall as O(N) in a normal BST as shown in the random 'skewed right' example above. If we call Insert(FindMax()+1), i.e. we insert a new integer greater than the current max, we will go from root down to the last leaf and then insert the new integer as the right child of that last leaf in O(N) time — not efficient (note that we only allow up to h=9 in this visualization).

🕑

Quiz: Inserting integers [1,10,2,9,3,8,4,7,5,6] one by one in that order into an initially empty BST will result in a BST of height:

10
9
The height cannot be determined
8


Pro-tip: You can use the 'Exploration mode' to verify the answer.

🕑

We can remove an integer from a BST by performing a similar operation to Search(v).


If v is not found in the BST, we simply do nothing.


If v is found in the BST, we do not report that the existing integer v is found, but instead, we perform one of the three possible removal cases that will be elaborated in three separate slides (we suggest that you try each of them one by one).

🕑

The first case is the easiest: Vertex v is currently one of the leaf vertices of the BST.


Deletion of a leaf vertex is very easy: We just remove that leaf vertex — try Remove(5) on the example BST above (second click onwards after the first removal will do nothing — please refresh this page or go to another slide and return to this slide instead).


This part is clearly O(1) — on top of the earlier O(h) search-like effort.

🕑

The second case is also not that hard: Vertex v is an (internal/root) vertex of the BST and it has exactly one child. Removing v without doing anything else will disconnect the BST.


Deletion of a vertex with one child is not that hard: We connect that vertex's only child with that vertex's parent — try Remove(23) on the example BST above (second click onwards after the first removal will do nothing — please refresh this page or go to another slide and return to this slide instead).


This part is also clearly O(1) — on top of the earlier O(h) search-like effort.

🕑

The third case is the most complex among the three: Vertex v is an (internal/root) vertex of the BST and it has exactly two children. Removing v without doing anything else will disconnect the BST.


Deletion of a vertex with two children is as follows: We replace that vertex with its successor, and then delete its duplicated successor in its right subtree — try Remove(6) on the example BST above (second click onwards after the first removal will do nothing — please refresh this page or go to another slide and return to this slide instead).


This part requires O(h) due to the need to find the successor vertex — on top of the earlier O(h) search-like effort.
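
Putting the three removal cases together, a recursive C++ sketch could look like this (illustrative; copying the successor's key into the vertex achieves the same effect as the replacement described above):

// Remove(v): returns the (possibly new) root of this subtree. Runs in O(h).
BSTVertex* remove(BSTVertex* x, int v) {
  if (x == nullptr) return nullptr;                     // v not found: do nothing
  if (v < x->key)      x->left  = remove(x->left, v);
  else if (v > x->key) x->right = remove(x->right, v);
  else {                                                // v is found in x
    if (x->left == nullptr && x->right == nullptr) {    // case 1: leaf, just remove it
      delete x; return nullptr;
    }
    if (x->left == nullptr || x->right == nullptr) {    // case 2: one child, bypass x
      BSTVertex* child = (x->left != nullptr) ? x->left : x->right;
      child->parent = x->parent;
      delete x; return child;
    }
    BSTVertex* succ = x->right;                         // case 3: two children,
    while (succ->left != nullptr) succ = succ->left;    // find the successor (min of the right subtree),
    x->key = succ->key;                                 // copy it into x,
    x->right = remove(x->right, succ->key);             // then delete the duplicate from the right subtree
  }
  return x;
}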

🕑

This case 3 warrants further discussions:

  1. Why is replacing a vertex B that has two children with its successor C always a valid strategy?
  2. Can we replace vertex B that has two children with its predecessor A instead? Why or why not?
🕑

The content of this interesting slide (the answer to the usually intriguing discussion point from the earlier slide) is hidden and only available to legitimate CS lecturers worldwide. This mechanism is used in the various flipped classrooms in NUS.


If you are really a CS lecturer (or an IT teacher) (outside of NUS) and are interested to know the answers, please drop an email to stevenhalim at gmail dot com (show your University staff profile/relevant proof to Steven) for Steven to manually activate this CS lecturer-only feature for you.


FAQ: This feature will NOT be given to anyone else who is not a CS lecturer.

🕑

Remove(v) runs in O(h) where h is the height of the BST. Removal case 3 (deletion of a vertex with two children) is the 'heaviest' one, but it still does not exceed O(h).


As you should fully understand by now, h can be as tall as O(N) in a normal BST, as shown in the random 'skewed right' example above. If we call Remove(FindMax()), i.e., we remove the current maximum integer, we will go from the root down to the last leaf in O(N) time before removing it — not efficient.

🕑

To make life easier in 'Exploration Mode', you can create a new BST using these options:

  1. Empty BST (you can then insert a few integers one by one),
  2. The two e-Lecture Examples that you may have seen several times so far,
  3. Random BST (which is unlikely to be extremely tall),
  4. Skewed Left/Right BST (tall BST with N vertices and N-1 linked-list like edges, to showcase the worst case behavior of BST operations; disabled in AVL Tree mode).
🕑

We are midway through the explanation of this BST module. So far we notice that many basic Table ADT operations run in O(h) and h can be as tall as N-1 edges like the 'skewed left' example shown — inefficient :(...


So, is there a way to make our BSTs 'not that tall'?


PS: If you want to study how these basic BST operations are implemented in a real program, you can download this BSTDemo.cpp.

🕑

At this point, we encourage you to press [Esc] or click the X button on the bottom right of this e-Lecture slide to enter the 'Exploration Mode' and try various BST operations yourself to strengthen your understanding about this versatile data structure.


When you are ready to continue with the explanation of balanced BST (we use AVL Tree as our example), press [Esc] again or switch the mode back to 'e-Lecture Mode' from the top-right corner drop down menu. Then, use the slide selector drop down list to resume from this slide 12-1.

🕑

We have seen from earlier slides that most of our BST operations (except Inorder Traversal) run in O(h) where h is the height of the BST, and h can be as tall as N-1.


We will continue our discussion with the concept of balanced BST so that h = O(log N).

🕑

There are several known implementations of balanced BST, too many to be visualized and explained one by one in VisuAlgo.


We focus on the AVL Tree (Adelson-Velskii & Landis, 1962), which is named after its two inventors: Adelson-Velskii and Landis.


Other balanced BST implementations (more or less as good or slightly better in terms of constant-factor performance) are: Red-Black Tree, B-trees/2-3-4 Tree (Bayer & McCreight, 1972), Splay Tree (Sleator and Tarjan, 1985), Skip Lists (Pugh, 1989), Treaps (Seidel and Aragon, 1996), etc.

🕑

To facilitate AVL Tree implementation, we need to augment — add more information/attribute to — each BST vertex.


For each vertex v, we define height(v): The number of edges on the path from vertex v down to its deepest leaf. This attribute is saved in each vertex so we can access a vertex's height in O(1) without having to recompute it every time.

🕑

Formally:

v.height = -1 (if v is an empty tree)
v.height = max(v.left.height, v.right.height) + 1 (otherwise)
The height of the BST is thus: root.height.


On the example BST above, height(11) = height(32) = height(50) = height(72) = height(99) = 0 (all are leaves). height(29) = 1 as there is 1 edge connecting it to its only leaf 32.

🕑

Quiz: What are the values of height(20), height(65), and height(41) on the BST above?

height(20) = 2
height(20) = 3
height(41) = 4
height(65) = 2
height(65) = 3
height(41) = 3
🕑

If we have N elements/items/keys in our BST, the lower bound of its height is about log2 N (more precisely, h > (log2 N) - 1), attained if we can somehow insert the N elements in perfect order so that the BST is perfectly balanced.


See the example shown above for N = 15 (a perfect BST which is rarely achievable in real life — try inserting any other integer and it will not be perfect anymore).

🕑
N ≤ 1 + 2 + 4 + ... + 2^h
N ≤ 2^0 + 2^1 + 2^2 + ... + 2^h
N < 2^(h+1) (sum of a geometric progression)
log2 N < log2 2^(h+1) (take log2 of both sides)
log2 N < (h+1) * log2 2 (and log2 2 is 1)
h > (log2 N) - 1 (algebraic manipulation)
so h is at least about log2 N
🕑

If we have N elements/items/keys in our BST, the upper bound of its height is h < N, which happens if we insert the elements in ascending order (to get a skewed right BST as shown above).


The height of such BST is h = N-1, so we have h < N.


Discussion: Do you know how to get skewed left BST instead?

🕑

The content of this interesting slide (the answer to the usually intriguing discussion point from the earlier slide) is hidden and only available to legitimate CS lecturers worldwide. This mechanism is used in the various flipped classrooms in NUS.


If you are really a CS lecturer (or an IT teacher) (outside of NUS) and are interested to know the answers, please drop an email to stevenhalim at gmail dot com (show your University staff profile/relevant proof to Steven) for Steven to manually activate this CS lecturer-only feature for you.


FAQ: This feature will NOT be given to anyone else who is not a CS lecturer.

🕑

We have seen that most BST operations are in O(h) and combining the lower and upper bounds of h, we have log2 N < h < N.


There is a dramatic difference between log2 N and N and we have seen from the discussion of the lower bound that getting perfect BST (at all times) is near impossible...


So can we have BST that has height closer to log2 N, i.e. c * log2 N, for a small constant factor c? If we can, then BST operations that run in O(h) actually run in O(log N)...

🕑

Introducing the AVL Tree, invented by two Soviet (Russian) inventors: Georgy Adelson-Velskii and Evgenii Landis, back in 1962.


In an AVL Tree, we will later see that its height h < 2 * log N (a tighter analysis exists, but we will use the easier analysis in VisuAlgo where c = 2). Therefore, most AVL Tree operations run in O(log N) time — efficient.


Insert(v) and Remove(v) update operations may change the height h of the AVL Tree, but we will see rotation operation(s) to maintain the AVL Tree height to be low.

🕑

To have efficient performance, we shall not recompute the height(v) attribute via the O(N) recursive method every time there is an update (Insert(v)/Remove(v)) operation.


Instead, we perform an O(1) update: x.height = max(x.left.height, x.right.height) + 1 at the end of our Insert(v)/Remove(v) operation, as only the heights of the vertices along the insertion/removal path may be affected. Thus, only O(h) vertices may change their height(v) attribute, and in an AVL Tree, h < 2 * log N.


Try Insert(37) on the example AVL Tree (ignore the resulting rotation for now, we will come back to it in the next few slides). Notice that only a few vertices along the insertion path: {41,20,29,32} increase their height by +1 and all other vertices have their heights unchanged.

🕑

Let's define the following important AVL Tree invariant (property that will never change): A vertex v is said to be height-balanced if |v.left.height - v.right.height| ≤ 1.


A BST is called height-balanced according to the invariant above if every vertex in the BST is height-balanced. Such a BST is called an AVL Tree, like the example shown above.


Take a moment to pause here and try inserting a few new random vertices or deleting a few random existing vertices. Will the resulting BST still be considered height-balanced?

🕑

Adelson-Velskii and Landis claim that an AVL Tree (a height-balanced BST that satisfies AVL Tree invariant) with N vertices has height h < 2 * log2 N.


The proof relies on the concept of minimum-size AVL Tree of a certain height h.


Let N_h be the minimum number of vertices in a height-balanced AVL Tree of height h.


The first few values of N_h are N_0 = 1 (a single root vertex), N_1 = 2 (a root vertex with either one left child or one right child only), N_2 = 4, N_3 = 7, N_4 = 12, N_5 = 20 (see the background picture), and so on (see the next two slides).

🕑

We know that for any other AVL Tree of N vertices (not necessarily the minimum-size one), we have N ≥ N_h.


Proof-2

In the background picture, we have N_5 = 20 vertices but we know that we can squeeze in 43 more vertices (up to N = 63) before we have a perfect binary tree of height h = 5.

🕑
N_h = 1 + N_(h-1) + N_(h-2) (formula for the minimum-size AVL Tree of height h)
N_h > 1 + 2*N_(h-2) (as N_(h-1) > N_(h-2))
N_h > 2*N_(h-2) (obviously)
N_h > 4*N_(h-4) (applying the recursion again)
N_h > 8*N_(h-6) (another recursive step)
... (we can only do this h/2 times, assuming the initial h is even)
N_h > 2^(h/2) * N_0 (we reach the base case)
N_h > 2^(h/2) (as N_0 = 1)
Proof-3
🕑
N ≥ N_h > 2^(h/2) (combining the previous two slides)
N > 2^(h/2)
log2(N) > log2(2^(h/2)) (take log2 of both sides)
log2(N) > h/2 (formula simplification)
2 * log2(N) > h, or h < 2 * log2(N)
h = O(log N) (the final conclusion)
Proof-4
🕑

Look at the example BST again. See that all vertices are height-balanced, an AVL Tree.


To quickly detect if a vertex v is height-balanced or not, we modify the AVL Tree invariant (which uses an absolute value) into a balance factor: bf(v) = v.left.height - v.right.height.


Now try Insert(37) on the example AVL Tree again. A few vertices along the insertion path: {41,20,29,32} increase their height by +1. Vertices {29,20} will no longer be height-balanced after this insertion (and will be rotated later — discussed in the next few slides), i.e., bf(29) = -2 and bf(20) = -2 too. We need to restore the balance.
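
These two bookkeeping steps (height update and balance factor check) are tiny; here is a C++ sketch using the convention that an empty subtree has height -1 (the helper names are ours, not VisuAlgo's):

#include <algorithm>

// Height of a (possibly empty) subtree; an empty subtree has height -1.
int h(BSTVertex* x)  { return (x == nullptr) ? -1 : x->height; }

// Balance factor: positive means left-heavy, negative means right-heavy.
int bf(BSTVertex* x) { return h(x->left) - h(x->right); }

// O(1) height update for one vertex, called along the insertion/removal path.
void updateHeight(BSTVertex* x) { x->height = std::max(h(x->left), h(x->right)) + 1; }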

🕑
Tree Rotation

See the picture above. Calling rotateRight(Q) on the left picture will produce the right picture. Calling rotateLeft(P) on the right picture will produce the left picture again.


rotateRight(T)/rotateLeft(T) can only be called if T has a left/right child, respectively.


Tree Rotation preserves BST property. Before rotation, P ≤ B ≤ Q. After rotation, notice that subtree rooted at B (if it exists) changes parent, but P ≤ B ≤ Q does not change.

🕑
BSTVertex rotateLeft(BSTVertex T) // pre-req: T.right != null
  BSTVertex w = T.right  // rotateRight is the mirror copy of this
  w.parent = T.parent    // this method is hard to get right for newbie
  T.parent = w
  T.right = w.left
  if (w.left != null) w.left.parent = T
  w.left = T
  // update the height of T and then w here
  return w
🕑
Four Cases

Basically, there are only these four imbalance cases: Left-Left, Left-Right, Right-Right, and Right-Left. We use Tree Rotation(s) to deal with each of them.
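
As a sketch of how the four cases can be handled, here is a C++ rendition of the two rotations together with a rebalance helper (it reuses the h()/bf()/updateHeight() helpers sketched a few slides earlier; the actual AVLDemo.cpp may be organised differently):

BSTVertex* rotateLeft(BSTVertex* T) {                      // pre-req: T->right != nullptr
  BSTVertex* w = T->right;
  w->parent = T->parent;
  T->parent = w;
  T->right = w->left;
  if (w->left != nullptr) w->left->parent = T;
  w->left = T;
  updateHeight(T); updateHeight(w);                        // update the height of T and then w
  return w;
}

BSTVertex* rotateRight(BSTVertex* T) {                     // pre-req: T->left != nullptr (mirror copy)
  BSTVertex* w = T->left;
  w->parent = T->parent;
  T->parent = w;
  T->left = w->right;
  if (w->right != nullptr) w->right->parent = T;
  w->right = T;
  updateHeight(T); updateHeight(w);
  return w;
}

// Rebalance one vertex x; returns the (possibly new) root of this subtree.
BSTVertex* rebalance(BSTVertex* x) {
  updateHeight(x);
  if (bf(x) == 2) {                                        // left-heavy
    if (bf(x->left) < 0) x->left = rotateLeft(x->left);    // Left-Right case: rotate the left child first
    x = rotateRight(x);                                    // Left-Left (or reduced Left-Right) case
  } else if (bf(x) == -2) {                                // right-heavy
    if (bf(x->right) > 0) x->right = rotateRight(x->right);// Right-Left case: rotate the right child first
    x = rotateLeft(x);                                     // Right-Right (or reduced Right-Left) case
  }
  return x;
}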

🕑
Insert(v) in an AVL Tree works as follows:

  1. Just insert v as in a normal BST,
  2. Walk up the AVL Tree from the insertion point back to the root and at every step, we update the height and balance factor of the affected vertices:
    1. Stop at the first vertex that is out-of-balance (+2 or -2), if any,
    2. Use one of the four tree rotation cases to rebalance it again, e.g., try Insert(37) on the example above and notice that by calling rotateLeft(29) once, we fix the imbalance issue.

Discussion: Are there other tree rotation cases for the Insert(v) operation of the AVL Tree?

🕑

The content of this interesting slide (the answer to the usually intriguing discussion point from the earlier slide) is hidden and only available to legitimate CS lecturers worldwide. This mechanism is used in the various flipped classrooms in NUS.


If you are really a CS lecturer (or an IT teacher) (outside of NUS) and are interested to know the answers, please drop an email to stevenhalim at gmail dot com (show your University staff profile/relevant proof to Steven) for Steven to manually activate this CS lecturer-only feature for you.


FAQ: This feature will NOT be given to anyone else who is not a CS lecturer.

🕑
Remove(v) in an AVL Tree works as follows:

  1. Just remove v as in a normal BST (one of the three removal cases),
  2. Walk up the AVL Tree from the deletion point back to the root and at every step, we update the height and balance factor of the affected vertices:
    1. Now, for every vertex that is out-of-balance (+2 or -2), we use one of the four tree rotation cases to rebalance it (there can be more than one such vertex) again.

The main difference compared to Insert(v) in an AVL Tree is that we may trigger one of the four possible rebalancing cases several times, but not more than h = O(log N) times :O. Try Remove(7) on the example above to see two chain reactions: rotateRight(6) and then the rotateRight(16)+rotateLeft(8) combo.
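
This 'walk back up and rebalance' step is the same for both update operations; here is a C++ sketch using the rebalance() helper from the Four Cases sketch above (an illustrative helper, not the exact AVLDemo.cpp code):

// Walk from vertex x (the parent of the insertion/deletion point) back up to the root,
// updating heights and fixing every out-of-balance vertex along the way: O(log N) steps.
void rebalanceUpwards(BSTVertex*& root, BSTVertex* x) {
  while (x != nullptr) {
    BSTVertex* p = x->parent;
    BSTVertex* newSub = rebalance(x);            // performs 0, 1, or 2 rotations at x
    if (p == nullptr)        root = newSub;      // x was the root of the whole tree
    else if (p->left == x)   p->left = newSub;   // re-attach the (possibly new) subtree root
    else                     p->right = newSub;
    x = p;                                       // continue towards the root
  }
}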

🕑

We have now seen how the AVL Tree defines the height-balance invariant and maintains it for all vertices during Insert(v) and Remove(v) update operations, and we have seen a proof that an AVL Tree has h < 2 * log N.


Therefore, all BST operations (both update and query operations, except Inorder Traversal) that we have learned so far, which have time complexity of O(h), now have time complexity of O(log N) if we use the AVL Tree version of the BST.


This marks the end of this e-Lecture, but please switch to 'Exploration Mode' and try making various calls to Insert(v) and Remove(v) in AVL Tree mode to strengthen your understanding of this data structure.


PS: If you want to study how these seemingly complex AVL Tree (rotation) operations are implemented in a real program, you can download this AVLDemo.cpp (must be used together with this BSTDemo.cpp).

🕑

We will end this module with a few more interesting things about BST and balanced BST (especially AVL Tree).

🕑

The content of this interesting slide (the answer to the usually intriguing discussion point from the earlier slide) is hidden and only available to legitimate CS lecturers worldwide. This mechanism is used in the various flipped classrooms in NUS.


If you are really a CS lecturer (or an IT teacher) (outside of NUS) and are interested to know the answers, please drop an email to stevenhalim at gmail dot com (show your University staff profile/relevant proof to Steven) for Steven to manually activate this CS lecturer-only feature for you.


FAQ: This feature will NOT be given to anyone else who is not a CS lecturer.

🕑

The content of this interesting slide (the answer to the usually intriguing discussion point from the earlier slide) is hidden and only available to legitimate CS lecturers worldwide. This mechanism is used in the various flipped classrooms in NUS.


If you are really a CS lecturer (or an IT teacher) (outside of NUS) and are interested to know the answers, please drop an email to stevenhalim at gmail dot com (show your University staff profile/relevant proof to Steven) for Steven to manually activate this CS lecturer-only feature for you.


FAQ: This feature will NOT be given to anyone else who is not a CS lecturer.

🕑

For a few more interesting questions about this data structure, please practice on the BST/AVL training module (no login is required).


However, for registered users, you should login and then go to the Main Training Page to officially clear this module and such achievement will be recorded in your user account.

🕑

We also have a few programming problems that somewhat require the usage of this balanced BST (like AVL Tree) data structure: Kattis - compoundwords and Kattis - baconeggsandspam.


Try them to consolidate and improve your understanding of this data structure. You are allowed to use C++ STL map/set, Java TreeMap/TreeSet, or OCaml Map/Set if that simplifies your implementation (note that Python doesn't have a built-in balanced BST implementation).

🕑

The content of this interesting slide (the answer to the usually intriguing discussion point from the earlier slide) is hidden and only available to legitimate CS lecturers worldwide. This mechanism is used in the various flipped classrooms in NUS.


If you are really a CS lecturer (or an IT teacher) (outside of NUS) and are interested to know the answers, please drop an email to stevenhalim at gmail dot com (show your University staff profile/relevant proof to Steven) for Steven to manually activate this CS lecturer-only feature for you.


FAQ: This feature will NOT be given to anyone else who is not a CS lecturer.


You have reached the last slide. Return to 'Exploration Mode' to start exploring!

Note that if you notice any bug in this visualization or if you want to request for a new visualization feature, do not hesitate to drop an email to the project leader: Dr Steven Halim via his email address: stevenhalim at gmail dot com.

