Data Structures and Algorithms: AVL Trees

Last updated by Utkarsh Sahu on Sep 25, 2024 at 10:58 PM | Reading time: 16 minutes


Basic operations on a tree include search, insertion, and deletion. AVL trees can perform all these operations in just O(logN) time complexity, where “N” is the number of nodes in the tree. Therefore, every Software Engineer must be familiar with AVL trees. 

Software engineering interview problems involving operations on trees can be solved using AVL trees. This article covers the topic of AVL Trees in detail to help you with your tech interview prep.

Prerequisite: You should be familiar with binary search trees before learning about AVL trees. You can read about binary search trees here: BinarySearchTree.

List of topics covered in this article:

  • What Is an AVL Tree?
  • What Is the Balance Factor in an AVL Tree?
  • Why and When to use AVL Trees
  • What Is AVL Tree Rotation?
  • AVL Tree Insertion Process
  • Search for a Node in an AVL Tree
  • AVL Tree Implementation and Code
  • Complexity Analysis of AVL Tree
  • Comparison with Red-Black Tree
  • Binary Search Tree Vs. AVL Tree
  • Strengths and Weaknesses of AVL Tree
  • Solve Problems Related to AVL Tree
  • Applications of AVL Tree

What Is an AVL Tree?

An AVL tree is a self-balancing binary search tree named after its inventors, Adelson-Velskii and Landis. For each node in an AVL tree, the difference between the heights of its left and right subtrees is either 1, 0, or -1. This difference between the heights of the left and right subtrees is called the balance factor of the node.

AVL tree operations are similar to those of a BST; the key difference is that an AVL tree restores the height balance of its subtrees after modifications. The height balance is restored using rotations that handle four cases, namely: Left-Left (LL), Right-Right (RR), Left-Right (LR), and Right-Left (RL).

What Is the Balance Factor in an AVL Tree?

In a Binary Tree, the balance factor of a node is defined as the difference between the heights of its left subtree and right subtree. For a Binary Tree, let “Node” represent any node in the tree. Then,

BalanceFactor(Node) = Height(LeftSubtree(Node)) - Height(RightSubtree(Node))     ...(i)

We can also represent the BalanceFactor of a node as: 

BalanceFactor(Node) = Height(RightSubtree(Node)) - Height(LeftSubtree(Node))     ...(ii)

For the sake of convenience, we will use the first equation (i) throughout the rest of this article.

In an AVL tree, the balance factor of every node can take only three possible values: {-1, 0, 1}. That is,

BalanceFactor(Node) ∈ {-1, 0, 1}, where Node is any node in the AVL tree.

After inserting a new node or deleting an existing node from an AVL tree, the balance factors of the nodes along the affected path are recalculated. If the balance factor of any node becomes greater than 1 or less than -1, the tree is rebalanced by performing rotations; this is discussed further in later sections.
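For example, if a node's left subtree has height 3 and its right subtree has height 1, its balance factor is 3 - 1 = 2. Since 2 is not in {-1, 0, 1}, that node is unbalanced and a rotation is needed.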

Why and When to Use an AVL Tree? 

Let’s look at an example first for a better understanding. 

Suppose we have a database that contains the list of all existing metro stations in the country. 

  • We can say that the number of new metro stations added to our database will be very low. At most, one or two new metro stations are built every year by the government. Hence, the number of insertions into our database is very low.
  • But the number of search operations is relatively high, as many people search for a metro station to book a ticket.

Here, we can represent our database as an AVL tree, because the number of search operations is very high compared to the number of insertions.

Theoretically, an AVL tree is a self-balancing BST, so insertion and lookup take O(logN) time in both the average and worst case, where “N” is the number of nodes in the tree before the operation.

But the problem with AVL trees is that when we insert a new node, we may need to perform rotations to rebalance the tree, which adds cost. So for use cases with more insertions than searches, an AVL tree is not the best fit; it is efficient when search operations far outnumber insertions.

What Is AVL Tree Rotation?

AVL tree rotations are required to balance the tree at all levels. There are four types of rotations.  

  • Single Left Rotation (fixes the RR imbalance case)
  • Single Right Rotation (fixes the LL imbalance case)
  • Left-Right Rotation (fixes the LR imbalance case)
  • Right-Left Rotation (fixes the RL imbalance case)

A violation occurs when the balance factor of any node in the tree does not belong to the set {-1, 0, 1}.

Note: In the LL, RR, LR, and RL rotation cases below, A, B, C, and D denote balanced subtrees, while X, Y, and Z denote single nodes.

Single Left Rotation (RR case)

Order of nodes before rotation:

A < X < B < Y < C 

Order of nodes after rotation:

A < X < B < Y < C

Note that after the left rotation, the in-order ordering of the nodes remains the same. Here, a left rotation is required because X has become right-heavy (unbalanced).

Below is the pseudocode to perform a left rotation on a node in the RR case:

function rotateLeft(current)
    newRoot = current → right
    current → right = newRoot → left
    newRoot → left = current
    updateHeight(current)
    updateHeight(newRoot)
    return newRoot
end function
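For example, suppose we insert 10, then 20, then 30. Node 30 becomes the right child of 20, which is the right child of 10, so the balance factor of node 10 becomes -2 (the RR case). A left rotation around 10 makes 20 the new root of this subtree, with 10 as its left child and 30 as its right child.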

Single Right Rotation (LL case)

Order of nodes before rotation:

A < X < B < Y < C 

Order of nodes after rotation:

A < X < B < Y < C

Note that after the right rotation, the in-order ordering of the nodes remains the same. Here, a right rotation is required because the subtree A has become too tall after an insertion (the LL case).

Below is the pseudocode to perform a right rotation on a node in the LL case:

function rotateRight(current)
    newRoot = current → left
    current → left = newRoot → right
    newRoot → right = current
    updateHeight(current)
    updateHeight(newRoot)
    return newRoot
end function

Left Right Rotation (LR Rotation)

Order of nodes before rotation:

A < X < B < Y < C < Z < D

Order of nodes after rotation:

A < X < B < Y < C < Z < D

Note that after the left-right rotation, the in-order ordering of the nodes remains the same. Here, a left-right rotation is required because the subtree rooted at Y has become too tall.

Below is the pseudocode to perform a left-right rotation on a node in the LR case:

function leftRightRotate(current)
    current → left = rotateLeft(current → left)
    return rotateRight(current)
end function
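For example, suppose we insert 10, then 5, then 7. Node 7 becomes the right child of 5, which is the left child of 10, so the balance factor of node 10 becomes 2 while the new key 7 is greater than 5 (the LR case). A left rotation around 5 followed by a right rotation around 10 makes 7 the new root of this subtree, with 5 as its left child and 10 as its right child.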

Right Left Rotation (RL Rotation)

Order of nodes before rotation:

A < X < B < Z < C < Y < D 

Order of nodes after rotation: 

A < X < B < Z < C < Y < D

Note that after the right-left rotation, the in-order ordering of the nodes remains the same. Here, a right-left rotation is required because the subtree rooted at Z has become too tall.

Below is the pseudocode to perform a right-left rotation on a node in the RL case:

function rightLeftRotate(current)
    current → right = rotateRight(current → right)
    return rotateLeft(current)
end function

AVL Tree Insertion Process

To insert a new node in an AVL tree, we first find its proper position according to the binary search tree rules.

  • We start from the root and keep comparing its key with the new node’s key.
  • If the new node’s key is greater, we go to the right subtree; if not, we go to the left subtree.
  • Once we find the parent node, we add the new node as its left or right child, depending on its key.

By now, we have inserted our node, and our tree is a BST. But is it an AVL tree? Not necessarily.

We know that an AVL tree is height-balanced. So we now need to recompute the balance factors of the affected nodes, and if any node is unbalanced, we need to perform the appropriate rotations to keep the tree height-balanced.

Note: In this article, we have assumed that every node in the AVL tree will have a unique key. No two nodes can have the same key. 

Let us assume we already have “N” nodes in the tree before insertion, and the tree is height-balanced. In that case, we traverse at most the tree’s height to insert the new node, which is O(logN). Then, we retrace the path upward to restore balance where needed, which also takes at most O(logN). Thus, the worst-case complexity of inserting a new node is O(logN).

Steps to Follow for Insertion 

1. Recursively find the position of the new node in the AVL tree as per BST rules and insert it.

2. Now, while backtracking along the path from the newly inserted node up toward the root, update the height of each node on the path.

3. Since the height is updated, we can now compute the balance factor. 

4. If the balance factor is greater than 1 or less than -1, we perform rotations (LL, RR, LR, or RL based on the key inserted) and balance the tree. 

  • If the Balance Factor is greater than 1 for a node “nodeA”:
    a. If key < nodeA.left.key, the new node was inserted into the left subtree of the left child of nodeA. This is the LL case, and we perform a right rotation.
    b. If key > nodeA.left.key, the new node was inserted into the right subtree of the left child of nodeA. This is the LR case, and we perform a left rotation followed by a right rotation.
  • If the Balance Factor is less than -1 for a node “nodeA”:
    a. If key > nodeA.right.key, the new node was inserted into the right subtree of the right child of nodeA. This is the RR case, and we perform a left rotation.
    b. If key < nodeA.right.key, the new node was inserted into the left subtree of the right child of nodeA. This is the RL case, and we perform a right rotation followed by a left rotation.

Let’s understand with an example:

We need to insert the elements below in an AVL tree.

Elements: [40, 20, 10, 25, 30, 22]

  • Insert 40, 20, and then 10 into the AVL tree.
  • The Balance Factor of node 40 becomes 2, and so we perform rotation to balance the tree.
  • Next, we insert 25 and 30. Now the balance factor of node 40 becomes 2, so we perform rotation to balance the tree.
  • Finally, we insert 22, and the balance factor of node 20 becomes -2, so we perform rotations and balance the tree.
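If you want to verify this sequence against the AVLTree implementation given later in this article, one small sketch (only the driver code below is new; it reuses that class) is to change the solve() method to insert these keys and print the result:

void solve() {
    AVLTree avlTree = new AVLTree();
    int[] elements = {40, 20, 10, 25, 30, 22};
    for (int key : elements) {
        avlTree.insert(key);   // each insert rebalances the tree if needed
    }
    avlTree.printAVLTree();    // prints: InorderTraversal : 10 20 22 25 30 40
}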


Search for a Node in an AVL Tree

Searching for a node in an AVL tree is similar to searching in a BST. We start from the root node and keep comparing searchKey with the key of the current node. If searchKey equals the node’s key, we return that node; if searchKey is greater, we search in the right subtree; otherwise, we search in the left subtree.

The time complexity of the search function depends on the height of the tree, which is O(logN), N being the number of nodes. Hence, the worst-case complexity of search is O(logN).

Code:

public Node search(int key) {
    Node temp = root;
    while (temp != null) {
        if (temp.key == key)
            return temp;
        else if (temp.key < key)
            temp = temp.right;
        else
            temp = temp.left;
    }
    // Key not found, so return null.
    return null;
}

AVL Tree Implementation and Code 

The code below inserts a node in the AVL Tree, balances the AVL tree if anything becomes unbalanced, and finally prints the inorder traversal of the tree.

import java.util.Objects;

class AVLTreeDataStructure {

    class AVLTree {

        private Node root;

        public class Node {
            int key;
            int height;
            Node left, right;

            public Node(int key) {
                this.key = key;
                this.height = 1;   // a newly inserted node is a leaf of height 1
                this.left = null;
                this.right = null;
            }
        }

        public void insert(int key) {
            root = insert(root, key);
        }

        public void printAVLTree() {
            System.out.print("InorderTraversal : ");
            inOrderTraversal(root);
            System.out.println();
        }

        /**
         * Inserts a new node with the given key into the AVL tree.
         *
         * @param node root of the subtree in which the new node is inserted.
         * @param key  key of the new node to be inserted.
         * @return root of the subtree after insertion and rebalancing.
         */
        private Node insert(Node node, int key) {
            if (node == null) {
                return new Node(key);
            }
            if (node.key < key) {
                node.right = insert(node.right, key);
            } else if (node.key > key) {
                node.left = insert(node.left, key);
            } else {
                throw new IllegalStateException("Same key already exists!");
            }
            // Recompute this node's height, then rebalance it if needed.
            updateNodeHeight(node);
            return selfBalanceBasedOnBalanceFactor(node, key);
        }

        private Node selfBalanceBasedOnBalanceFactor(Node node, int key) {
            int balanceFactor = balanceFactor(node);
            if (balanceFactor > 1) {
                if (key < node.left.key) {            // LL case
                    return rotateRight(node);
                } else if (key > node.left.key) {     // LR case
                    node.left = rotateLeft(node.left);
                    return rotateRight(node);
                }
            } else if (balanceFactor < -1) {
                if (key > node.right.key) {           // RR case
                    return rotateLeft(node);
                } else if (key < node.right.key) {    // RL case
                    node.right = rotateRight(node.right);
                    return rotateLeft(node);
                }
            }
            return node;
        }

        private Node rotateLeft(Node parent) {
            Node rightOfParent = parent.right;
            Node leftOfRightOfParent = parent.right.left;
            rightOfParent.left = parent;
            parent.right = leftOfRightOfParent;
            updateNodeHeight(parent);
            updateNodeHeight(rightOfParent);
            return rightOfParent;
        }

        private Node rotateRight(Node parent) {
            Node leftOfParent = parent.left;
            Node rightOfLeftOfParent = parent.left.right;
            leftOfParent.right = parent;
            parent.left = rightOfLeftOfParent;
            updateNodeHeight(parent);
            updateNodeHeight(leftOfParent);
            return leftOfParent;
        }

        private int height(Node node) {
            return (node == null) ? 0 : node.height;
        }

        // Recomputes the height of a node from its children and stores it on the node.
        private int updateNodeHeight(Node node) {
            node.height = Math.max(height(node.left), height(node.right)) + 1;
            return node.height;
        }

        private int balanceFactor(Node node) {
            if (node == null)
                return 0;
            return height(node.left) - height(node.right);
        }

        public Node search(int key) {
            Node temp = root;
            while (temp != null) {
                if (temp.key == key)
                    return temp;
                else if (temp.key < key)
                    temp = temp.right;
                else
                    temp = temp.left;
            }
            // Key not found, so return null.
            return null;
        }

        private void inOrderTraversal(Node node) {
            if (node == null) {
                return;
            }
            inOrderTraversal(node.left);
            System.out.print(node.key + " ");
            inOrderTraversal(node.right);
        }
    }

    public static void main(String[] args) {
        new AVLTreeDataStructure().solve();
    }

    void solve() {
        AVLTree avlTree = new AVLTree();
        avlTree.insert(3);
        avlTree.insert(2);
        avlTree.insert(1);
        avlTree.printAVLTree();
        System.out.println(Objects.nonNull(avlTree.search(2)) ? "Found" : "Not Found");
        System.out.println(Objects.nonNull(avlTree.search(4)) ? "Found" : "Not Found");
    }
}

Output

InorderTraversal : 1 2 3 

Found

Not Found

Complexity Analysis of AVL Tree

Let us now look at the time and space complexities associated with AVL Trees.

Time Complexity 

Consider N as the number of nodes in the tree. In both the worst and average cases, basic operations such as insertion, deletion, and search take O(logN). This is where the AVL tree outperforms an ordinary binary search tree.

The complexities of insertion, deletion, and search all depend on the tree’s height. Because the AVL tree is self-balancing, its height is guaranteed to be O(logN) even in the worst case. Each rebalancing rotation takes O(1) time, so the overall worst-case complexity of every operation remains O(logN).
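To see why the height stays logarithmic, let N(h) be the minimum number of nodes an AVL tree of height h can have. The sparsest AVL tree of height h has one subtree of height h-1 and one of height h-2, so

N(h) = N(h-1) + N(h-2) + 1, with N(1) = 1 and N(2) = 2.

This recurrence grows exponentially, like the Fibonacci numbers, so inverting it gives a worst-case height of roughly 1.44 * log2(N), which is O(logN).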


Space Complexity of AVL Tree


The space complexity of the AVL tree is O(N) in both the worst and average case, since each of the N nodes stores only a constant amount of information (its key, its height, and two child pointers).

Comparison With Red-Black Tree

A red-black tree and an AVL tree are both self-balancing binary search trees, and both provide O(logN) insertion and search operations. An AVL tree is more rigidly (strictly) balanced than a red-black tree.

However, an AVL tree also incurs higher rotation costs than a red-black tree. So when the workload is insertion-heavy, a red-black tree is preferred; when lookups far outnumber insertions, an AVL tree is preferred.

Binary Search Tree vs AVL Tree

A binary search tree is not necessarily balanced. In the case of a skewed tree (left or right), the tree’s height is O(N), so insert and search operations take O(N) in the worst case.

An AVL tree always takes O(logN) for these operations because it stays balanced, and its height is guaranteed to be O(logN).
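For example, inserting the keys 1, 2, 3, ..., N in sorted order into a plain BST produces a right-skewed chain of height N, whereas an AVL tree rebalances after each of these insertions and keeps its height at O(logN).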

Strengths and Weaknesses of AVL Tree

Strengths:

  • The AVL tree’s main advantage is its ability to self-balance. This guarantees O(logN) performance in the worst case for all operations, including insertion, deletion, and search.
  • Finding the minimum and maximum elements, and any other operation that can be done on a BST, can also be done with an AVL tree.

Weaknesses:

  • The AVL tree involves rotations and balance-factor bookkeeping, which makes it a more complicated data structure to implement.
  • A workload with many insertions leads to many rotations, which can be reduced by using other efficient data structures such as the red-black tree.

Solve Problems Related to AVL Tree 

1. Maintain a dynamic data structure that supports the following operations:

  • Insert(S, y): if y is not in S, insert y into S
  • Delete(S, y): if y is in S, delete y from S
  • Also, you are given k queries; for each query, return the k-th smallest element in S.

2. Implement a data structure that supports insert(X), delete(X), search(X), findMin(), and findMax() in logarithmic time complexity. Here:

  • insert(X): insert element X into the structure.
  • delete(X): delete element X from the structure.
  • search(X): search for element X in the structure and return true if present, else false.
  • findMin(): return the minimum element currently present in the structure.
  • findMax(): return the maximum element currently present in the structure.
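As a hint for findMin() and findMax(): in any BST, and therefore in any AVL tree, the minimum key sits in the leftmost node and the maximum key sits in the rightmost node, so both can be found by a single walk down the tree in O(logN) time. Below is a minimal sketch of how such methods could be added to the AVLTree class from the implementation section above (findMin/findMax come from the problem statement; they are not part of the original code):

public Integer findMin() {
    if (root == null) return null;   // empty tree has no minimum
    Node temp = root;
    while (temp.left != null)
        temp = temp.left;            // keep going left
    return temp.key;
}

public Integer findMax() {
    if (root == null) return null;   // empty tree has no maximum
    Node temp = root;
    while (temp.right != null)
        temp = temp.right;           // keep going right
    return temp.key;
}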

Applications of AVL Tree

  • AVL trees are beneficial in database applications where insertions and deletions are far less frequent than search operations.

Example: Railway databases can take advantage of AVL Trees as new trains are seldom added, whereas many people often search for the available list of trains.

  • AVL Trees are used for all sorts of in-memory collections such as sets and dictionaries.

Are You Ready to Nail Your Next Coding Interview?

Whether you’re a Coding Engineer gunning for Software Developer or Software Engineer roles, or you’re targeting management positions at top companies, IK offers courses specifically designed for your needs to help you with your technical interview preparation!

If you’re looking for guidance and help with getting started, sign up for our free webinar. As pioneers in the field of technical interview prep, we have trained thousands of Software Engineers to crack the most challenging coding interviews and land jobs at their dream companies, such as Google, Facebook, Apple, Netflix, Amazon, and more!

Sign up now!

————

Article contributed by Problem Setters Official


Last updated on: November 20, 2024

Author: Utkarsh Sahu, Director, Category Management @ Interview Kickstart || IIM Bangalore || NITW.
