Table of Contents
- Introduction to Trees
- Tree Terminology
- Tree Properties and Math
- Types of Trees
- Binary Tree
- Binary Search Tree (BST)
- AVL Tree (Self-Balancing)
- Heap (Priority Queue)
- Trie (Prefix Tree)
- Segment Tree
- N-ary Tree
- Tree Traversals
- Common Tree Problems
- Time and Space Complexity
- Best Practices
Introduction to Trees
A Tree is a non-linear, hierarchical data structure consisting of nodes connected by edges. In graph terms, it is a connected, acyclic graph: there is exactly one path between any two nodes.
Real-World Analogies
- File System: Folders (directories) containing sub-folders and files
- HTML DOM: Tags nested inside other tags
- Organization Chart: CEO → Managers → Employees
- Decision Tree: Branching yes/no decisions
- Family Tree: Parent → Children → Grandchildren
Why Trees?
| Problem | Tree Solution |
|---|---|
| Hierarchical data storage | General tree / N-ary |
| Fast ordered search | BST → O(log n) |
| Auto-complete / prefix lookup | Trie |
| Priority-based processing | Heap |
| Range queries on arrays | Segment Tree |
| Self-balancing sorted structure | AVL / Red-Black Tree |
Tree Terminology
Visual Reference
        A          ← Root (Level 0)
       / \
      B   C        ← Level 1
     / \   \
    D   E   F      ← Level 2 (Leaves: E, F)
   /
  G                ← Level 3 (Leaf)
Key Terms
| Term | Definition | Example from above |
|---|---|---|
| Root | Top-most node; no parent | A |
| Leaf | Node with no children | E, F, G |
| Internal Node | Node with at least one child | A, B, C, D |
| Parent | Node directly above | B is parent of D and E |
| Child | Direct descendant of a node | D and E are children of B |
| Sibling | Nodes sharing the same parent | B and C are siblings |
| Ancestor | All nodes on path from root to a node | A, B are ancestors of D |
| Descendant | All nodes in a node’s subtree | D, E, G are descendants of B |
| Edge | Link between parent and child | A→B, B→D |
| Path | Sequence of nodes connected by edges | A → B → D → G |
| Height | Longest path from a node down to a leaf | Height of tree = 3 |
| Depth | Distance (edges) from root to the node | Depth of G = 3 |
| Level | Depth of node (root at level 0) | G is at level 3 |
| Degree | Number of children a node has | Degree of B = 2 |
| Subtree | A node and all its descendants | B with D, E, G |
| Forest | Collection of disjoint trees | Remove A → two trees |
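The terms above can be checked directly on the reference tree. A minimal sketch (the `Node` class and helpers here are illustrative only, separate from the implementations later in the article):

```python
class Node:
    def __init__(self, val, children=()):
        self.val = val
        self.children = list(children)

# Reference tree from the diagram: A(B(D(G), E), C(F))
A = Node("A", [Node("B", [Node("D", [Node("G")]), Node("E")]),
               Node("C", [Node("F")])])

def height(n):
    """Longest downward path from n, counted in edges."""
    return 0 if not n.children else 1 + max(height(c) for c in n.children)

def leaves(n):
    """Collect leaf values left to right."""
    return [n.val] if not n.children else [v for c in n.children for v in leaves(c)]

print(height(A))  # 3, matching "Height of tree = 3"
print(leaves(A))  # ['G', 'E', 'F']
```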
Tree Properties and Math
def tree_properties():
"""Mathematical properties of binary trees"""
return {
"Max nodes at level L" : "2^L",
"Max nodes in tree of height H" : "2^(H+1) - 1",
"Min height for N nodes" : "⌊log₂(N)⌋",
"Min nodes for height H" : "H + 1 (skewed)",
"Max nodes for height H" : "2^(H+1) - 1 (perfect)",
"Edges in a tree with N nodes" : "N - 1",
"Height of empty tree" : "-1",
"Height of single-node tree" : "0",
"Leaf nodes in full binary tree" : "(internal nodes) + 1",
    }
Types of Trees
| Type | Key Property | Use Case |
|---|---|---|
| Binary Tree | Each node has at most 2 children | Base for many tree structures |
| BST | Left < Root < Right | Dictionary, sorted data |
| AVL Tree | Balanced BST, height diff ≤ 1 | Frequent lookups |
| Red-Black Tree | Balanced BST, color-coded nodes | STL map/set, Java TreeMap |
| Heap | Parent ≥ children (max) or ≤ (min) | Priority queue, heap sort |
| Trie | Stores characters per edge/node | Auto-complete, spell-check |
| Segment Tree | Stores range aggregates | Range sum/min/max queries |
| B-Tree | Multi-key nodes, balanced | Databases, file systems |
| N-ary Tree | Each node has up to N children | File systems, JSON/XML |
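Before moving on, the formulas from "Tree Properties and Math" can be sanity-checked numerically, assuming a perfect binary tree of height H:

```python
# Numeric check of the property formulas for perfect binary trees of height H
for H in range(5):
    max_nodes = 2 ** (H + 1) - 1                      # "Max nodes for height H"
    level_sum = sum(2 ** L for L in range(H + 1))     # sum "Max nodes at level L"
    assert level_sum == max_nodes                     # levels add up to the closed form
    leaves = 2 ** H                                   # leaves of a perfect tree
    internal = max_nodes - leaves
    assert leaves == internal + 1                     # full-tree leaf/internal relation
print("formulas verified for H = 0..4")
```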
Binary Tree
1. Node and Tree Structure
class TreeNode:
def __init__(self, val=0, left=None, right=None):
self.val = val
self.left = left
self.right = right
def __repr__(self):
return f"TreeNode({self.val})"
class BinaryTree:
def __init__(self):
self.root = None
def insert_level_order(self, values):
"""Build tree from list using level-order (BFS) insertion."""
if not values:
return
from collections import deque
self.root = TreeNode(values[0])
queue = deque([self.root])
i = 1
while queue and i < len(values):
node = queue.popleft()
if i < len(values) and values[i] is not None:
node.left = TreeNode(values[i])
queue.append(node.left)
i += 1
if i < len(values) and values[i] is not None:
node.right = TreeNode(values[i])
queue.append(node.right)
i += 1
    def height(self):
        """Height of tree (longest path from root to leaf, in edges)."""
        def _h(node):
            if node is None:
                return -1
            return 1 + max(_h(node.left), _h(node.right))
        return _h(self.root)
    def size(self):
        """Total number of nodes."""
        def _s(node):
            if node is None:
                return 0
            return 1 + _s(node.left) + _s(node.right)
        return _s(self.root)
def is_empty(self):
        return self.root is None
2. Types of Binary Trees
# --- Full Binary Tree ---
# Every node has 0 or 2 children
# 1
# / \
# 2 3
# / \
# 4 5
# --- Complete Binary Tree ---
# All levels full except last, filled left to right
# 1
# / \
# 2 3
# / \ /
# 4 5 6
# --- Perfect Binary Tree ---
# All internal nodes have 2 children; all leaves at same level
# 1
# / \
# 2 3
# / \ / \
# 4 5 6 7
# --- Degenerate (Skewed) Tree ---
# Each node has only one child (like a linked list)
# 1 → 2 → 3 → 4 (right-skewed)
3. Checking Tree Types
def is_full_binary_tree(root):
"""Every node has 0 or 2 children."""
if root is None:
return True
if root.left is None and root.right is None:
return True # leaf
if root.left and root.right:
return (is_full_binary_tree(root.left) and
is_full_binary_tree(root.right))
return False # one child only
def is_perfect_binary_tree(root):
"""All leaves at same depth, all internal nodes have 2 children."""
def depth(node):
d = 0
while node.left:
d += 1
node = node.left
return d
def check(node, level, d):
if node is None:
return True
if node.left is None and node.right is None:
return d == level
if node.left is None or node.right is None:
return False
return (check(node.left, level + 1, d) and
check(node.right, level + 1, d))
if root is None:
return True
d = depth(root)
return check(root, 0, d)
def is_complete_binary_tree(root):
"""All levels full except last, filled from left."""
from collections import deque
if root is None:
return True
queue = deque([root])
found_null = False
while queue:
node = queue.popleft()
if node.left:
if found_null:
return False
queue.append(node.left)
else:
found_null = True
if node.right:
if found_null:
return False
queue.append(node.right)
else:
found_null = True
return True
def is_balanced(root):
"""Height difference of left/right subtrees <= 1 at every node."""
def check(node):
if node is None:
return 0 # height = 0 for None
left_h = check(node.left)
if left_h == -1:
return -1
right_h = check(node.right)
if right_h == -1:
return -1
if abs(left_h - right_h) > 1:
return -1
return 1 + max(left_h, right_h)
    return check(root) != -1
Binary Search Tree (BST)
A BST is a binary tree where:
- All values in left subtree < node value
- All values in right subtree > node value
- Both subtrees are also BSTs
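A common pitfall with this definition: the ordering must hold for entire subtrees, not just immediate children. A minimal standalone sketch of the bounds-check idea (the BST class below applies the same technique in `is_valid_bst`; the `N` class here is illustrative):

```python
class N:
    def __init__(self, v, l=None, r=None):
        self.val, self.left, self.right = v, l, r

def is_valid_bst(node, lo=float("-inf"), hi=float("inf")):
    """Every node must fall inside the (lo, hi) window inherited from its ancestors."""
    if node is None:
        return True
    if not (lo < node.val < hi):
        return False
    return (is_valid_bst(node.left, lo, node.val) and
            is_valid_bst(node.right, node.val, hi))

good = N(5, N(3), N(8))
bad = N(5, N(3), N(8, N(4)))   # 4 < 5, yet it sits inside 5's right subtree
print(is_valid_bst(good))  # True
print(is_valid_bst(bad))   # False
```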
Visual
        8
       / \
      3   10
     / \    \
    1   6    14
       / \   /
      4   7 13
Implementation
class BST:
def __init__(self):
self.root = None
# ── Insert ──────────────────────────────────────────────────────
def insert(self, val):
self.root = self._insert(self.root, val)
def _insert(self, node, val):
if node is None:
return TreeNode(val)
if val < node.val:
node.left = self._insert(node.left, val)
elif val > node.val:
node.right = self._insert(node.right, val)
# duplicates ignored
return node
# ── Search ──────────────────────────────────────────────────────
def search(self, val):
return self._search(self.root, val)
def _search(self, node, val):
if node is None or node.val == val:
return node
if val < node.val:
return self._search(node.left, val)
return self._search(node.right, val)
# ── Delete ──────────────────────────────────────────────────────
def delete(self, val):
self.root = self._delete(self.root, val)
def _delete(self, node, val):
if node is None:
return node
if val < node.val:
node.left = self._delete(node.left, val)
elif val > node.val:
node.right = self._delete(node.right, val)
else:
# Case 1: no child
if node.left is None and node.right is None:
return None
# Case 2: one child
if node.left is None:
return node.right
if node.right is None:
return node.left
# Case 3: two children → replace with inorder successor
successor = self._min_node(node.right)
node.val = successor.val
node.right = self._delete(node.right, successor.val)
return node
def _min_node(self, node):
while node.left:
node = node.left
return node
# ── Min / Max ───────────────────────────────────────────────────
def find_min(self):
if not self.root:
return None
return self._min_node(self.root).val
def find_max(self):
node = self.root
while node.right:
node = node.right
return node.val
# ── Inorder (sorted output) ─────────────────────────────────────
def inorder(self):
result = []
def _inorder(node):
if node:
_inorder(node.left)
result.append(node.val)
_inorder(node.right)
_inorder(self.root)
return result
# ── Validate BST ────────────────────────────────────────────────
def is_valid_bst(self):
def validate(node, min_val=float('-inf'), max_val=float('inf')):
if node is None:
return True
if not (min_val < node.val < max_val):
return False
return (validate(node.left, min_val, node.val) and
validate(node.right, node.val, max_val))
return validate(self.root)
# ── Floor and Ceiling ───────────────────────────────────────────
def floor(self, val):
"""Largest value <= val in BST."""
result = None
node = self.root
while node:
if node.val == val:
return val
elif node.val < val:
result = node.val
node = node.right
else:
node = node.left
return result
def ceiling(self, val):
"""Smallest value >= val in BST."""
result = None
node = self.root
while node:
if node.val == val:
return val
elif node.val > val:
result = node.val
node = node.left
else:
node = node.right
return result
# ── K-th Smallest ───────────────────────────────────────────────
def kth_smallest(self, k):
"""Return k-th smallest element (1-indexed)."""
stack, node = [], self.root
count = 0
while stack or node:
while node:
stack.append(node)
node = node.left
node = stack.pop()
count += 1
if count == k:
return node.val
node = node.right
return None
# ── Usage ────────────────────────────────────────────────────────────
bst = BST()
for v in [8, 3, 10, 1, 6, 14, 4, 7, 13]:
bst.insert(v)
print("Inorder (sorted):", bst.inorder()) # [1, 3, 4, 6, 7, 8, 10, 13, 14]
print("Min:", bst.find_min()) # 1
print("Max:", bst.find_max()) # 14
print("Floor(5):", bst.floor(5)) # 4
print("Ceiling(5):", bst.ceiling(5)) # 6
print("3rd smallest:", bst.kth_smallest(3)) # 4
print("Is valid BST:", bst.is_valid_bst()) # True
BST Time Complexity
| Operation | Average | Worst (skewed) |
|---|---|---|
| Search | O(log n) | O(n) |
| Insert | O(log n) | O(n) |
| Delete | O(log n) | O(n) |
| Min/Max | O(log n) | O(n) |
| Inorder | O(n) | O(n) |
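The worst-case column comes from insertion order: feeding already-sorted values into a plain BST produces a right-skewed chain. A small sketch demonstrating the degradation (minimal node/insert helpers, illustrative):

```python
class Node:
    def __init__(self, val):
        self.val, self.left, self.right = val, None, None

def insert(node, val):
    if node is None:
        return Node(val)
    if val < node.val:
        node.left = insert(node.left, val)
    else:
        node.right = insert(node.right, val)
    return node

def height(node):
    return -1 if node is None else 1 + max(height(node.left), height(node.right))

root = None
for v in [1, 2, 3, 4, 5]:   # sorted input: every insert goes right
    root = insert(root, v)
print(height(root))          # 4, i.e. height n - 1, so search/insert become O(n)
```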
AVL Tree (Self-Balancing)
An AVL Tree is a BST where the balance factor (height difference between left and right subtree) is always -1, 0, or +1. Rotations are performed on insertion/deletion to maintain balance.
Balance Factor
balance_factor(node) = height(left) - height(right)
Balanced: -1, 0, or +1
Left-heavy: > +1 → Right rotation needed
Right-heavy: < -1 → Left rotation needed
Implementation
class AVLNode:
def __init__(self, val):
self.val = val
self.left = None
self.right = None
self.height = 1
class AVLTree:
def __init__(self):
self.root = None
def _height(self, node):
return node.height if node else 0
def _balance_factor(self, node):
return self._height(node.left) - self._height(node.right) if node else 0
def _update_height(self, node):
node.height = 1 + max(self._height(node.left), self._height(node.right))
# ── Rotations ───────────────────────────────────────────────────
def _rotate_right(self, y):
"""
y x
/ \ / \
x T3 → T1 y
/ \ / \
T1 T2 T2 T3
"""
x = y.left
T2 = x.right
x.right = y
y.left = T2
self._update_height(y)
self._update_height(x)
return x
def _rotate_left(self, x):
"""
x y
/ \ / \
T1 y → x T3
/ \ / \
T2 T3 T1 T2
"""
y = x.right
T2 = y.left
y.left = x
x.right = T2
self._update_height(x)
self._update_height(y)
return y
def _balance(self, node):
self._update_height(node)
bf = self._balance_factor(node)
# Left-heavy
if bf > 1:
if self._balance_factor(node.left) < 0: # Left-Right case
node.left = self._rotate_left(node.left)
return self._rotate_right(node) # Left-Left case
# Right-heavy
if bf < -1:
if self._balance_factor(node.right) > 0: # Right-Left case
node.right = self._rotate_right(node.right)
return self._rotate_left(node) # Right-Right case
return node
# ── Insert ──────────────────────────────────────────────────────
def insert(self, val):
self.root = self._insert(self.root, val)
def _insert(self, node, val):
if node is None:
return AVLNode(val)
if val < node.val:
node.left = self._insert(node.left, val)
elif val > node.val:
node.right = self._insert(node.right, val)
else:
return node # no duplicates
return self._balance(node)
# ── Delete ──────────────────────────────────────────────────────
def delete(self, val):
self.root = self._delete(self.root, val)
def _delete(self, node, val):
if node is None:
return node
if val < node.val:
node.left = self._delete(node.left, val)
elif val > node.val:
node.right = self._delete(node.right, val)
else:
if node.left is None:
return node.right
if node.right is None:
return node.left
# Replace with inorder successor
successor = node.right
while successor.left:
successor = successor.left
node.val = successor.val
node.right = self._delete(node.right, successor.val)
return self._balance(node)
def inorder(self):
result = []
def _inorder(node):
if node:
_inorder(node.left)
result.append(node.val)
_inorder(node.right)
_inorder(self.root)
return result
# ── Usage ────────────────────────────────────────────────────────────
avl = AVLTree()
for v in [10, 20, 30, 40, 50, 25]:
avl.insert(v)
print("AVL inorder:", avl.inorder()) # [10, 20, 25, 30, 40, 50]
# Tree stays balanced — no O(n) degradation like skewed BST
AVL vs BST
| Aspect | BST | AVL Tree |
|---|---|---|
| Search | O(log n) avg | O(log n) guaranteed |
| Insert | O(log n) avg | O(log n) guaranteed |
| Rotations | None | O(1) per insert, O(log n) per delete |
| Memory | Less | Slightly more (height) |
| Best for | Static data | Frequent updates |
Heap (Priority Queue)
A Heap is a complete binary tree satisfying the heap property:
- Max-Heap: Parent ≥ children (root is the maximum)
- Min-Heap: Parent ≤ children (root is the minimum)
Heaps are usually stored as arrays (not explicit tree nodes).
Array Index Relationships
For node at index i:
Left child → 2*i + 1
Right child → 2*i + 2
Parent      → (i - 1) // 2
Implementation
class MinHeap:
def __init__(self):
self.heap = []
def _parent(self, i): return (i - 1) // 2
def _left(self, i): return 2 * i + 1
def _right(self, i): return 2 * i + 2
def push(self, val):
"""Insert element and sift up — O(log n)."""
self.heap.append(val)
self._sift_up(len(self.heap) - 1)
def _sift_up(self, i):
while i > 0:
parent = self._parent(i)
if self.heap[i] < self.heap[parent]:
self.heap[i], self.heap[parent] = self.heap[parent], self.heap[i]
i = parent
else:
break
def pop(self):
"""Remove and return minimum — O(log n)."""
if not self.heap:
raise IndexError("Heap is empty")
if len(self.heap) == 1:
return self.heap.pop()
min_val = self.heap[0]
self.heap[0] = self.heap.pop() # move last element to root
self._sift_down(0)
return min_val
def _sift_down(self, i):
n = len(self.heap)
while True:
smallest = i
left = self._left(i)
right = self._right(i)
if left < n and self.heap[left] < self.heap[smallest]:
smallest = left
if right < n and self.heap[right] < self.heap[smallest]:
smallest = right
if smallest == i:
break
self.heap[i], self.heap[smallest] = self.heap[smallest], self.heap[i]
i = smallest
def peek(self):
"""Return minimum without removing — O(1)."""
if not self.heap:
raise IndexError("Heap is empty")
return self.heap[0]
def heapify(self, arr):
"""Build heap from list — O(n)."""
self.heap = arr[:]
n = len(self.heap)
for i in range(n // 2 - 1, -1, -1):
self._sift_down(i)
def __len__(self):
return len(self.heap)
def __repr__(self):
return f"MinHeap({self.heap})"
# ── Python's built-in heapq (min-heap) ─────────────────────────────
import heapq
nums = [5, 1, 8, 3, 2]
heapq.heapify(nums) # convert list to heap in-place O(n)
heapq.heappush(nums, 0) # insert
print(heapq.heappop(nums)) # removes+returns 0 → min element
# Max-heap trick: negate values
max_heap = [-x for x in [5, 1, 8, 3, 2]]
heapq.heapify(max_heap)
print(-heapq.heappop(max_heap)) # → 8 (maximum)
# K largest elements
nums = [3, 1, 4, 1, 5, 9, 2, 6]
print(heapq.nlargest(3, nums)) # [9, 6, 5]
print(heapq.nsmallest(3, nums)) # [1, 1, 2]
# ── Usage ────────────────────────────────────────────────────────────
h = MinHeap()
for v in [5, 3, 8, 1, 2]:
h.push(v)
print(h) # MinHeap([1, 2, 8, 5, 3])
print(h.pop()) # 1
print(h.peek()) # 2
Heap Time Complexity
| Operation | Time | Notes |
|---|---|---|
| push | O(log n) | sift up |
| pop (min/max) | O(log n) | sift down |
| peek | O(1) | root element |
| heapify (build) | O(n) | better than n × push |
| search | O(n) | heap order gives no shortcut for arbitrary lookup |
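One practical pattern the tables imply but the code above does not show: pushing `(priority, item)` tuples into `heapq` to get a simple task queue, since tuples compare element-by-element. The task names are illustrative:

```python
import heapq

tasks = []
heapq.heappush(tasks, (2, "write report"))
heapq.heappush(tasks, (1, "fix outage"))   # lower number = higher priority
heapq.heappush(tasks, (3, "refactor"))

order = []
while tasks:
    priority, name = heapq.heappop(tasks)  # always yields the smallest priority
    order.append(name)
print(order)  # ['fix outage', 'write report', 'refactor']
```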
Trie (Prefix Tree)
A Trie stores strings where each node represents a character. Common prefix strings share the same path from the root.
Visual
Insert: "cat", "car", "card", "care", "dog"
         root
        /    \
       c      d
       |      |
       a      o
      / \      \
     t   r      g
        / \
       d   e
Implementation
class TrieNode:
def __init__(self):
self.children = {} # char → TrieNode
self.is_end = False # marks end of a word
self.count = 0 # words passing through this node
class Trie:
def __init__(self):
self.root = TrieNode()
# ── Insert ──────────────────────────────────────────────────────
def insert(self, word):
"""Insert word — O(len(word))."""
node = self.root
for char in word:
if char not in node.children:
node.children[char] = TrieNode()
node = node.children[char]
node.count += 1
node.is_end = True
# ── Search ──────────────────────────────────────────────────────
def search(self, word):
"""Return True if exact word exists — O(len(word))."""
node = self._find_prefix_node(word)
return node is not None and node.is_end
def starts_with(self, prefix):
"""Return True if any word starts with prefix — O(len(prefix))."""
return self._find_prefix_node(prefix) is not None
def _find_prefix_node(self, prefix):
node = self.root
for char in prefix:
if char not in node.children:
return None
node = node.children[char]
return node
# ── Delete ──────────────────────────────────────────────────────
def delete(self, word):
"""Delete word from trie — O(len(word))."""
def _delete(node, word, depth):
if node is None:
return False
if depth == len(word):
if node.is_end:
node.is_end = False
return len(node.children) == 0
char = word[depth]
if char not in node.children:
return False
should_delete = _delete(node.children[char], word, depth + 1)
if should_delete:
del node.children[char]
return len(node.children) == 0 and not node.is_end
return False
_delete(self.root, word, 0)
# ── Auto-complete ────────────────────────────────────────────────
def autocomplete(self, prefix):
"""Return all words starting with prefix."""
node = self._find_prefix_node(prefix)
if node is None:
return []
results = []
self._dfs(node, prefix, results)
return results
def _dfs(self, node, current, results):
if node.is_end:
results.append(current)
for char, child in node.children.items():
self._dfs(child, current + char, results)
# ── Count words with prefix ──────────────────────────────────────
def count_words_with_prefix(self, prefix):
node = self._find_prefix_node(prefix)
return node.count if node else 0
# ── All words ───────────────────────────────────────────────────
def all_words(self):
return self.autocomplete("")
# ── Usage ────────────────────────────────────────────────────────────
trie = Trie()
for word in ["cat", "car", "card", "care", "dog", "dot"]:
trie.insert(word)
print(trie.search("car")) # True
print(trie.search("ca")) # False
print(trie.starts_with("ca")) # True
print(trie.autocomplete("ca")) # ['cat', 'car', 'card', 'care']
print(trie.autocomplete("do")) # ['dog', 'dot']
print(trie.all_words()) # all 6 words
Trie Complexity
| Operation | Time | Space |
|---|---|---|
| Insert | O(m) | O(m) |
| Search | O(m) | O(1) |
| Starts-with | O(m) | O(1) |
| Autocomplete | O(m + k) | O(k) |
| Space total | O(n × m) | — |
m = length of word, n = number of words, k = number of results
Segment Tree
A Segment Tree is a binary tree used to efficiently answer range queries and perform range updates on arrays.
Visual (array = [1, 3, 5, 7, 9, 11])
[0..5] sum=36
/ \
[0..2] sum=9 [3..5] sum=27
/ \ / \
[0..1]=4 [2..2]=5 [3..4]=16 [5..5]=11
/ \ / \
[0..0]=1 [1..1]=3 [3..3]=7 [4..4]=9
Implementation
class SegmentTree:
def __init__(self, arr):
self.n = len(arr)
self.tree = [0] * (4 * self.n) # tree array (4x safe size)
if self.n > 0:
self._build(arr, 0, 0, self.n - 1)
# ── Build ────────────────────────────────────────────────────────
def _build(self, arr, node, start, end):
if start == end:
self.tree[node] = arr[start]
else:
mid = (start + end) // 2
self._build(arr, 2 * node + 1, start, mid)
self._build(arr, 2 * node + 2, mid + 1, end)
self.tree[node] = self.tree[2 * node + 1] + self.tree[2 * node + 2]
# ── Range Sum Query ──────────────────────────────────────────────
def query(self, l, r):
"""Sum of elements from index l to r — O(log n)."""
return self._query(0, 0, self.n - 1, l, r)
def _query(self, node, start, end, l, r):
if r < start or end < l:
return 0 # out of range
if l <= start and end <= r:
return self.tree[node] # fully within range
mid = (start + end) // 2
left_sum = self._query(2 * node + 1, start, mid, l, r)
right_sum = self._query(2 * node + 2, mid + 1, end, l, r)
return left_sum + right_sum
# ── Point Update ─────────────────────────────────────────────────
def update(self, idx, val):
"""Update arr[idx] = val — O(log n)."""
self._update(0, 0, self.n - 1, idx, val)
def _update(self, node, start, end, idx, val):
if start == end:
self.tree[node] = val
else:
mid = (start + end) // 2
if idx <= mid:
self._update(2 * node + 1, start, mid, idx, val)
else:
self._update(2 * node + 2, mid + 1, end, idx, val)
self.tree[node] = self.tree[2 * node + 1] + self.tree[2 * node + 2]
# ── Usage ────────────────────────────────────────────────────────────
arr = [1, 3, 5, 7, 9, 11]
st = SegmentTree(arr)
print(st.query(1, 3)) # 3+5+7 = 15
print(st.query(0, 5)) # total = 36
st.update(1, 10) # arr[1] = 10
print(st.query(1, 3)) # 10+5+7 = 22
Segment Tree Complexity
| Operation | Time | Space |
|---|---|---|
| Build | O(n) | O(n) |
| Range query | O(log n) | O(1) |
| Point update | O(log n) | O(1) |
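For context on this table: if the array never changes, a plain prefix-sum array answers range-sum queries in O(1), so the segment tree only earns its keep when queries and updates are interleaved. A quick sketch of the static alternative:

```python
arr = [1, 3, 5, 7, 9, 11]

# prefix[i] holds the sum of arr[0..i-1]
prefix = [0]
for x in arr:
    prefix.append(prefix[-1] + x)

def range_sum(l, r):
    """Inclusive range sum in O(1) per query; O(n) to update after any change."""
    return prefix[r + 1] - prefix[l]

print(range_sum(1, 3))  # 15, same as st.query(1, 3)
print(range_sum(0, 5))  # 36
```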
N-ary Tree
An N-ary Tree allows each node to have up to N children.
Implementation
class NaryNode:
def __init__(self, val):
self.val = val
self.children = [] # list of NaryNode
class NaryTree:
def __init__(self):
self.root = None
def level_order(self):
"""BFS traversal — returns list of levels."""
if not self.root:
return []
from collections import deque
result, queue = [], deque([self.root])
while queue:
level_size = len(queue)
level = []
for _ in range(level_size):
node = queue.popleft()
level.append(node.val)
for child in node.children:
queue.append(child)
result.append(level)
return result
def preorder(self):
"""Root → children left to right."""
result = []
def _pre(node):
if node:
result.append(node.val)
for child in node.children:
_pre(child)
_pre(self.root)
return result
def postorder(self):
"""Children left to right → root."""
result = []
def _post(node):
if node:
for child in node.children:
_post(child)
result.append(node.val)
_post(self.root)
return result
def height(self, node=None):
if node is None:
node = self.root
if node is None:
return -1
if not node.children:
return 0
return 1 + max(self.height(c) for c in node.children)
def max_depth(self):
return self.height(self.root)
# ── Usage ────────────────────────────────────────────────────────────
# 1
# / | \
# 3 2 4
# / \
# 5 6
root = NaryNode(1)
child3 = NaryNode(3)
child2 = NaryNode(2)
child4 = NaryNode(4)
child3.children = [NaryNode(5), NaryNode(6)]
root.children = [child3, child2, child4]
tree = NaryTree()
tree.root = root
print(tree.preorder()) # [1, 3, 5, 6, 2, 4]
print(tree.postorder()) # [5, 6, 3, 2, 4, 1]
print(tree.level_order()) # [[1], [3, 2, 4], [5, 6]]
print(tree.max_depth()) # 2
Tree Traversals
All traversals work on any binary tree. Time: O(n), Space: O(h) where h = height.
1. Inorder (Left → Root → Right)
# Recursive
def inorder_recursive(root):
if root is None:
return []
return inorder_recursive(root.left) + [root.val] + inorder_recursive(root.right)
# Iterative (using stack)
def inorder_iterative(root):
result, stack, node = [], [], root
while stack or node:
while node:
stack.append(node)
node = node.left
node = stack.pop()
result.append(node.val)
node = node.right
return result
# Key use: BST → returns sorted order
2. Preorder (Root → Left → Right)
# Recursive
def preorder_recursive(root):
if root is None:
return []
return [root.val] + preorder_recursive(root.left) + preorder_recursive(root.right)
# Iterative (using stack)
def preorder_iterative(root):
if not root:
return []
result, stack = [], [root]
while stack:
node = stack.pop()
result.append(node.val)
if node.right:
stack.append(node.right) # right first (LIFO)
if node.left:
stack.append(node.left)
return result
# Key use: copy/clone a tree, serialize tree
3. Postorder (Left → Right → Root)
# Recursive
def postorder_recursive(root):
if root is None:
return []
return postorder_recursive(root.left) + postorder_recursive(root.right) + [root.val]
# Iterative (single stack: reversed modified preorder)
def postorder_iterative(root):
if not root:
return []
result, stack = [], [root]
while stack:
node = stack.pop()
result.append(node.val)
if node.left:
stack.append(node.left)
if node.right:
stack.append(node.right)
return result[::-1] # reverse
# Key use: delete a tree, expression evaluation
4. Level-Order / BFS (Level by Level)
from collections import deque
def level_order(root):
if not root:
return []
result, queue = [], deque([root])
while queue:
level = []
for _ in range(len(queue)):
node = queue.popleft()
level.append(node.val)
if node.left:
queue.append(node.left)
if node.right:
queue.append(node.right)
result.append(level)
return result
# Key use: shortest path, connect level nodes, zigzag traversal
5. Morris Traversal (O(1) Space Inorder)
def morris_inorder(root):
"""Inorder without extra space by temporarily modifying the tree."""
result, cur = [], root
while cur:
if cur.left is None:
result.append(cur.val)
cur = cur.right
else:
# Find inorder predecessor
prev = cur.left
while prev.right and prev.right != cur:
prev = prev.right
if prev.right is None:
prev.right = cur # make thread
cur = cur.left
else:
prev.right = None # remove thread
result.append(cur.val)
cur = cur.right
    return result
Traversal Summary
| Traversal | Order | Use Case |
|---|---|---|
| Inorder | Left → Root → Right | BST sorted output |
| Preorder | Root → Left → Right | Clone tree, prefix expression |
| Postorder | Left → Right → Root | Delete tree, postfix expression |
| Level-order | Level by level | Shortest path, level operations |
| Morris | Inorder, O(1) space | Memory-constrained environments |
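The orders in the table can be verified on a small tree with compact recursive versions of each traversal (the `N` class here is a minimal stand-in for illustration):

```python
class N:
    def __init__(self, v, l=None, r=None):
        self.val, self.left, self.right = v, l, r

#        1
#       / \
#      2   3
#     / \
#    4   5
root = N(1, N(2, N(4), N(5)), N(3))

def inorder(n):   return inorder(n.left) + [n.val] + inorder(n.right) if n else []
def preorder(n):  return [n.val] + preorder(n.left) + preorder(n.right) if n else []
def postorder(n): return postorder(n.left) + postorder(n.right) + [n.val] if n else []

print(inorder(root))    # [4, 2, 5, 1, 3]
print(preorder(root))   # [1, 2, 4, 5, 3]
print(postorder(root))  # [4, 5, 2, 3, 1]
```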
Common Tree Problems
1. Maximum Depth / Height
def max_depth(root):
if root is None:
return 0
    return 1 + max(max_depth(root.left), max_depth(root.right))
2. Diameter of Binary Tree
def diameter(root):
"""Longest path between any two nodes (may not pass through root)."""
result = [0]
def height(node):
if not node:
return 0
left_h = height(node.left)
right_h = height(node.right)
result[0] = max(result[0], left_h + right_h)
return 1 + max(left_h, right_h)
height(root)
    return result[0]
3. Lowest Common Ancestor (LCA)
def lca(root, p, q):
"""Find LCA of nodes p and q."""
if root is None or root == p or root == q:
return root
left = lca(root.left, p, q)
right = lca(root.right, p, q)
if left and right:
return root # p and q are on opposite sides
    return left or right
4. Path Sum
def has_path_sum(root, target):
"""Check if any root-to-leaf path sums to target."""
if root is None:
return False
if root.left is None and root.right is None:
return root.val == target
return (has_path_sum(root.left, target - root.val) or
has_path_sum(root.right, target - root.val))
def path_sum_all(root, target):
"""Return all root-to-leaf paths summing to target."""
result = []
def dfs(node, remaining, path):
if not node:
return
path.append(node.val)
if not node.left and not node.right and remaining == node.val:
result.append(list(path))
dfs(node.left, remaining - node.val, path)
dfs(node.right, remaining - node.val, path)
path.pop()
dfs(root, target, [])
    return result
5. Mirror / Invert a Binary Tree
def invert_tree(root):
if root is None:
return None
root.left, root.right = invert_tree(root.right), invert_tree(root.left)
    return root
6. Check if Two Trees are Identical
def is_same_tree(p, q):
if p is None and q is None:
return True
if p is None or q is None:
return False
return (p.val == q.val and
is_same_tree(p.left, q.left) and
            is_same_tree(p.right, q.right))
7. Check Symmetric / Mirror Tree
def is_symmetric(root):
def mirror(left, right):
if not left and not right:
return True
if not left or not right:
return False
return (left.val == right.val and
mirror(left.left, right.right) and
mirror(left.right, right.left))
    return mirror(root.left, root.right) if root else True
8. Serialize and Deserialize Binary Tree
def serialize(root):
"""Encode tree to string."""
if not root:
return "null"
return f"{root.val},{serialize(root.left)},{serialize(root.right)}"
def deserialize(data):
"""Decode string back to tree."""
nodes = iter(data.split(","))
def build():
val = next(nodes)
if val == "null":
return None
node = TreeNode(int(val))
node.left = build()
node.right = build()
return node
    return build()
9. Right Side View
def right_side_view(root):
"""Return values visible from the right side."""
from collections import deque
if not root:
return []
result, queue = [], deque([root])
while queue:
for i in range(len(queue)):
node = queue.popleft()
if i == 0: # rightmost node of this level
result.append(node.val)
if node.right:
queue.append(node.right)
if node.left:
queue.append(node.left)
    return result
10. BST: Convert to a Sorted Doubly Linked List
def bst_to_dll(root):
"""Inorder traversal converts BST to sorted DLL in-place."""
head = prev = None
def inorder(node):
nonlocal head, prev
if not node:
return
inorder(node.left)
if prev:
prev.right = node
node.left = prev
else:
head = node
prev = node
inorder(node.right)
inorder(root)
    return head
Time and Space Complexity
Binary Tree
| Operation | Average | Worst (skewed) | Space |
|---|---|---|---|
| Access | O(n) | O(n) | — |
| Search | O(n) | O(n) | O(h) |
| Insert | O(n) | O(n) | O(h) |
| Delete | O(n) | O(n) | O(h) |
| Traversal (any) | O(n) | O(n) | O(h) |
BST
| Operation | Average | Worst (skewed) |
|---|---|---|
| Search | O(log n) | O(n) |
| Insert | O(log n) | O(n) |
| Delete | O(log n) | O(n) |
| Min / Max | O(log n) | O(n) |
Balanced Trees (AVL / Red-Black)
| Operation | Guaranteed |
|---|---|
| Search | O(log n) |
| Insert | O(log n) |
| Delete | O(log n) |
Heap
| Operation | Time |
|---|---|
| Insert | O(log n) |
| Delete min | O(log n) |
| Peek min | O(1) |
| Build heap | O(n) |
Trie
| Operation | Time | Space |
|---|---|---|
| Insert | O(m) | O(m) |
| Search | O(m) | O(1) |
| Starts-with | O(m) | O(1) |
| All words | O(n×m) | O(n×m) |
m = word length, n = number of words
Segment Tree
| Operation | Time | Space |
|---|---|---|
| Build | O(n) | O(n) |
| Range query | O(log n) | O(1) |
| Point update | O(log n) | O(1) |
Best Practices
1. Always Handle None First
def process(root):
if root is None: # guard clause first
return ...
    # main logic
2. Prefer Iterative for Large Trees (avoid stack overflow)
import sys
sys.setrecursionlimit(10000) # or switch to iterative
# Recursive is fine for balanced trees (depth ~ log n)
# Use iterative + explicit stack for very deep / skewed trees
3. Choose the Right Tree
# Need sorted data + fast search → BST / AVL
# Need priority access (min/max fast) → Heap
# Need prefix lookups / autocomplete → Trie
# Need range queries on array → Segment Tree
# Need guaranteed O(log n) → AVL / Red-Black
# Need hierarchical multi-child → N-ary Tree
4. BFS vs DFS
# BFS (level-order):
# - Shortest path in unweighted tree
# - Level-by-level processing
# - Uses O(width) space → bad for wide trees
# DFS (inorder/preorder/postorder):
# - Path problems (root-to-leaf)
# - Tree serialization
# - Uses O(height) space → bad for deep/skewed trees
5. Augment Nodes When Needed
class AugmentedBSTNode:
def __init__(self, val):
self.val = val
self.left = None
self.right = None
self.size = 1 # size of subtree → order statistics
self.height = 1 # height → AVL balancing
        self.count = 1 # frequency → handle duplicates
6. In-Place vs Extra Space
# Morris traversal → O(1) space inorder
# iterative with explicit stack → O(h) space
# recursive → O(h) call stack space (implicit)
Quick Reference Cheat Sheet
Tree Structures at a Glance
═══════════════════════════════════════════════════════
Binary Tree → base structure, each node ≤ 2 children
BST → ordered binary tree, inorder = sorted
AVL Tree → self-balancing BST, O(log n) guaranteed
Red-Black Tree → self-balancing BST, used in std libs
Heap → complete tree, O(1) peek min/max
Trie → character tree, O(m) word ops
Segment Tree → range queries/updates O(log n)
B-Tree → multi-key nodes, used in databases
N-ary Tree → up to N children, file system model
Common Traversals
─────────────────
Inorder L→N→R BST sorted output
Preorder N→L→R clone tree, prefix expr
Postorder L→R→N delete tree, postfix expr
BFS/Level row by row shortest path, level ops
Key Properties
──────────────
Height of balanced tree → O(log n)
Edges in any tree → n - 1
Max nodes, height H → 2^(H+1) - 1
Leaves = internal + 1 (for full binary tree)