Commit 5000feb

migrated bundle 04-graph-1
1 parent dbc10fc commit 5000feb

File tree: 10 files changed, +377 / -0 lines

docs/graph/binary-search-tree.md

Lines changed: 156 additions & 0 deletions
---
title: Binary Search Tree
tags:
  - Tree
  - Binary Search
  - BST
---

A binary tree is a tree data structure in which each node has at most two children, referred to as the left child and the right child.

For a binary tree to be a binary search tree, the values of all the nodes in the left sub-tree of the root node must be smaller than the root node's value, and the values of all the nodes in the right sub-tree must be larger than the root node's value.

<figure markdown="span">
![a simple binary search tree](img/binarytree.png)
<figcaption>a simple binary search tree</figcaption>
</figure>

## Insertion Algorithm

1. Compare the values of the root node and the element to be inserted.
2. If the value of the root node is larger: if a left child exists, repeat step 1 with the left child as the new root; otherwise, insert the element as the left child of the current root.
3. If the value of the root node is smaller: if a right child exists, repeat step 1 with the right child as the new root; otherwise, insert the element as the right child of the current root.
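The steps above can be sketched iteratively in Python (an illustrative snippet with made-up names, separate from the C sample later on this page):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def bst_insert(root, key):
    # An empty tree: the new element becomes the root.
    if root is None:
        return Node(key)
    cur = root
    while True:
        if key < cur.key:
            # Step 2: current value is larger, so go left, or attach as left child.
            if cur.left is None:
                cur.left = Node(key)
                return root
            cur = cur.left
        else:
            # Step 3: current value is smaller (ties sent right here), so go right, or attach.
            if cur.right is None:
                cur.right = Node(key)
                return root
            cur = cur.right

def inorder_keys(root):
    # Inorder traversal of a BST yields the keys in sorted order.
    return inorder_keys(root.left) + [root.key] + inorder_keys(root.right) if root else []

root = None
for k in [8, 3, 10, 1, 6]:
    root = bst_insert(root, k)
print(inorder_keys(root))  # [1, 3, 6, 8, 10]
```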
## Deletion Algorithm

- Deleting a node with no children: simply remove the node from the tree.
- Deleting a node with one child: remove the node and replace it with its child.
- Deleting a node with two children: find the inorder successor of the node, copy the contents of the inorder successor into the node, and then delete the inorder successor.
- Note that the inorder successor can be obtained by finding the minimum value in the right sub-tree of the node.
## Sample Code

```c
// C program to demonstrate delete operation in binary search tree
#include <stdio.h>
#include <stdlib.h>

struct node
{
    int key;
    struct node *left, *right;
};

// A utility function to create a new BST node
struct node *newNode(int item)
{
    struct node *temp = (struct node *)malloc(sizeof(struct node));
    temp->key = item;
    temp->left = temp->right = NULL;
    return temp;
}

// A utility function to do inorder traversal of BST
void inorder(struct node *root)
{
    if (root != NULL)
    {
        inorder(root->left);
        printf("%d ", root->key);
        inorder(root->right);
    }
}

/* A utility function to insert a new node with given key in BST */
struct node* insert(struct node* node, int key)
{
    /* If the tree is empty, return a new node */
    if (node == NULL) return newNode(key);

    /* Otherwise, recur down the tree */
    if (key < node->key)
        node->left = insert(node->left, key);
    else
        node->right = insert(node->right, key);

    /* return the (unchanged) node pointer */
    return node;
}

/* Given a non-empty binary search tree, return the node with minimum
   key value found in that tree. Note that the entire tree does not
   need to be searched. */
struct node * minValueNode(struct node* node)
{
    struct node* current = node;

    /* loop down to find the leftmost leaf */
    while (current->left != NULL)
        current = current->left;

    return current;
}

/* Given a binary search tree and a key, this function deletes the key
   and returns the new root */
struct node* deleteNode(struct node* root, int key)
{
    // base case
    if (root == NULL) return root;

    // If the key to be deleted is smaller than the root's key,
    // then it lies in the left subtree
    if (key < root->key)
        root->left = deleteNode(root->left, key);

    // If the key to be deleted is greater than the root's key,
    // then it lies in the right subtree
    else if (key > root->key)
        root->right = deleteNode(root->right, key);

    // If the key is the same as the root's key, then this is the node
    // to be deleted
    else
    {
        // node with only one child or no child
        if (root->left == NULL)
        {
            struct node *temp = root->right;
            free(root);
            return temp;
        }
        else if (root->right == NULL)
        {
            struct node *temp = root->left;
            free(root);
            return temp;
        }

        // node with two children: get the inorder successor (smallest
        // in the right subtree)
        struct node* temp = minValueNode(root->right);

        // Copy the inorder successor's content to this node
        root->key = temp->key;

        // Delete the inorder successor
        root->right = deleteNode(root->right, temp->key);
    }
    return root;
}
```
## Time Complexity

The worst-case time complexity of the search, insert, and delete operations is $\mathcal{O}(h)$, where $h$ is the height of the binary search tree: in the worst case, we may have to travel from the root to the deepest leaf node. The height of a skewed tree can reach $N$, so search and insert can degrade to $\mathcal{O}(N)$, and building an unbalanced tree of $N$ nodes can take $\mathcal{O}(N^2)$ (for example, when the nodes are inserted in sorted order). With random input, however, the expected time to build the tree is $\mathcal{O}(N \log N)$.
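The skew effect is easy to demonstrate (a small Python sketch with made-up helper names, separate from the C sample above): inserting the same keys in sorted versus shuffled order produces very different heights.

```python
import random

class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    # Iterative insert, so a fully skewed tree cannot overflow the call stack.
    if root is None:
        return Node(key)
    cur = root
    while True:
        side = 'left' if key < cur.key else 'right'
        child = getattr(cur, side)
        if child is None:
            setattr(cur, side, Node(key))
            return root
        cur = child

def height(root):
    # Height counted in nodes along the longest root-to-leaf path.
    if root is None:
        return 0
    return 1 + max(height(root.left), height(root.right))

def build(keys):
    root = None
    for k in keys:
        root = insert(root, k)
    return root

n = 200
sorted_tree = build(range(n))                    # keys arrive in sorted order
random.seed(42)
random_tree = build(random.sample(range(n), n))  # same keys, shuffled

print(height(sorted_tree))  # 200: the tree degenerates into a chain
print(height(random_tree))  # much smaller, roughly proportional to log N
```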
However, you can use other data structures to build a self-balancing binary search tree (which will be taught later). Popular data structures that implement this type of tree include:

- 2-3 tree
- AA tree
- AVL tree
- B-tree
- Red-black tree
- Scapegoat tree
- Splay tree
- Treap
- Weight-balanced tree

docs/graph/heap.md

Lines changed: 138 additions & 0 deletions
---
title: Heap
tags:
  - Heap
  - Priority Queue
---

<figure markdown="span">
![an example max-heap with 9 nodes](img/360px-Max-Heap.png)
<figcaption>an example max-heap with 9 nodes</figcaption>
</figure>

A heap is a complete binary tree with $N$ nodes that satisfies the heap property: in a max-heap, the values of all the nodes in the left and right sub-trees of any node are smaller than (or equal to) that node's value; a min-heap reverses the comparison.

In a heap, the highest (or lowest) priority element is always stored at the root. A heap is not a sorted structure and can be regarded as partially ordered. As visible from the heap diagram, there is no particular relationship among nodes on any given level, even among siblings. Because a heap is a complete binary tree, it has the smallest possible height: a heap with $N$ nodes has $\mathcal{O}(\log N)$ height. A heap is a useful data structure when you need to repeatedly remove the object with the highest (or lowest) priority.
## Implementation

Heaps are usually implemented in an array (fixed-size or dynamic), and they do not require pointers between elements. After an element is inserted into or deleted from a heap, the heap property may be violated, and the heap must be re-balanced by internal operations.

The first element of the array contains the root. The next two elements contain its children; the next four contain the four children of those two nodes, and so on. Thus the children of the node at position $n$ are at positions $2n$ and $2n + 1$ in a one-based array. This allows moving up or down the tree with simple index computations. Balancing a heap is done by sift-up or sift-down operations (swapping elements which are out of order), so we can build a heap from an array without requiring extra memory.
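The index arithmetic can be verified directly; the sketch below (illustrative Python, one-based positions as in the text) uses a nine-node max-heap chosen for illustration:

```python
def parent(i):
    # one-based array: the parent of node i
    return i // 2

def children(i):
    # left and right child positions of node i
    return 2 * i, 2 * i + 1

# a nine-node max-heap stored from index 1 (index 0 is unused padding)
heap = [None, 100, 19, 36, 17, 3, 25, 1, 2, 7]

for i in range(2, len(heap)):
    # max-heap property: no node exceeds its parent
    assert heap[i] <= heap[parent(i)]

print(children(2))  # (4, 5): the children of node 19 are 17 and 3
```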
<figure markdown="span">
![an example of a heap stored as an array](img/Heap-as-array.png)
<figcaption>an example of a heap stored as an array</figcaption>
</figure>
## Insertion

Add the new element at the end of the heap, then compare it with its parent: depending on whether it is a max-heap or a min-heap (it is called a max-heap when parents are always greater), swap it with the parent if the heap property is violated. If a swap occurred, repeat the same comparison from the parent's position, sifting up until the property holds.

## Deletion

To delete a node (the root or any other node, it does not matter),

1. Swap the node to be deleted with the last element of the heap, to maintain the complete-tree structure.
2. Delete the last element, which is now the node we wanted to remove.
3. The swapped element may now be in the wrong place. Compare it with its left and right children; if one of them is greater (or smaller, in a min-heap), swap it with the greatest child (or the smallest, in a min-heap).
4. The current node may still be in the wrong place, so repeat step 3 as long as it is smaller than one of its children (or greater, in a min-heap).
<figure markdown="span" style="width: 40%;">
![](img/heap1.png)
![](img/heap2.png)
<figcaption>an example deletion on a heap structure</figcaption>
</figure>
```py
class BinHeap:
    def __init__(self):
        self.heapList = [0]
        self.currentSize = 0

    def percUp(self, i):
        while i // 2 > 0:
            if self.heapList[i] < self.heapList[i // 2]:
                tmp = self.heapList[i // 2]
                self.heapList[i // 2] = self.heapList[i]
                self.heapList[i] = tmp
            i = i // 2

    def insert(self, k):
        self.heapList.append(k)
        self.currentSize = self.currentSize + 1
        self.percUp(self.currentSize)

    def percDown(self, i):
        while (i * 2) <= self.currentSize:
            mc = self.minChild(i)
            if self.heapList[i] > self.heapList[mc]:
                tmp = self.heapList[i]
                self.heapList[i] = self.heapList[mc]
                self.heapList[mc] = tmp
            i = mc

    def minChild(self, i):
        if i * 2 + 1 > self.currentSize:
            return i * 2
        else:
            if self.heapList[i * 2] < self.heapList[i * 2 + 1]:
                return i * 2
            else:
                return i * 2 + 1

    def delMin(self):
        retval = self.heapList[1]
        self.heapList[1] = self.heapList[self.currentSize]
        self.currentSize = self.currentSize - 1
        self.heapList.pop()
        self.percDown(1)
        return retval

    def buildHeap(self, alist):
        i = len(alist) // 2
        self.currentSize = len(alist)
        self.heapList = [0] + alist[:]
        while (i > 0):
            self.percDown(i)
            i = i - 1

bh = BinHeap()
bh.buildHeap([9, 5, 6, 2, 3])

print(bh.delMin())
print(bh.delMin())
print(bh.delMin())
print(bh.delMin())
print(bh.delMin())
```
## Complexity

Insertion is $\mathcal{O}(\log N)$, delete-min is $\mathcal{O}(\log N)$, and finding the minimum is $\mathcal{O}(1)$. These operations depend on the heap's height, and since heaps are always complete binary trees, the height is $\mathcal{O}(\log N)$ ($N$ is the number of nodes).
## Priority Queue

Priority queues are a type of container adaptor, specifically designed so that the first element is always the greatest of the elements it contains, according to some strict weak ordering criterion.

While priority queues are often implemented with heaps, they are conceptually distinct from heaps. A priority queue is an abstract concept like "a list" or "a map"; just as a list can be implemented with a linked list or an array, a priority queue can be implemented with a heap or a variety of other methods such as an unordered array.
```cpp
#include <iostream>  // std::cout
#include <queue>     // std::priority_queue
using namespace std;

int main() {
    priority_queue<int> mypq;

    mypq.push(30);
    mypq.push(100);
    mypq.push(25);
    mypq.push(40);

    cout << "Popping out elements...";
    while (!mypq.empty()) {
        cout << ' ' << mypq.top();
        mypq.pop();
    }
    return 0;
}
```
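Python's standard-library `heapq` offers the same functionality as a binary min-heap over a plain list; a short sketch mirroring the C++ example above (note the order is reversed, since `heapq` is a min-heap while `std::priority_queue<int>` defaults to a max-heap):

```python
import heapq

pq = []
for x in (30, 100, 25, 40):
    heapq.heappush(pq, x)

# heapq is a min-heap, so elements pop in increasing order
out = []
while pq:
    out.append(heapq.heappop(pq))
print(out)  # [25, 30, 40, 100]

# A max-heap can be simulated by pushing negated keys
maxpq = [-x for x in (30, 100, 25, 40)]
heapq.heapify(maxpq)
print(-heapq.heappop(maxpq))  # 100
```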

docs/graph/img/360px-Max-Heap.png (20.6 KB)

docs/graph/img/Heap-as-array.png (13.1 KB)

docs/graph/img/binary-tree.png (22.6 KB)

docs/graph/img/binarytree.png (15.8 KB)

docs/graph/img/heap1.png (41.4 KB)

docs/graph/img/heap2.png (20.8 KB)

docs/graph/index.md

Lines changed: 3 additions & 0 deletions
@@ -9,6 +9,9 @@ title: Graph
 ### [Introduction](introduction.md)
 ### [Definitions](definitions.md)
 ### [Representing Graphs](representing-graphs.md)
+### [Tree Traversals](tree-traversals.md)
+### [Binary Search Tree](./binary-search-tree.md)
+### [Heap](heap.md)
 ### [Depth First Search](depth-first-search.md)
 ### [Breadth First Search](breadth-first-search.md)
 ### [Cycle Finding](cycle-finding.md)

docs/graph/tree-traversals.md

Lines changed: 80 additions & 0 deletions
---
title: Tree Traversals
tags:
  - Tree
  - Preorder
  - Postorder
  - Inorder
---

Tree traversal is the process of visiting every node in a tree structure exactly once, for some purpose (such as reading or updating the node's information). For binary trees there are several well-defined orders of traversal; they are specific to binary trees but may be generalized to other trees, and even to graphs.
<figure markdown="span">
![a binary tree](img/binary-tree.png)
<figcaption>a binary tree</figcaption>
</figure>

## Preorder Traversal

Preorder means that a root is evaluated before its children. In other words, the order of evaluation is Root-Left-Right:

```
Preorder Traversal
    Look at the data
    Traverse the left node
    Traverse the right node
```

Example: 50 – 7 – 3 – 2 – 8 – 16 – 5 – 12 – 17 – 54 – 9 – 13
## Inorder Traversal

Inorder means that the left child (and all of the left child's children) is evaluated before the root, and the root before the right child and its children: Left-Root-Right. (By the way, in a binary search tree, inorder traversal retrieves the data in sorted order.)

```
Inorder Traversal
    Traverse the left node
    Look at the data
    Traverse the right node
```

Example: 2 – 3 – 7 – 16 – 8 – 50 – 12 – 54 – 17 – 5 – 9 – 13
## Postorder Traversal

Postorder is the opposite of preorder: all children are evaluated before their root, Left-Right-Root.

```
Postorder Traversal
    Traverse the left node
    Traverse the right node
    Look at the data
```

Example: 2 – 3 – 16 – 8 – 7 – 54 – 17 – 12 – 13 – 9 – 5 – 50
## Implementation

```py
class Node:
    def __init__(self, key):
        self.left = None
        self.right = None
        self.val = key

def printInorder(root):
    if root:
        printInorder(root.left)
        print(root.val)
        printInorder(root.right)

def printPostorder(root):
    if root:
        printPostorder(root.left)
        printPostorder(root.right)
        print(root.val)

def printPreorder(root):
    if root:
        print(root.val)
        printPreorder(root.left)
        printPreorder(root.right)
```
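To sanity-check the three orders on a concrete tree, here is a self-contained variant of the code above that collects values into lists instead of printing (the five-node tree is a small example of my own, not the one in the figure):

```python
class Node:
    def __init__(self, key):
        self.left = None
        self.right = None
        self.val = key

def preorder(root):
    # Root-Left-Right
    return [root.val] + preorder(root.left) + preorder(root.right) if root else []

def inorder(root):
    # Left-Root-Right
    return inorder(root.left) + [root.val] + inorder(root.right) if root else []

def postorder(root):
    # Left-Right-Root
    return postorder(root.left) + postorder(root.right) + [root.val] if root else []

#        1
#       / \
#      2   3
#     / \
#    4   5
root = Node(1)
root.left, root.right = Node(2), Node(3)
root.left.left, root.left.right = Node(4), Node(5)

print(preorder(root))   # [1, 2, 4, 5, 3]
print(inorder(root))    # [4, 2, 5, 1, 3]
print(postorder(root))  # [4, 5, 2, 3, 1]
```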
