
Commit a0f21a2

Committed Mar 26, 2019
update indexes
1 parent 9eb2bfb · commit a0f21a2

27 files changed: +160 −104 lines changed
 

‎book/chapters/array.adoc

Lines changed: 4 additions & 2 deletions
@@ -1,5 +1,6 @@
 = Array
-
+(((Array)))
+(((Data Structures, Linear, Array)))
 Arrays are one of the most used data structures. You probably have used it a lot but are you aware of the runtimes of `splice`, `shift` and other operations? In this chapter, we are going deeper into the most common operations and their runtimes.

 == Array Basics
@@ -184,4 +185,5 @@ To sum up, the time complexity on an array is:
 ^|_Index/Key_ ^|_Value_ ^|_beginning_ ^|_middle_ ^|_end_ ^|_beginning_ ^|_middle_ ^|_end_
 | Array ^|O(1) ^|O(n) ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(n) ^|O(1) ^|O(n)
 |===
-indexterm:[Runtime, Linear]
+(((Linear)))
+(((Runtime, Linear)))
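
For context, the runtime differences this chapter alludes to can be sketched like this (illustrative only, not code from this commit):

[source, javascript]
----
// Sketch: inserting at the end of an array is O(1), while inserting at
// the beginning or middle shifts the remaining elements, O(n).
const array = [2, 5, 1];

array.push(9);         // O(1): appends without moving other elements
array.unshift(0);      // O(n): every existing element moves one slot right
array.splice(2, 0, 7); // O(n): elements after index 2 shift to make room

console.log(array); // [ 0, 2, 7, 5, 1, 9 ]
----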

‎book/chapters/big-o-examples.adoc

Lines changed: 16 additions & 9 deletions
@@ -23,7 +23,8 @@ image:image5.png[CPU time needed vs. Algorithm runtime as the input size increas
 The above chart shows how the running time of an algorithm is related to the amount of work the CPU has to perform. As you can see O(1) and O(log n) are very scalable. However, O(n^2^) and worst can make your computer run for years [big]#😵# on large datasets. We are going to give some examples so you can identify each one.

 == Constant
-
+(((Constant)))
+(((Runtime, Constant)))
 Represented as *O(1)*, it means that regardless of the input size the number of operations executed is always the same. Let’s see an example.

 [#constant-example]
@@ -44,7 +45,8 @@ Another more real life example is adding an element to the begining of a <<Linke
 As you can see, in both examples (array and linked list) if the input is a collection of 10 elements or 10M it would take the same amount of time to execute. You can't get any more performance than this!

 == Logarithmic
-
+(((Logarithmic)))
+(((Runtime, Logarithmic)))
 Represented in Big O notation as *O(log n)*, when an algorithm has this running time it means that as the size of the input grows the number of operations grows very slowly. Logarithmic algorithms are very scalable. One example is the *binary search*.
 indexterm:[Runtime, Logarithmic]

@@ -65,7 +67,8 @@ This binary search implementation is a recursive algorithm, which means that the
 Finding the runtime of recursive algorithms is not very obvious sometimes. It requires some tools like recursion trees or the https://adrianmejia.com/blog/2018/04/24/analysis-of-recursive-algorithms/[Master Theorem]. The `binarySearch` divides the input in half each time. As a rule of thumb, when you have an algorithm that divides the data in half on each call you are most likely in front of a logarithmic runtime: _O(log n)_.

 == Linear
-
+(((Linear)))
+(((Runtime, Linear)))
 Linear algorithms are one of the most common runtimes. It’s represented as *O(n)*. Usually, an algorithm has a linear running time when it iterates over all the elements in the input.

 [#linear-example]
@@ -90,7 +93,8 @@ As we learned before, the big O cares about the worst-case scenario, where we wo
 Space complexity is also *O(n)* since we are using an auxiliary data structure. We have a map that in the worst case (no duplicates) it will hold every word.

 == Linearithmic
-
+(((Linearithmic)))
+(((Runtime, Linearithmic)))
 An algorithm with a linearithmic runtime is represented as _O(n log n)_. This one is important because it is the best runtime for sorting! Let’s see the merge-sort.

 [#linearithmic-example]
@@ -125,8 +129,8 @@ image:image11.png[Mergesort visualization,width=500,height=600]
 How do we obtain the running time of the merge sort algorithm? The mergesort divides the array in half each time in the split phase, _log n_, and the merge function join each splits, _n_. The total work we have *O(n log n)*. There more formal ways to reach to this runtime like using the https://adrianmejia.com/blog/2018/04/24/analysis-of-recursive-algorithms/[Master Method] and https://www.cs.cornell.edu/courses/cs3110/2012sp/lectures/lec20-master/lec20.html[recursion trees].

 == Quadratic
-
-indexterm:[Runtime, Quadratic]
+(((Quadratic)))
+(((Runtime, Quadratic)))
 Running times that are quadratic, O(n^2^), are the ones to watch out for. They usually don’t scale well when they have a large amount of data to process.

 Usually, they have double-nested loops that where each one visits all or most elements in the input. One example of this is a naïve implementation to find duplicate words on an array.
@@ -149,7 +153,8 @@ As you can see, we have two nested loops causing the running time to be quadrati
 Let’s say you want to find a duplicated middle name in a phone directory book of a city of ~1 million people. If you use this quadratic solution you would have to wait for ~12 days to get an answer [big]#🐢#; while if you use the <<Linear, linear solution>> you will get the answer in seconds! [big]#🚀#

 == Cubic
-
+(((Cubic)))
+(((Runtime, Cubic)))
 Cubic *O(n^3^)* and higher polynomial functions usually involve many nested loops. As an example of a cubic algorithm is a multi-variable equation solver (using brute force):

 [#cubic-example]
@@ -174,7 +179,8 @@ WARNING: This just an example, there are better ways to solve multi-variable equ
 As you can see three nested loops usually translates to O(n^3^). If you have a four variable equation and four nested loops it would be O(n^4^) and so on when we have a runtime in the form of _O(n^c^)_, where _c > 1_, we can refer as a *polynomial runtime*.

 == Exponential
-
+(((Exponential)))
+(((Runtime, Exponential)))
 Exponential runtimes, O(2^n^), means that every time the input grows by one the number of operations doubles. Exponential programs are only usable for a tiny number of elements (<100) otherwise it might not finish on your lifetime. [big]#💀#

 Let’s do an example.
@@ -203,7 +209,8 @@ include::{codedir}/runtimes/07-sub-sets.js[tag=snippet]
 Every time the input grows by one the resulting array doubles. That’s why it has an *O(2^n^)*.

 == Factorial
-
+(((Factorial)))
+(((Runtime, Factorial)))
 Factorial runtime, O(n!), is not scalable at all. Even with input sizes of ~10 elements, it will take a couple of seconds to compute. It’s that slow! [big]*🍯🐝*

 .Factorial
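
As a side note, the halving behavior the Logarithmic section describes is easy to see in a standalone sketch (illustrative only; the book's actual `binarySearch` listing lives in the code directory):

[source, javascript]
----
// Sketch: each call discards half of the remaining elements,
// so the number of calls is at most log2(n) — O(log n).
function binarySearch(array, target, low = 0, high = array.length - 1) {
  if (low > high) return -1; // not found
  const mid = Math.floor((low + high) / 2);
  if (array[mid] === target) return mid;
  return array[mid] < target
    ? binarySearch(array, target, mid + 1, high) // search right half
    : binarySearch(array, target, low, mid - 1); // search left half
}

console.log(binarySearch([3, 4, 5, 10, 15, 30, 40], 15)); // 4
----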

‎book/chapters/bubble-sort.adoc

Lines changed: 2 additions & 0 deletions
@@ -7,6 +7,8 @@ Bubble sort is a simple sorting algorithm that "bubbles up" the biggest values t
 It's also called _sinking sort_ because the most significant values "sink" to the right side of the array.
 This algorithm is adaptive, which means that if the array is already sorted, it will take only _O(n)_ to "sort".
 However, if the array is entirely out of order, it will require _O(n^2^)_ to sort.
+(((Quadratic)))
+(((Runtime, Quadratic)))

 == Bubble Sort Implementation
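
The "adaptive" claim in this hunk comes from checking whether a pass swapped anything (a hedged sketch, not necessarily the book's listing):

[source, javascript]
----
// Sketch: the `swapped` flag makes bubble sort adaptive — one O(n) pass
// with no swaps proves the array is already sorted, so we stop early.
function bubbleSort(array) {
  for (let i = array.length - 1; i > 0; i--) {
    let swapped = false;
    for (let j = 0; j < i; j++) {
      if (array[j] > array[j + 1]) {
        [array[j], array[j + 1]] = [array[j + 1], array[j]]; // bubble up
        swapped = true;
      }
    }
    if (!swapped) break; // best case: already sorted, O(n)
  }
  return array;
}

console.log(bubbleSort([3, 1, 2])); // [ 1, 2, 3 ]
----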

‎book/chapters/chapter3.adoc

Lines changed: 5 additions & 0 deletions
@@ -11,20 +11,25 @@ include::tree.adoc[]


 // (g)
+<<<
 include::tree--binary-search-tree.adoc[]

+<<<
 include::tree--search.adoc[]

+<<<
 include::tree--self-balancing-rotations.adoc[]

 :leveloffset: +1

+<<<
 include::tree--avl.adoc[]

 :leveloffset: -1

 // (g)
 // include::map.adoc[]
+<<<
 include::map-intro.adoc[]

 :leveloffset: +1

‎book/chapters/divide-and-conquer--fibonacci.adoc

Lines changed: 2 additions & 1 deletion
@@ -52,7 +52,8 @@ graph G {
 ....

 In the diagram, we see the two recursive calls needed to compute each number. So if we follow the _O(branches^depth^)_ we get O(2^n^). [big]#🐢#
-
+(((Exponential)))
+(((Runtime, Exponential)))
 NOTE: Fibonacci is not a perfect binary tree since some nodes only have one child instead of two. The exact runtime for recursive Fibonacci is _O(1.6^n^)_ (still exponential time complexity).

 Exponential time complexity is pretty bad. Can we do better?
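
For reference, the naive recursive version that produces this call tree looks roughly like this (a sketch, not necessarily the book's exact listing):

[source, javascript]
----
// Sketch: two recursive branches per call, depth n, so roughly
// O(branches^depth) = O(2^n) calls for fib(n).
function fib(n) {
  if (n < 2) return n;            // base cases: fib(0) = 0, fib(1) = 1
  return fib(n - 1) + fib(n - 2); // two branches per call
}

console.log(fib(10)); // 55 — but fib(50) would run for a very long time
----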

‎book/chapters/dynamic-programming--fibonacci.adoc

Lines changed: 2 additions & 1 deletion
@@ -23,7 +23,8 @@ graph G {
 ....

 This graph looks pretty linear now. It's runtime _O(n)_!
-indexterm:[Runtime, Linear]
+(((Linear)))
+(((Runtime, Linear)))

 (((Memoization)))
 TIP: Saving previous results for later is a technique called "memoization". This is very common to optimize recursive algorithms with overlapping subproblems. It can make exponential algorithms linear!
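
A minimal memoized version (an assumed sketch; the book's listing may differ) shows why the call graph collapses to O(n):

[source, javascript]
----
// Sketch: the map caches each fib(i), so every value is computed once
// and overlapping subproblems become cheap lookups.
function fib(n, memo = new Map()) {
  if (n < 2) return n;
  if (memo.has(n)) return memo.get(n); // reuse a previous result
  const result = fib(n - 1, memo) + fib(n - 2, memo);
  memo.set(n, result);
  return result;
}

console.log(fib(50)); // 12586269025, computed in linear time
----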

‎book/chapters/graph.adoc

Lines changed: 2 additions & 1 deletion
@@ -1,5 +1,6 @@
 = Graph
-
+(((Graph)))
+(((Data Structures, Non-Linear, Graph)))
 Graphs are one of my favorite data structures.
 They have a lot of cool applications like optimizing routes, social network analysis to name a few. You are probably using apps that use graphs every day.
 First, let’s start with the basics.

‎book/chapters/insertion-sort.adoc

Lines changed: 3 additions & 1 deletion
@@ -31,4 +31,6 @@ include::{codedir}/algorithms/sorting/insertion-sort.js[tag=sort, indent=0]
 - <<Adaptive>>: [big]#✅# Yes
 - Time Complexity: [big]#⛔️# <<Quadratic>> _O(n^2^)_
 - Space Complexity: [big]#✅# <<Constant>> _O(1)_
-indexterm:[Runtime, Quadratic]
+
+(((Quadratic)))
+(((Runtime, Quadratic)))

‎book/chapters/linked-list.adoc

Lines changed: 5 additions & 2 deletions
@@ -1,5 +1,7 @@
 = Linked List
-
+(((Linked List)))
+(((List)))
+(((Data Structures, Linear, Linked List)))
 A list (or Linked List) is a linear data structure where each node is linked to another one.

 Linked Lists can be:
@@ -248,8 +250,9 @@ So far, we have seen two liner data structures with different use cases. Here’
 | Linked List (singly) ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(1) ^|O(1) ^|O(n) ^|*O(n)* ^|O(n)
 | Linked List (doubly) ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(1) ^|O(1) ^|O(n) ^|*O(1)* ^|O(n)
 |===
+(((Linear)))
+(((Runtime, Linear)))

-indexterm:[Runtime, Linear]
 If you compare the singly linked list vs. doubly linked list, you will notice that the main difference is deleting elements from the end. For a singly list is *O(n)*, while for a doubly list is *O(1)*.

 Comparing an array with a doubly linked list, both have different use cases:

‎book/chapters/map-hashmap-vs-treemap.adoc

Lines changed: 4 additions & 1 deletion
@@ -26,4 +26,7 @@ As we discussed so far, there are trade-off between the implementations
 |===
 {empty}* = Amortized run time. E.g. rehashing might affect run time to *O(n)*.

-indexterm:[Runtime, Logarithmic]
+(((Linear)))
+(((Runtime, Linear)))
+(((Logarithmic)))
+(((Runtime, Logarithmic)))

‎book/chapters/map-hashmap.adoc

Lines changed: 3 additions & 1 deletion
@@ -1,5 +1,7 @@
 = HashMap
-
+(((HashMap)))
+(((HashTable)))
+(((Data Structures, Non-Linear, HashMap)))
 A HashMap is a Map implementation. HashMaps are composed of two things:
 1) a hash function and
 2) a bucket array to store values.
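
Those two pieces fit together roughly like this (a simplified sketch with an assumed toy hash function; real implementations, including the book's, handle collisions and resizing more carefully):

[source, javascript]
----
// Toy sketch: hash the key to a bucket index, then store the pair there.
// Collisions are handled by chaining pairs inside each bucket.
class NaiveHashMap {
  constructor(size = 16) {
    this.buckets = new Array(size).fill(null).map(() => []);
  }

  hash(key) { // assumed toy hash: sum of character codes, modulo size
    let sum = 0;
    for (const char of String(key)) sum += char.codePointAt(0);
    return sum % this.buckets.length;
  }

  set(key, value) {
    const bucket = this.buckets[this.hash(key)];
    const entry = bucket.find(pair => pair[0] === key);
    if (entry) entry[1] = value; // overwrite existing key
    else bucket.push([key, value]);
    return this;
  }

  get(key) {
    const bucket = this.buckets[this.hash(key)];
    const entry = bucket.find(pair => pair[0] === key);
    return entry ? entry[1] : undefined;
  }
}

const map = new NaiveHashMap();
map.set('cat', 2).set('dog', 1);
console.log(map.get('cat')); // 2
----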

‎book/chapters/map-intro.adoc

Lines changed: 2 additions & 1 deletion
@@ -1,5 +1,6 @@
 = Map
-
+(((Map)))
+(((Data Structures, Non-Linear, Map)))
 A map is a data structure to store pairs of data: *key* and *value*. In an array, you can only store values. The array’s key is always the position index. However, in a *Map* the key can be whatever you want.

 IMPORTANT: Map is a data structure that _maps_ *keys* to *values*.

‎book/chapters/map-treemap.adoc

Lines changed: 2 additions & 1 deletion
@@ -1,5 +1,6 @@
 = TreeMap
-
+(((TreeMap)))
+(((Data Structures, Non-Linear, TreeMap)))
 A TreeMap is a Map implementation using Binary Search Trees.

 Implementing a Map with a tree, TreeMap, has a couple of advantages over a HashMap:

‎book/chapters/merge-sort.adoc

Lines changed: 4 additions & 0 deletions
@@ -50,3 +50,7 @@ Merge sort has an _O(n log n)_ running time. For more details about how to extr
 - Recursive: Yes
 - Time Complexity: [big]#✅# <<Linearithmic>> _O(n log n)_
 - Space Complexity: [big]#⚠️# <<Linear>> _O(n)_, use auxiliary memory
+
+(((Linearithmic)))
+(((Runtime, Linearithmic)))
+(((Space complexity, Linear)))

‎book/chapters/non-linear-data-structures-intro.adoc

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 [partintro]
 --
-Non-Linear data structures are everywhere whether you realize it or not. They are used in databases, Web (HTML DOM tree), search algorithms, finding the best route to get home and so on. We are going to learn the basic concepts and when to choose one over the other.
+Non-Linear data structures are everywhere whether you realize it or not. You can find them in databases, Web (HTML DOM tree), search algorithms, finding the best route to get home and many more uses. We are going to learn the basic concepts and when to choose one over the other.

 .In this chapter we are going to learn:
 - Exciting <<Graph>> data structure applications

‎book/chapters/queue.adoc

Lines changed: 2 additions & 1 deletion
@@ -1,5 +1,6 @@
 = Queue
-
+(((Queue)))
+(((Data Structures, Linear, Queue)))
 A queue is a linear data structure where the data flows in a *First-In-First-Out* (FIFO) manner.

 .Queue data structure is like a line of people: the First-in, is the First-out

‎book/chapters/quick-sort.adoc

Lines changed: 5 additions & 2 deletions
@@ -77,8 +77,11 @@ With the optimization, Quicksort has an _O(n log n)_ running time. Similar to th
 - <<Online>>: [big]#️❌# No, the pivot element can be choose at random.
 - Recursive: Yes
 - Time Complexity: [big]#✅# <<Linearithmic>> _O(n log n)_
-- Space Complexity: [big]#⚠️# <<Linear>> _O(n)_
-indexterm:[Space Complexity, Linear]
+- Space Complexity: [big]#✅# <<Constant>> _O(1)_
+
+(((Linearithmic)))
+(((Runtime, Linearithmic)))
+(((Space complexity, Constant)))

 // Resources:
 // https://www.khanacademy.org/computing/computer-science/algorithms/quick-sort/a/linear-time-partitioning

‎book/chapters/selection-sort.adoc

Lines changed: 2 additions & 1 deletion
@@ -51,4 +51,5 @@ There you have it, `2b` now comes before `2a`.

 // CAUTION: In practice, selection sort performance is the worst compared <<Bubble Sort>> and <<Insertion Sort>>. The only advantage of selection sort is that it minimizes the number of swaps. In case, that swapping is expensive, then it could make sense to use this one over the others.

-indexterm:[Runtime, Quadratic]
+(((Quadratic)))
+(((Runtime, Quadratic)))

‎book/chapters/set.adoc

Lines changed: 4 additions & 1 deletion
@@ -1,5 +1,6 @@
 = Set
-
+(((Set)))
+(((Data Structures, Non-Linear, Set)))
 A set is a data structure where duplicated entries are not allowed. Set is like an array with unique values.

 NOTE: JavaScript has already a built-in Set data structure.
@@ -212,6 +213,8 @@ rehash happens, it will take *O(n)* instead of *O(1)*. A `TreeSet` is always *O(
 {empty}* = Amortized run time. E.g. rehashing might affect run time to *O(n)*.

 indexterm:[Runtime, Linear]
+(((Logarithmic)))
+(((Runtime, Logarithmic)))
 To recap, HashSet and TreeSet will keep data without duplicates. The
 difference besides runtime is that:

‎book/chapters/stack.adoc

Lines changed: 2 additions & 1 deletion
@@ -1,5 +1,6 @@
 = Stack
-
+(((Stack)))
+(((Data Structures, Linear, Stack)))
 The stack is a data structure that restricts the way you add and remove data. It only allows you to insert and retrieve in a *Last-In-First-Out* (LIFO) fashion.

 An analogy is to think the stack is a rod and the data are discs. You can only take out the last one you put in.

‎book/chapters/tree--avl.adoc

Lines changed: 12 additions & 11 deletions
@@ -1,22 +1,23 @@
 = AVL Tree
-
+(((AVL Tree)))
+(((Tree, AVL)))
 AVL Tree is named after their inventors (**A**delson-**V**elsky and **L**andis).
-This self-balancing tree keep track of subtree sizes to know if a rebalance is needed or not.
+This self-balancing tree keeps track of subtree sizes to know if a rebalance is needed or not.
 We can compare the size of the left and right subtrees using a balance factor.

 [NOTE]
 ====

-The *balanced factor* on each node is calculated recurviely as follows:
+The *balanced factor* on each node is calculated recursively as follows:

 ----
 Balance Factor = (left subtree height) - (right subtree height)
 ----

 ====

-The implementation will got into the BST node.
-We will need two methods to calculate the left and right subtree, and with those we can get the balance factor.
+The implementation will go in the BST node class.
+We will need two methods to calculate the left and right subtree, and with those, we can get the balance factor.

 .Balance Factor methods on the BST node
 [source, javascript]
@@ -27,25 +28,25 @@ include::{codedir}/data-structures/trees/binary-tree-node.js[tag=avl, indent=0]

 == Implementing AVL Tree

-Implementing an AVL Tree is not too hard, since it builds upon what we did in the Binary Search Tree.
+Implementing an AVL Tree is not too hard since it builds upon what we did in the Binary Search Tree.

 .AVL Tree class
 [source, javascript]
 ----
 include::{codedir}/data-structures/trees/avl-tree.js[tag=AvlTree]
 ----

-As you can see, AVL tree inherits from the BST class.
-The insert and remove operations works the same as in the BST, except that at the end we call `balanceUptream`.
-This function makes balance the tree after every change if is needed. Let's see how it's implemented.
+As you can see, the AVL tree inherits from the BST class.
+The insert and remove operations work the same as in the BST, except that at the end we call `balanceUpstream`.
+This function checks if the tree is symmetrical after every change to the tree. If the tree went out of balance, it would execute the appropriated rotation to fix it.

 .Balance Upstream for AVL tree
 [source, javascript]
 ----
-include::{codedir}/data-structures/trees/avl-tree.js[tag=balanceUptream]
+include::{codedir}/data-structures/trees/avl-tree.js[tag=balanceUpstream]
 ----

-This function recurively goes from the modified node to the root checking if each node in between is balanced.
+This function recursively goes from the modified node to the root checking if each node in between is balanced.
 Now, let's examine how does the balancing works on AVL tree.

 .Balance method for AVL tree
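
The balance-factor formula from the NOTE block translates to something like this (a hedged sketch with assumed helper names; the book's `binary-tree-node.js` may differ):

[source, javascript]
----
// Sketch (assumed names): height of a subtree, and the balance factor
// as (left subtree height) - (right subtree height).
function height(node) {
  if (!node) return -1; // empty subtree: -1, so a leaf has height 0
  return 1 + Math.max(height(node.left), height(node.right));
}

function balanceFactor(node) {
  return height(node.left) - height(node.right);
}

// A balance factor outside [-1, 1] means the node needs a rotation.
----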

‎book/chapters/tree--binary-search-tree.adoc

Lines changed: 11 additions & 9 deletions
@@ -1,10 +1,12 @@
 = Binary Search Tree
-
-.The Binary Search Tree (BST) is a tree data structure that keeps the following constraints:
+(((Binary Search Tree)))
+(((BST)))
+(((Data Structures, Non-Linear, Binary Search Tree)))
+.To recap, the Binary Search Tree (BST) is a tree data structure that keeps the following constraints:
 * Each node must have at most two children. Usually referred to as "left" and "right".
 * All trees must a have a "root" node.
-* The order of nodes values must be: left child < parent < right child.
-* Nodes might need re-ordering after each insert/delete operation to keep the `left < parent < right` constraint.
+* The order of nodes values must be: `left child < parent < right child`.
+* Nodes might need re-ordering after each insert/delete operation to keep the `left <= parent < right` constraint.

 == Implementing a Binary Search Tree

@@ -42,8 +44,8 @@ With the methods `add` and `remove` we have to guarantee that our tree always ha
 === Inserting new elements in a BST

 .For inserting an element, in a BST, we have two scenarios:
-1. If the tree is empty (root element is null), we add the newly created node as root, and we are done!
-2. If the root is not null. Start from the root, then compare the node’s value against the new element. If the node has higher than a new item, we move to the right child, otherwise to the left. We check each node recursively until we find an empty spot where we can put the new element and keep the rule `right < parent < left`.
+1. If the tree is empty (root element is null), we add the newly created node as root, and that's it!
+2. If the root is not null. Start from it and compare the node’s value against the new element. If the node has higher than a new item, we move to the right child, otherwise to the left. We check each node recursively until we find an empty spot where we can put the new element and keep the rule `right < parent < left`.
 3. If we insert the same value multiple times, we don’t want duplicates. So, we can keep track of multiples using a duplicity counter.

 For instance, let’s say that we want to insert the values 19, 21, 10, 2, 8 in a BST:
@@ -84,7 +86,7 @@ Deleting a node from a BST have three cases.

 ==== Removing a leaf (Node with 0 children)

-Deleting a leaf is the easiest, we look for their parent and set the child to null.
+Deleting a leaf is the easiest; we look for their parent and set the child to null.

 .Removing node without children from a BST.
 image:image37.png[image,width=528,height=200]
@@ -110,7 +112,7 @@ Removing a parent of two children is the trickiest of all cases because we need
 image:image39.png[image,width=528,height=404]


-In the example, we delete the root node 19. This leaves the two orphans (node 10 and node 21). There are no more parents because node 19 was the *root* element. One way to solve this problem is to *combine* the left subtree (Node 10 and descendants) into the right subtree (node 21). The final result is node 21 is the new root.
+In the example, we delete the root node 19. This deletion leaves two orphans (node 10 and node 21). There are no more parents because node 19 was the *root* element. One way to solve this problem is to *combine* the left subtree (Node 10 and descendants) into the right subtree (node 21). The final result is node 21 is the new root.

 What would happen if node 21 had a left child (e.g., node 20)? Well, we would move node 10 and its descendants' bellow node 20.

@@ -126,7 +128,7 @@ include::{codedir}/data-structures/trees/binary-search-tree.js[tag=remove, inden
 <1> Try to find if the value exists on the tree.
 <2> If the value doesn’t exist we are done!
 <3> Create new subtree without the value to delete
-<4> Check the multiplicity (duplicates) and decrement the count in case we have multiple nodes with the same value
+<4> Check the multiplicity (duplicates) and decrement the count if we have multiple nodes with the same value
 <5> If the `nodeToRemove` was the root, then we move the removed node’s children as the new root.
 <6> If it was not the root, then we go to the deleted node’s parent and put their children there.
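
The two insertion scenarios in the second hunk look roughly like this in code (a hedged sketch with an assumed `{value, left, right}` node shape; the book's implementation also tracks parents and duplicates):

[source, javascript]
----
// Sketch: recursive BST insert keeping the left <= parent < right rule.
function insert(node, value) {
  if (!node) return { value, left: null, right: null }; // empty spot found
  if (value > node.value) node.right = insert(node.right, value);
  else node.left = insert(node.left, value); // duplicates also go left here
  return node;
}

let root = null;
for (const value of [19, 21, 10, 2, 8]) root = insert(root, value);
console.log(root.value, root.left.value, root.right.value); // 19 10 21
----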

‎book/chapters/tree--binary-tree-traversal.adoc

Lines changed: 13 additions & 10 deletions
@@ -1,10 +1,11 @@
 = Binary Tree Traversal
-
-As mentioned before, there are different ways to visit all the nodes or search for a value in a binary tree. On this section we are going to focus on depth-first tree traversal. The implementations are recursive since it's more elegant and concise. Let's explore them.
+(((Binary Tree Traversal)))
+As mentioned before, there are different ways to visit all the nodes or search for a value in a binary tree. On this section, we are going to focus on depth-first tree traversal.

 == In Order Traversal
-
-If you tree happens to be a binary search tree (BST), then could use "in order" traversal to get the values sorted in ascending order. To accomplish this, you have to visit the nodes in a `left-root-right` order.
+(((Tree Traversal, In Order)))
+(((In Order Traversal)))
+If your tree happens to be a binary search tree (BST), then you can use "in order" traversal to get the values sorted in ascending order. To accomplish this, you have to visit the nodes in a `left-root-right` order.

 If we have the following tree:
 ----
@@ -27,11 +28,12 @@ Check out the implementation:
 include::{codedir}/data-structures/trees/binary-search-tree.js[tag=inOrderTraversal, indent=0]
 ----

-This function goes recursively to the leftmost element and then yield that node, then we go to the right child (if any) and repeat the process. This will get us the values ordered.
+This function goes recursively to the leftmost element and then yield that node, then we go to the right child (if any) and repeat the process. This method will get us the values ordered.

 == Pre Order Traversal
-
-Pre-order traveral visits nodes in this order `root-left-right` recursively.
+(((Tree Traversal, Pre Order)))
+(((Pre Order Traversal)))
+Pre-order traversal visits nodes in this order `root-left-right` recursively.

 .Usage of pre-order traversal:
 - Create a copy of the tree.
@@ -58,10 +60,11 @@ If we have the following tree:
 Pre-order traverval will return `10, 5, 4, 3, 30, 15, 40`.

 == Post-order Traversal
+(((Tree Traversal, Post Order)))
+(((Post Order Traversal)))
+Post-order traversal goes to each node in this order `left-right-root` recursively.

-Post-order traveral goes to each node in this order `left-right-root` recursively.
-
-.Usages of the post-order tree traveral
+.Usages of the post-order tree traversal
 - Traversal is used to delete the tree because you visit the children before removing the parent.
 - Get the postfix expression of an expression tree used in the http://en.wikipedia.org/wiki/Reverse_Polish_notation[reverse polish notation].
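
The `left-root-right` order described above can be sketched as a generator (illustrative only; the real `inOrderTraversal` is included from `binary-search-tree.js`):

[source, javascript]
----
// Sketch: yield the left subtree, then the node, then the right subtree.
// For a BST this produces the values in ascending order.
function* inOrder(node) {
  if (!node) return;
  yield* inOrder(node.left);
  yield node.value;
  yield* inOrder(node.right);
}

// Assumed {value, left, right} node shape for illustration:
const tree = {
  value: 10,
  left: { value: 5, left: null, right: null },
  right: { value: 30, left: { value: 15, left: null, right: null }, right: null },
};
console.log([...inOrder(tree)]); // [ 5, 10, 15, 30 ]
----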

‎book/chapters/tree--search.adoc

Lines changed: 22 additions & 22 deletions
@@ -1,7 +1,7 @@
 = Tree Search & Traversal

 So far we covered, how to insert/delete/search values in a binary search tree (BST).
-However, not all binary trees are BST, so there are other ways to look for values or visit all nodes in a certain order.
+However, not all binary trees are BST, so there are other ways to look for values or visit all nodes in a particular order.

 If we have the following tree:
 ----
@@ -15,22 +15,25 @@ If we have the following tree:
 ----

 Depending on what traversal methods we used we will have a different visiting order.
-
+(((Tree Traversal)))
+(((Tree, Traversal)))
 .Tree traversal methods
 - Breadth-first traversal (a.k.a level order traversal): `10, 5, 30, 4, 15, 40, 3`
 - Depth-first traversal
 ** In-order (left-root-right): `3, 4, 5, 10, 15, 30, 40`
 ** Pre-order (root-left-right): `10, 5, 4, 3, 30, 15, 40`
 ** Post-order (left-right-root): `3, 4, 5, 15, 40, 30, 10`

-Why do we care? Well, there are certain problems that you solve more optimally using one or another traversal method. For instance to get the size of a subtree, finding maximums/minimums, and so on.
+Why do we care? Well, there are specific problems that you can solve more optimally using one or another traversal method. For instance to get the size of a subtree, finding maximums/minimums, and so on.

-Let's cover Breadth-first search (BFS) and Depth-first search (DFS).
+Let's cover the Breadth-first search (BFS) and Depth-first search (DFS).

 [Breadth First Search]
 == Breadth-First Search for Binary Tree
-
-Breadth-first search goeas wide (breadth) before going deep. Hence, the name. In other words, it goes level by level. It visits all the inmediate nodes or children and then move on to the children's children.
+(((BFS)))
+(((Breadth-First Search)))
+(((Tree, Breadth-First Search)))
+The breadth-first search goes wide (breadth) before going deep. Hence, the name. In other words, it goes level by level. It visits all the immediate nodes or children and then moves on to the children's children.
 Let's how can we implement it!

 .Breath-First Search (BFS) Implementation
@@ -41,12 +44,14 @@ include::{codedir}/data-structures/trees/binary-search-tree.js[tag=bfs,indent=0]

 As you see, the BFS uses a <<Queue>> data structure. We enqueue all the children of the current node and then dequeue them as we visit them.

-Note the asterisk (`*`) in front of the function means that this function is a generator that yield values.
+Note the asterisk (`*`) in front of the function means that this function is a generator that yields values.

+(((JavaScript Notes, Generators)))
+(((Generators)))
 .JavaScript Generators
 ****

-JavaScript generators were added as part of ES6, they allow process possibly expensive operations one by one. You can convert any function into a generator by adding the asterisk in front and `yield`ing a value.
+JavaScript generators were added as part of ES6; they allow process possibly expensive operations one by one. You can convert any function into a generator by adding the asterisk in front and `yield`ing a value.

 Then you can use `next()` to get the value and also `done` to know if it's the last value. Here are some examples:

@@ -79,27 +84,29 @@ console.log(Array.from(dummyIdMaker())); // [0, 1, 2]


 == Depth-First Search for Binary Tree
-
-Depth-First search goes deep before going wide. It means, that starting for the root it goes as deep as it can until it found a leaf node (node without children), then it visits all the remaing nodes that were in the path.
+(((DFS)))
+(((Depth-First Search)))
+(((Tree, Depth-First Search)))
+Depth-First search goes deep (depth) before going wide. It means that starting for the root it goes as deep as it can until it found a leaf node (node without children), then it visits all the remaining nodes that were in the path.

 .Depth-First Search (DFS) Implementation with a Stack
 [source, javascript]
 ----
 include::{codedir}/data-structures/trees/binary-search-tree.js[tag=dfs,indent=0]
 ----

-This is a iterative implementation of a DFS using an <<Stack>>.
-It's almost identical to the BFS but instead of using a <<Queue>> we usa a Stack.
+This is an iterative implementation of a DFS using an <<Stack>>.
+It's almost identical to the BFS, but instead of using a <<Queue>> we use a Stack.
 We can also implement it as recursive functions are we are going to see in the <<Binary Tree Traversal>> section.

 == Depth-First Search vs. Breadth-First Search

-We can see visually the difference on how the DFS and BFS search for nodes:
+We can see visually the difference between how the DFS and BFS search for nodes:

 .Depth-First Search vs. Breadth-First Search
 image:depth-first-search-dfs-breadth-first-search-bfs.jpg[]

-As you can see the DFS in two iterations is already at one of the farthest node from the root while BFS search nearby nodes first.
+As you can see the DFS in two iterations is already at one of the farthest nodes from the root while BFS search nearby nodes first.

 .Use DFS when:
 - The node you are looking for is likely to be *far* from the root.
@@ -108,14 +115,7 @@ As you can see the DFS in two iterations is already at one of the farthest node
 - The node you are looking for is *nearby* the root.

 :leveloffset: +1
-
+<<<
 include::tree--binary-tree-traversal.adoc[]

 :leveloffset: -1
-
-
-
-
-
-
-
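
The queue-based level-by-level idea can be sketched as a generator as well (illustrative only; the real `bfs` is included from `binary-search-tree.js`):

[source, javascript]
----
// Sketch: visit level by level using a FIFO queue. A real implementation
// would use a proper Queue, since Array#shift is O(n) per dequeue.
function* bfs(root) {
  const queue = [root];
  while (queue.length) {
    const node = queue.shift(); // dequeue the oldest node
    if (!node) continue;        // skip empty children
    yield node.value;
    queue.push(node.left, node.right); // enqueue children
  }
}

// Assumed {value, left, right} node shape for illustration:
const tree = {
  value: 10,
  left: { value: 5, left: null, right: null },
  right: { value: 30, left: null, right: null },
};
console.log([...bfs(tree)]); // [ 10, 5, 30 ]
----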

‎book/chapters/tree--self-balancing-rotations.adoc

Lines changed: 12 additions & 12 deletions
@@ -1,11 +1,11 @@
 = Self-balancing Binary Search Trees

-Binary Search Trees (BST) are a great data structure to find elements very fast _O(n log n)_.
-However, when the BST branches has different branch sizes then the performance suffers.
-In the worst case, all nodes can go to one side (e.g. right) and then the search time would be linear.
+Binary Search Trees (BST) are an excellent data structure to find elements very fast _O(log n)_.
+However, when the BST branches have different branch sizes, then the performance suffers.
+In the worst case, all nodes can go to one side (e.g., right) and then the search time would be linear.
 At this point searching element won't be any better on that tree than an array or linked list. Yikes!

-Self-balanced trees will automatically balanced the tree when an element is inserted to keep search performace.
+Self-balanced trees will automatically rebalance the tree when an element is inserted to keep search performance.
 We balance a tree by making the height (distance from a node to the root) of any leaf on the tree as similar as possible.

 .From unbalanced BST to balanced BST
@@ -27,18 +27,18 @@ To be more specific we rotated node `1` to the left to balance the tree.
 Let's examine all the possible rotation we can do to balance a tree.

 == Tree Rotations
-
+(((Tree Rotations)))
 We can do single rotations left and right and also we can do double rotations.
 Let's go one by one.

 === Single Right Rotation

-Right rotation moves a node on the right as a child of other node.
+Right rotation moves a node on the right as a child of another node.

 Take a look at the `@example` in the code below.
 As you can see we have an unbalanced tree `4-3-2-1`.
 We want to balance the tree, for that we need to do a right rotation of node 3.
-So, the node 3 is moved as the right child of the previous child.
+So, we move node 3 as the right child of the previous child.

 .Single right rotation implementation
 [source, javascript]
@@ -54,7 +54,7 @@ include::{codedir}/data-structures/trees/tree-rotations.js[tag=rightRotation]
 The `swapParentChild` as it name says, swap the children.
 For our example, it swaps `node 4`'s left children from `node 3` to `node 2`.

-Take a look at the implementation
+Take a look at the implementation.

 .Swap Parent and Child Implementation
 [source, javascript]
@@ -73,7 +73,7 @@ After `swapParentChild`, we have the following:

 Still not quite what we want.
 So, `newParent.setRightAndUpdateParent(node)` will make `node 3` the right child of `node 2`.
-Finally, we remove left child of `node 3` to be `null`.
+Finally, we remove the left child of `node 3` to be `null`.

 ----
 4
@@ -109,7 +109,7 @@ If you are curious about the `setRightAndUpdateParent` and `setLeftAndUpdatePare
 include::{codedir}/data-structures/trees/binary-tree-node.js[tag=setAndUpdateParent]
 ----

-You can also checkout the full
+You can also check out the full
 https://github.com/amejiarosario/dsa.js/blob/adfd8a660bbe0a7068fd7881aff9f51bdb9f92ae/src/data-structures/trees/binary-tree-node.js#L9[binary tree node implementation].

 === Left Right Rotation
@@ -122,7 +122,7 @@ This time are we going to do a double rotation.
 include::{codedir}/data-structures/trees/tree-rotations.js[tag=leftRightRotation]
 ----

-As you can see we do a left and then a right rotation. This is also called `LR rotation`
+As you can see we do a left and then a right rotation. This rotation is also known as `LR rotation`

 === Right Left Rotation

@@ -134,7 +134,7 @@ Very similar to `leftRightRotation`. The difference is that we rotate right and
 include::{codedir}/data-structures/trees/tree-rotations.js[tag=rightLeftRotation]
 ----

-This rotation is also refered as `RL rotation`.
+This rotation is also referred to as `RL rotation`.

 == Self-balancing trees implementations

‎book/chapters/tree.adoc

Lines changed: 13 additions & 7 deletions
@@ -1,5 +1,6 @@
 = Tree
-
+(((Tree)))
+(((Data Structures, Non-Linear, Tree)))
 A tree is a non-linear data structure where a node can have zero or more connections. The topmost node in a tree is called *root*. The linked nodes to the root are called *children* or *descendants*.

 .Tree Data Structure: root node and descendants.
@@ -50,13 +51,14 @@ image:image31.jpg[image]
 There are different kinds of trees depending on the restrictions. E.g. The trees that have two children or less are called *binary tree*, while trees with at most three children are called *Ternary Tree*. Since binary trees are most common we are going to cover them here and others in another chapter.

 === Binary Tree
-
+(((Binary Tree)))
+(((Data Structures, Non-Linear, Binary Tree)))
 The binary restricts the nodes to have at most two children. Trees, in general, can have 3, 4, 23 or more, but not binary trees.

 .Binary tree has at most 2 children while non-binary trees can have more.
 image:image32.png[image,width=321,height=193]

-Binary trees are the one of the most common types and it's used to build other data structures and applications.
+Binary trees are one of the most used kinds of tree, and they are used to build other data structures.

 .Binary Tree Applications
 - <<Map>>
@@ -66,7 +68,8 @@ Binary trees are the one of the most common types and it's used to build other d


 === Binary Search Tree (BST)
-
+(((Binary Search Tree)))
+(((Data Structures, Non-Linear, Binary Search Tree)))
 The Binary Search Tree (BST) is a specialization of the binary tree. BST has the same restriction as a binary tree; each node has at most two children. However, there’s another restriction: the values are ordered. It means the left child’s value has to be less or equal than the parent. In turn, the right child’s value has to be bigger than the parent.

 > BST: left ≤ parent < right
@@ -76,16 +79,19 @@ image:image33.png[image,width=348,height=189]


 === Binary Heap
-
+(((Binary Heap)))
+(((Heap)))
+(((Max-Heap)))
+(((Min-Heap)))
+(((Data Structures, Non-Linear, Binary Heap)))
 The heap (max-heap) is a type of binary tree where the children's values are higher than the parent. Opposed to the BST, the left child doesn’t have to be smaller than the right child.

 .Heap vs BST
 image:image34.png[image,width=325,height=176]

-
 The (max) heap has the maximum value in the root, while BST doesn’t.

-There is two kind of heaps: min-heap and max-heap.
+There are two kinds of heaps: min-heap and max-heap.
 For a *max-heap*, the root has the highest value. The heap guarantee that as you move away from the root, the values get smaller. The opposite is true for a *min-heap*. In a min-heap, the lowest value is at the root, and as you go down the lower to the descendants, they will keep increasing values.

 .Max-heap keeps the highest value at the top while min-heap keep the lowest at the root.

‎src/data-structures/trees/avl-tree.js

Lines changed: 5 additions & 5 deletions
@@ -39,13 +39,13 @@ function balance(node) {
 }
 // end::balance[]

-// tag::balanceUptream[]
+// tag::balanceUpstream[]
 /**
  * Bubbles up balancing nodes a their parents
  *
  * @param {TreeNode} node
  */
-function balanceUptream(node) {
+function balanceUpstream(node) {
   let current = node;
   let newParent;
   while (current) {
@@ -54,7 +54,7 @@ function balanceUptream(node) {
   }
   return newParent;
 }
-// end::balanceUptream[]
+// end::balanceUpstream[]

 // tag::AvlTree[]
 /**
@@ -68,7 +68,7 @@ class AvlTree extends BinarySearchTree {
   */
   add(value) {
     const node = super.add(value);
-    this.root = balanceUptream(node);
+    this.root = balanceUpstream(node);
     return node;
   }

@@ -80,7 +80,7 @@
     const node = super.find(value);
     if (node) {
       const found = super.remove(value);
-      this.root = balanceUptream(node.parent);
+      this.root = balanceUpstream(node.parent);
       return found;
     }
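
A quick usage sketch of the renamed API (hypothetical values and an assumed require path, assuming this file exports `AvlTree`):

[source, javascript]
----
// Hypothetical usage: values that would skew a plain BST to one side
// trigger balanceUpstream, keeping the AVL tree at O(log n) height.
const AvlTree = require('./avl-tree'); // assumed path and export
const tree = new AvlTree();

[1, 2, 3, 4].forEach(value => tree.add(value)); // rotations keep it balanced
tree.remove(1); // remove re-balances via balanceUpstream(node.parent)
----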
