Commit 618daba (committed Jun 23, 2019)
fix images for github

35 files changed: +181 −109 lines

‎Find-the-largest-sum.png

15.9 KB → 17.4 KB

Words-Permutations.png

46.2 KB

‎book/content/part01/algorithms-analysis.asc

Lines changed: 7 additions & 2 deletions

@@ -1,11 +1,16 @@
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
+
 === Fundamentals of Algorithms Analysis

 Probably you are reading this book because you want to write better and faster code.
 How can you do that? Can you time how long it takes to run a program? Of course, you can!
 [big]#⏱#
 However, if you run the same program on a smartwatch, cellphone or desktop computer, it will take different times.

-image:images/image3.png[image,width=528,height=137]
+image:image3.png[image,width=528,height=137]

 Wouldn't it be great if we can compare algorithms regardless of the hardware where we run them?
 That's what *time complexity* is for!
@@ -88,7 +93,7 @@ Time complexity, in computer science, is a function that describes the number of
 How do you get a function that gives you the number of operations that will be executed? Well, we count line by line and mind code inside loops. Let's do an example to explain this point. For instance, we have a function to find the minimum value on an array called `getMin`.

 .Translating lines of code to an approximate number of operations
-image:images/image4.png[Operations per line]
+image:image4.png[Operations per line]

 Assuming that each line of code is an operation, we get the following:

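The `getMin` function the hunk above mentions is pulled from the book's `src` tree; a minimal stand-in (an assumed implementation, not the book's actual code) makes the line-by-line operation count concrete:

```javascript
// Hypothetical sketch of getMin: returns the smallest value in an array.
// Counting operations line by line: the loop body executes once per
// element, everything else is constant, so the total work grows
// linearly with the input size, i.e., O(n).
function getMin(array) {
  let min = array[0];                       // 1 operation
  for (let i = 1; i < array.length; i++) {  // runs n - 1 times
    if (array[i] < min) min = array[i];     // 1 comparison per iteration
  }
  return min;                               // 1 operation
}

console.log(getMin([5, 2, 9, 1, 7])); // → 1
```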
‎book/content/part01/big-o-examples.asc

Lines changed: 16 additions & 11 deletions

@@ -1,3 +1,8 @@
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
+
 === Big O examples

 There are many kinds of algorithms. Most of them fall into one of the eight of the time complexities that we are going to explore in this chapter.
@@ -17,7 +22,7 @@ We a going to provide examples for each one of them.
 Before we dive in, here’s a plot with all of them.

 .CPU operations vs. Algorithm runtime as the input size grows
-image:images/image5.png[CPU time needed vs. Algorithm runtime as the input size increases]
+image:image5.png[CPU time needed vs. Algorithm runtime as the input size increases]

 The above chart shows how the running time of an algorithm is related to the amount of work the CPU has to perform. As you can see O(1) and O(log n) are very scalable. However, O(n^2^) and worst can make your computer run for years [big]#😵# on large datasets. We are going to give some examples so you can identify each one.

@@ -33,7 +38,7 @@ Represented as *O(1)*, it means that regardless of the input size the number of
 Let's implement a function that finds out if an array is empty or not.

 //.is-empty.js
-//image:images/image6.png[image,width=528,height=401]
+//image:image6.png[image,width=528,height=401]

 [source, javascript]
 ----
@@ -56,7 +61,7 @@ indexterm:[Runtime, Logarithmic]

 The binary search only works for sorted lists. It starts searching for an element on the middle of the array and then it moves to the right or left depending if the value you are looking for is bigger or smaller.

-// image:images/image7.png[image,width=528,height=437]
+// image:image7.png[image,width=528,height=437]

 [source, javascript]
 ----
@@ -78,7 +83,7 @@ Linear algorithms are one of the most common runtimes. It’s represented as *O(

 Let’s say that we want to find duplicate elements in an array. What’s the first implementation that comes to mind? Check out this implementation:

-// image:images/image8.png[image,width=528,height=383]
+// image:image8.png[image,width=528,height=383]

 [source, javascript]
 ----
@@ -105,7 +110,7 @@ An algorithm with a linearithmic runtime is represented as _O(n log n)_. This on

 The ((Merge Sort)), like its name indicates, has two functions merge and sort. Let’s start with the sort function:

-// image:images/image9.png[image,width=528,height=383]
+// image:image9.png[image,width=528,height=383]

 .Sort part of the mergeSort
 [source, javascript]
@@ -116,7 +121,7 @@ include::{codedir}/algorithms/sorting/merge-sort.js[tag=splitSort]
 <2> We divide the array into two halves.
 <3> Merge the two parts recursively with the `merge` function explained below

-// image:images/image10.png[image,width=528,height=380]
+// image:image10.png[image,width=528,height=380]

 .Merge part of the mergeSort
 [source, javascript]
@@ -127,7 +132,7 @@ include::{codedir}/algorithms/sorting/merge-sort.js[tag=merge]
 The merge function combines two sorted arrays in ascending order. Let’s say that we want to sort the array `[9, 2, 5, 1, 7, 6]`. In the following illustration, you can see what each function does.

 .Mergesort visualization. Shows the split, sort and merge steps
-image:images/image11.png[Mergesort visualization,width=500,height=600]
+image:image11.png[Mergesort visualization,width=500,height=600]

 How do we obtain the running time of the merge sort algorithm? The mergesort divides the array in half each time in the split phase, _log n_, and the merge function join each splits, _n_. The total work we have *O(n log n)*. There more formal ways to reach to this runtime like using the https://adrianmejia.com/blog/2018/04/24/analysis-of-recursive-algorithms/[Master Method] and https://www.cs.cornell.edu/courses/cs3110/2012sp/lectures/lec20-master/lec20.html[recursion trees].

@@ -144,7 +149,7 @@ Usually, they have double-nested loops that where each one visits all or most el

 If you remember we have solved this problem more efficiently on the <<part01-algorithms-analysis#linear, Linear>> section. We solved this problem before using an _O(n)_, let’s solve it this time with an _O(n^2^)_:

-// image:images/image12.png[image,width=527,height=389]
+// image:image12.png[image,width=527,height=389]

 .Naïve implementation of has duplicates function
 [source, javascript]
@@ -171,7 +176,7 @@ _3x + 9y + 8z = 79_

 A naïve approach to solve this will be the following program:

-//image:images/image13.png[image,width=528,height=448]
+//image:image13.png[image,width=528,height=448]

 .Naïve implementation of multi-variable equation solver
 [source, javascript]
@@ -196,7 +201,7 @@ Let’s do an example.

 Finding all distinct subsets of a given set can be implemented as follows:

-// image:images/image14.png[image,width=528,height=401]
+// image:image14.png[image,width=528,height=401]

 .Subsets in a Set
 [source, javascript]
@@ -238,7 +243,7 @@ A factorial is the multiplication of all the numbers less than itself down to 1.
 One classic example of an _O(n!)_ algorithm is finding all the different words that can be formed with a given set of letters.

 .Word's permutations
-// image:images/image15.png[image,width=528,height=377]
+// image:image15.png[image,width=528,height=377]
 [source, javascript]
 ----
 include::{codedir}/runtimes/08-permutations.js[tag=snippet]

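The O(log n) example in the hunks above pulls binary search from the book's source tree via `include::`. A self-contained sketch of the same idea (an assumed implementation, not the included file) is:

```javascript
// Iterative binary search on a sorted array: each step halves the
// search window, so at most O(log n) comparisons are needed.
function binarySearch(sorted, target) {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid; // found it
    if (sorted[mid] < target) lo = mid + 1; // search the right half
    else hi = mid - 1;                      // search the left half
  }
  return -1; // target is not in the array
}

console.log(binarySearch([2, 5, 8, 12, 16], 12)); // → 3
```

Doubling the array size adds only one extra iteration, which is why the runtime curve in the plot above stays nearly flat.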
‎book/content/part02/array-vs-list-vs-queue-vs-stack.asc

Lines changed: 5 additions & 0 deletions

@@ -1,3 +1,8 @@
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
+
 === Array vs. Linked List & Queue vs. Stack

 In this part of the book, we explored the most used linear data structures such as Arrays, Linked Lists, Stacks and Queues. We implemented them and discussed the runtime of their operations.

‎book/content/part02/array.asc

Lines changed: 5 additions & 5 deletions

@@ -1,7 +1,7 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]

 [[array]]
 === Array
@@ -23,7 +23,7 @@ Some programming languages have fixed size arrays like Java and C++. Fixed size
 Arrays look like this:

 .Array representation: each value is accessed through an index.
-image:images/image16.png[image,width=388,height=110]
+image:image16.png[image,width=388,height=110]

 Arrays are a sequential collection of elements that can be accessed randomly using an index. Let’s take a look into the different operations that we can do with arrays.

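To make the index-based random access in the array diff above concrete, a quick sketch (generic JavaScript, not the book's `src` code):

```javascript
// Arrays give O(1) random access: an index maps directly to a position,
// so reading any element costs the same regardless of the array's size.
const animals = ['art', 'dog', 'cat'];

console.log(animals[0]); // → 'art'
console.log(animals[2]); // → 'cat'

// Inserting at the beginning shifts every element over, so it is O(n).
animals.unshift('ant');
console.log(animals); // → ['ant', 'art', 'dog', 'cat']
```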
‎book/content/part02/linked-list.asc

Lines changed: 13 additions & 12 deletions

@@ -1,7 +1,8 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
+
 [[linked-list]]
 === Linked List
 (((Linked List)))
@@ -20,7 +21,7 @@ A list (or Linked List) is a linear data structure where each node is "linked" t
 Each element or node is *connected* to the next one by a reference. When a node only has one connection it's called *singly linked list*:

 .Singly Linked List Representation: each node has a reference (blue arrow) to the next one.
-image:images/image19.png[image,width=498,height=97]
+image:image19.png[image,width=498,height=97]

 Usually, a Linked List is referenced by the first element in called *head* (or *root* node). For instance, if you want to get the `cat` element from the example above, then the only way to get there is using the `next` field on the head node. You would get `art` first, then use the next field recursively until you eventually get the `cat` element.

@@ -30,7 +31,7 @@ Usually, a Linked List is referenced by the first element in called *head* (or *
 When each node has a connection to the `next` item and also the `previous` one, then we have a *doubly linked list*.

 .Doubly Linked List: each node has a reference to the next and previous element.
-image:images/image20.png[image,width=528,height=74]
+image:image20.png[image,width=528,height=74]

 With a doubly list you can not only move forward but also backward. If you keep the reference to the last element (`cat`) you can step back and reach the middle part.

@@ -120,7 +121,7 @@ Similar to the array, with a linked list you can add elements at the beginning,
 We are going to use the `Node` class to create a new element and stick it at the beginning of the list as shown below.

 .Insert at the beginning by linking the new node with the current first node.
-image:images/image23.png[image,width=498,height=217]
+image:image23.png[image,width=498,height=217]


 To insert at the beginning, we create a new node with the next reference to the current first node. Then we make first the new node. In code, it would look something like this:
@@ -139,7 +140,7 @@ As you can see, we create a new node and make it the first one.
 Appending an element at the end of the list can be done very effectively if we have a pointer to the `last` item in the list. Otherwise, you would have to iterate through the whole list.

 .Add element to the end of the linked list
-image:images/image24.png[image,width=498,height=208]
+image:image24.png[image,width=498,height=208]

 .Linked List's add to the end of the list implementation
 [source, javascript]
@@ -170,7 +171,7 @@ art <-> dog <-> cat
 We want to insert the `new` node in the 2^nd^ position. For that we first create the "new" node and update the references around it.

 .Inserting node in the middle of a doubly linked list.
-image:images/image25.png[image,width=528,height=358]
+image:image25.png[image,width=528,height=358]

 Take a look into the implementation of https://github.com/amejiarosario/dsa.js/blob/master/src/data-structures/linked-lists/linked-list.js#L83[LinkedList.add]:

@@ -198,7 +199,7 @@ Deleting is an interesting one. We don’t delete an element; we remove all refe
 Deleting the first element (or head) is a matter of removing all references to it.

 .Deleting an element from the head of the list
-image:images/image26.png[image,width=528,height=74]
+image:image26.png[image,width=528,height=74]

 For instance, to remove the head (“art”) node, we change the variable `first` to point to the second node “dog”. We also remove the variable `previous` from the "dog" node, so it doesn't point to the “art” node. The garbage collector will get rid of the “art” node when it seems nothing is using it anymore.

@@ -215,7 +216,7 @@ As you can see, when we want to remove the first node we make the 2nd element th
 Removing the last element from the list would require to iterate from the head until we find the last one, that’s O(n). But, If we have a reference to the last element, which we do, We can do it in _O(1)_ instead!

 .Removing last element from the list using the last reference.
-image:images/image27.png[image,width=528,height=221]
+image:image27.png[image,width=528,height=221]


 For instance, if we want to remove the last node “cat”. We use the last pointer to avoid iterating through the whole list. We check `last.previous` to get the “dog” node and make it the new `last` and remove its next reference to “cat”. Since nothing is pointing to “cat” then is out of the list and eventually is deleted from memory by the garbage collector.
@@ -234,7 +235,7 @@ The code is very similar to `removeFirst`, but instead of first we update `last`
 To remove a node from the middle, we make the surrounding nodes to bypass the one we want to delete.

 .Remove the middle node
-image:images/image28.png[image,width=528,height=259]
+image:image28.png[image,width=528,height=259]


 In the illustration, we are removing the middle node “dog” by making art’s `next` variable to point to cat and cat’s `previous` to be “art” totally bypassing “dog”.

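The head-insertion steps described in the hunks above (create a node, point its `next` at the current first node, make it the new first) can be sketched as follows, assuming a minimal `Node`/`LinkedList` shape (hypothetical names, not the book's implementation):

```javascript
// Minimal singly linked list sketch showing O(1) insertion at the head.
class Node {
  constructor(value) {
    this.value = value;
    this.next = null;
  }
}

class LinkedList {
  constructor() {
    this.first = null; // head of the list
  }

  addFirst(value) {
    const node = new Node(value);
    node.next = this.first; // link the new node to the current first
    this.first = node;      // the new node becomes the head
    return node;
  }
}

const list = new LinkedList();
list.addFirst('cat');
list.addFirst('dog');
list.addFirst('art');
console.log(list.first.value);      // → 'art'
console.log(list.first.next.value); // → 'dog'
```

No elements are shifted, which is why inserting at the head is O(1), unlike the array case.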
‎book/content/part02/queue.asc

Lines changed: 5 additions & 5 deletions

@@ -1,7 +1,7 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]

 [[queue]]
 === Queue
@@ -12,7 +12,7 @@
 A queue is a linear data structure where the data flows in a *First-In-First-Out* (FIFO) manner.

 .Queue data structure is like a line of people: the First-in, is the First-out
-image:images/image30.png[image,width=528,height=171]
+image:image30.png[image,width=528,height=171]

 A queue is like a line of people at the bank; the person that arrived first is the first to go out as well.

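The FIFO behavior pictured in the queue diff above can be sketched with a plain array (illustrative only: `Array.prototype.shift` is O(n), so a production queue would use a linked list or index bookkeeping instead):

```javascript
// FIFO queue sketch: enqueue at the back, dequeue from the front,
// so elements leave in the order they arrived.
class Queue {
  constructor() {
    this.items = [];
  }

  enqueue(item) { // join the back of the line
    this.items.push(item);
    return this;
  }

  dequeue() { // first in, first out
    return this.items.shift();
  }
}

const line = new Queue();
line.enqueue('first').enqueue('second');
console.log(line.dequeue()); // → 'first'
```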
‎book/content/part02/stack.asc

Lines changed: 5 additions & 5 deletions

@@ -1,7 +1,7 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]

 [[stack]]
 === Stack
@@ -14,7 +14,7 @@ The stack is a data structure that restricts the way you add and remove data. It
 An analogy is to think the stack is a rod and the data are discs. You can only take out the last one you put in.

 .Stack data structure is like a stack of disks: the last element in is the first element out
-image:images/image29.png[image,width=240,height=238]
+image:image29.png[image,width=240,height=238]

 // #Change image from https://www.khanacademy.org/computing/computer-science/algorithms/towers-of-hanoi/a/towers-of-hanoi[Khan Academy]#

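The LIFO restriction the stack diff describes can be sketched as (a generic example, not the book's implementation):

```javascript
// LIFO stack sketch: push and pop both work on the same end of the
// array, so each operation is O(1) and the last element in is the
// first one out.
class Stack {
  constructor() {
    this.items = [];
  }

  push(item) {
    this.items.push(item);
    return this;
  }

  pop() {
    return this.items.pop();
  }
}

const rod = new Stack();
rod.push('disc1').push('disc2');
console.log(rod.pop()); // → 'disc2' (last in, first out)
```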
‎book/content/part03/binary-search-tree-traversal.asc

Lines changed: 4 additions & 4 deletions

@@ -1,7 +1,7 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]

 === Binary Tree Traversal
 (((Binary Tree Traversal)))

‎book/content/part03/binary-search-tree.asc

Lines changed: 9 additions & 9 deletions

@@ -1,7 +1,7 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]

 === Binary Search Tree
 (((Binary Search Tree)))
@@ -57,7 +57,7 @@ With the methods `add` and `remove` we have to guarantee that our tree always ha
 For instance, let’s say that we want to insert the values 19, 21, 10, 2, 8 in a BST:

 .Inserting values on a BST.
-image:images/image36.png[image,width=528,height=329]
+image:image36.png[image,width=528,height=329]

 In the last box of the image above, when we are inserting node 18, we start by the root (19). Since 18 is less than 19, then we move left. Node 18 is greater than 10, so we move right. There’s an empty spot, and we place it there. Let’s code it up:

@@ -95,7 +95,7 @@ Deleting a node from a BST have three cases.
 Deleting a leaf is the easiest; we look for their parent and set the child to null.

 .Removing node without children from a BST.
-image:images/image37.png[image,width=528,height=200]
+image:image37.png[image,width=528,height=200]


 Node 18, will be hanging around until the garbage collector is run. However, there’s no node referencing to it so it won’t be reachable from the tree anymore.
@@ -105,7 +105,7 @@ Node 18, will be hanging around until the garbage collector is run. However, the
 Removing a parent is not as easy since you need to find new parents for its children.

 .Removing node with 1 children from a BST.
-image:images/image38.png[image,width=528,height=192]
+image:image38.png[image,width=528,height=192]


 In the example, we removed node `10` from the tree, so its child (node 2) needs a new parent. We made node 19 the new parent for node 2.
@@ -115,7 +115,7 @@ In the example, we removed node `10` from the tree, so its child (node 2) needs
 Removing a parent of two children is the trickiest of all cases because we need to find new parents for two children. (This sentence sounds tragic out of context 😂)

 .Removing node with two children from a BST.
-image:images/image39.png[image,width=528,height=404]
+image:image39.png[image,width=528,height=404]


 In the example, we delete the root node 19. This deletion leaves two orphans (node 10 and node 21). There are no more parents because node 19 was the *root* element. One way to solve this problem is to *combine* the left subtree (Node 10 and descendants) into the right subtree (node 21). The final result is node 21 is the new root.
@@ -163,7 +163,7 @@ That’s all we need to remove elements from a BST. Check out the complete BST i
 As we insert and remove nodes from a BST we could end up like the tree on the left:

 .Balanced vs. Unbalanced Tree.
-image:images/image40.png[image,width=454,height=201]
+image:image40.png[image,width=454,height=201]

 The tree on the left is unbalanced. It looks like a Linked List and has the same runtime! Searching for an element would be *O(n)*, yikes! However, on a balanced tree, the search time is *O(log n)*, which is pretty good! That’s why we always want to keep the tree balanced. In further chapters, we are going to explore how to keep a tree balanced after each insert/delete.

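The insertion walk the BST hunks describe (smaller or equal goes left, greater goes right, place the node at the first empty spot) can be sketched like this, using an assumed minimal shape rather than the book's `src` implementation:

```javascript
// Minimal BST insertion sketch: compare against each node and descend
// left (value <= node) or right (value > node) until an empty spot
// appears, then attach a new node there.
class TreeNode {
  constructor(value) {
    this.value = value;
    this.left = null;
    this.right = null;
  }
}

function add(root, value) {
  if (root === null) return new TreeNode(value); // empty spot found
  if (value <= root.value) root.left = add(root.left, value);
  else root.right = add(root.right, value);
  return root;
}

// Mirrors the example: inserting 18 goes left of 19, then right of 10.
let root = null;
for (const v of [19, 21, 10, 2, 18]) root = add(root, v);
console.log(root.value);            // → 19 (first value inserted)
console.log(root.left.right.value); // → 18
```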
‎book/content/part03/graph-search.asc

Lines changed: 5 additions & 5 deletions

@@ -1,7 +1,7 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]

 === Graph Search

@@ -11,7 +11,7 @@ WARNING: Graph search is very similar to <<Tree Search & Traversal>>. So, if yo

 There are two ways to navigate the graph, one is using Depth-First Search (DFS) and the other one is Breadth-First Search (BFS). Let's see the difference using the following graph.

-image::images/directed-graph.png[directed graph]
+image::directed-graph.png[directed graph]

 // [graphviz, directed graph, png]
 // ....

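The BFS side of the DFS/BFS comparison above can be sketched on a small adjacency-list graph (the node names here are made up for illustration):

```javascript
// Breadth-First Search sketch over an adjacency list: visit nodes
// level by level using a queue, tracking seen nodes to handle cycles.
function bfs(graph, start) {
  const visited = [];
  const seen = new Set([start]);
  const queue = [start];
  while (queue.length > 0) {
    const node = queue.shift(); // FIFO → nearest nodes come out first
    visited.push(node);
    for (const neighbor of graph[node] || []) {
      if (!seen.has(neighbor)) {
        seen.add(neighbor);
        queue.push(neighbor);
      }
    }
  }
  return visited;
}

const graph = { a: ['b', 'c'], b: ['d'], c: ['d'], d: [] };
console.log(bfs(graph, 'a')); // → ['a', 'b', 'c', 'd']
```

Swapping the queue for a stack (pop instead of shift) turns this into an iterative DFS, which dives toward distant nodes first.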
‎book/content/part03/graph.asc

Lines changed: 11 additions & 11 deletions

@@ -1,7 +1,7 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]

 [[graph]]
 === Graph
@@ -33,14 +33,14 @@ The connection between two nodes is called *edge*.
 Also, nodes might be called *vertex*.

 .Graph is composed of vertices/nodes and edges
-image:images/image42.png[image,width=305,height=233]
+image:image42.png[image,width=305,height=233]

 ===== Directed Graph vs Undirected

 A graph can be either *directed* or *undirected*.

 .Graph: directed vs undirected
-image:images/image43.jpg[image,width=469,height=192]
+image:image43.jpg[image,width=469,height=192]


 An *undirected graph* has edges that are *two-way street*. E.g., On the undirected example, you can traverse from the green node to the orange and vice versa.
@@ -52,7 +52,7 @@ A *directed graph (digraph)* has edges that are *one-way street*. E.g., On the d
 A graph can have *cycles* or not.

 .Cyclic vs Acyclic Graphs.
-image:images/image44.jpg[image,width=444,height=194]
+image:image44.jpg[image,width=444,height=194]

 (((Cyclic Graph)))
 A *cyclic graph* is the one that you can pass through a node more than once.
@@ -68,7 +68,7 @@ The *Directed Acyclic Graph (DAG)* is unique. It has many applications like sche
 ===== Connected vs Disconnected vs Complete Graphs

 .Different kinds of graphs: disconnected, connected, and complete.
-image:images/image45.png[image,width=1528,height=300]
+image:image45.png[image,width=1528,height=300]

 A *disconnected graph* is one that has one or more subgraph. In other words, a graph is *disconnected* if two nodes don’t have a path between them.

@@ -81,7 +81,7 @@ A *complete graph* is where every node is adjacent to all the other nodes in the
 Weighted graphs have labels in the edges (a.k.a *weight* or *cost*). The link weight can represent many things like distance, travel time, or anything else.

 .Weighted Graph representing USA airports distance in miles.
-image:images/image46.png[image,width=528,height=337]
+image:image46.png[image,width=528,height=337]

 For instance, a weighted graph can have a distance between nodes. So, algorithms can use the weight and optimize the path between them.

@@ -120,7 +120,7 @@ There are two main ways to graphs one is:
 Representing graphs as adjacency matrix is done using a two-dimensional array. For instance, let’s say we have the following graph:

 .Graph and its adjacency matrix.
-image:images/image47.png[image,width=438,height=253]
+image:image47.png[image,width=438,height=253]

 The number of vertices |V| define the size of the matrix. In the example, we have five vertices, so we have a 5x5 matrix.

@@ -167,7 +167,7 @@ The space complexity of the adjacency matrix is *O(|V|^2^)*, where |V| is the nu
 Another way to represent a graph is by using an adjacency list. This time instead of using an array (matrix) we use a list.

 .Graph represented as an Adjacency List.
-image:images/image48.png[image,width=528,height=237]
+image:image48.png[image,width=528,height=237]

 If we want to add a new node to the list, we can do it by adding one element to the end of the array of nodes *O(1)*. In the next section, we are going to explore the running times of all operations in an adjacency list.

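The adjacency list representation discussed in the graph hunks above can be sketched as a map from each vertex to its neighbors (`Graph`, `addVertex`, and `addEdge` are hypothetical names, not the book's API):

```javascript
// Adjacency list sketch: each vertex maps to the array of vertices it
// connects to. Adding a vertex or a directed edge is O(1), and space
// grows with |V| + |E| rather than |V|^2 as in the matrix.
class Graph {
  constructor() {
    this.adjacencyList = new Map();
  }

  addVertex(vertex) {
    if (!this.adjacencyList.has(vertex)) this.adjacencyList.set(vertex, []);
    return this;
  }

  addEdge(source, destination) { // directed edge: source → destination
    this.addVertex(source).addVertex(destination);
    this.adjacencyList.get(source).push(destination);
    return this;
  }
}

const g = new Graph();
g.addEdge('JFK', 'LAX').addEdge('JFK', 'ORD');
console.log(g.adjacencyList.get('JFK')); // → ['LAX', 'ORD']
```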
‎book/content/part03/hashmap.asc

Lines changed: 5 additions & 5 deletions

@@ -1,7 +1,7 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]

 [[hashmap]]
 ==== HashMap
@@ -24,7 +24,7 @@ How are the keys mapped to their values?
 Using a hash function. Here’s an illustration:

 .Internal HashMap representation
-image:images/image41.png[image,width=528,height=299]
+image:image41.png[image,width=528,height=299]


 .This is the main idea:

‎book/content/part03/map.asc

Lines changed: 5 additions & 4 deletions

@@ -1,7 +1,8 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
+
 [[map]]
 === Map
 (((Map)))

‎book/content/part03/set.asc

Lines changed: 4 additions & 4 deletions

@@ -1,7 +1,7 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]

 [[set]]
 === Set

‎book/content/part03/time-complexity-graph-data-structures.asc

Lines changed: 5 additions & 0 deletions

@@ -1,3 +1,8 @@
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
+
 === Summary

 In this section, we learned about Graphs applications, properties and how we can create them. We mention that you can represent a graph as a matrix or as a list of adjacencies. We went for implementing the later since it's more space efficient. We cover the basic graph operations like adding and removing nodes and edges. In the algorithms section, we are going to cover searching values in the graph.

‎book/content/part03/tree-intro.asc

Lines changed: 10 additions & 10 deletions

@@ -1,7 +1,7 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]

 [[tree]]
 === Tree
@@ -10,7 +10,7 @@
 A tree is a non-linear data structure where a node can have zero or more connections. The topmost node in a tree is called *root*. The linked nodes to the root are called *children* or *descendants*.

 .Tree Data Structure: root node and descendants.
-image:images/image31.jpg[image,width=404,height=240]
+image:image31.jpg[image,width=404,height=240]

 As you can see in the picture above, this data structure resembles an inverted tree hence the name. It starts with a *root* node and *branch* off with its descendants, and finally *leaves*.

@@ -50,7 +50,7 @@ Simple! Right? But there are some constraints that you have to keep at all times
 * The *depth of a tree* is the distance (edge count) from the root to the farthest leaf.

 .Tree anatomy
-image:images/image31.jpg[image]
+image:image31.jpg[image]

 ==== Types of Binary Trees

@@ -62,7 +62,7 @@ There are different kinds of trees depending on the restrictions. E.g. The trees
 The binary restricts the nodes to have at most two children. Trees, in general, can have 3, 4, 23 or more, but not binary trees.

 .Binary tree has at most 2 children while non-binary trees can have more.
-image:images/image32.png[image,width=321,height=193]
+image:image32.png[image,width=321,height=193]

 Binary trees are one of the most used kinds of tree, and they are used to build other data structures.

@@ -81,7 +81,7 @@ The Binary Search Tree (BST) is a specialization of the binary tree. BST has the
 > BST: left ≤ parent < right

 .BST or ordered binary tree vs. non-BST.
-image:images/image33.png[image,width=348,height=189]
+image:image33.png[image,width=348,height=189]


 ===== Binary Heap
@@ -93,15 +93,15 @@ image:images/image33.png[image,width=348,height=189]
 The heap (max-heap) is a type of binary tree where the children's values are higher than the parent. Opposed to the BST, the left child doesn’t have to be smaller than the right child.

 .Heap vs BST
-image:images/image34.png[image,width=325,height=176]
+image:image34.png[image,width=325,height=176]

 The (max) heap has the maximum value in the root, while BST doesn’t.

 There are two kinds of heaps: min-heap and max-heap.
 For a *max-heap*, the root has the highest value. The heap guarantee that as you move away from the root, the values get smaller. The opposite is true for a *min-heap*. In a min-heap, the lowest value is at the root, and as you go down the lower to the descendants, they will keep increasing values.

 .Max-heap keeps the highest value at the top while min-heap keep the lowest at the root.
-image:images/image35.png[image,width=258,height=169]
+image:image35.png[image,width=258,height=169]


 .Heap vs. Binary Search Tree

‎book/content/part03/tree-search-traversal.asc

Lines changed: 6 additions & 1 deletion

@@ -1,3 +1,8 @@
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
+
 === Tree Search & Traversal

 So far we covered, how to insert/delete/search values in a binary search tree (BST).
@@ -105,7 +110,7 @@ We can also implement it as recursive functions are we are going to see in the <
 We can see visually the difference between how the DFS and BFS search for nodes:

 .Depth-First Search vs. Breadth-First Search
-image:images/depth-first-search-dfs-breadth-first-search-bfs.jpg[]
+image:depth-first-search-dfs-breadth-first-search-bfs.jpg[]

 As you can see the DFS in two iterations is already at one of the farthest nodes from the root while BFS search nearby nodes first.

‎book/content/part03/treemap.asc

Lines changed: 4 additions & 4 deletions

@@ -1,7 +1,7 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]

 [[treemap]]
 ==== TreeMap

‎book/content/part04/algorithmic-toolbox.asc

Lines changed: 5 additions & 0 deletions

@@ -1,3 +1,8 @@
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
+
 [[algorithms-toolbox]]
 === Algorithmic Toolbox

‎book/content/part04/backtracking.asc

Lines changed: 6 additions & 1 deletion

@@ -1,3 +1,8 @@
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
+
 === Backtracking
 (((Backtracking)))
 (((Algorithmic Techniques, Backtracking)))
@@ -10,7 +15,7 @@ it stops and steps back (backtracks) to try another alternative.
 Some examples that use backtracking is a solving Sudoku/crosswords puzzle, and graph operations.

 ifndef::backend-pdf[]
-image:images/Sudoku_solved_by_bactracking.gif[]
+image:Sudoku_solved_by_bactracking.gif[]
 endif::backend-pdf[]

 Listing all possible solutions might sound like a brute force.

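The backtracking idea quoted in the hunk above (extend a partial candidate step by step; when a branch is exhausted, step back and try another alternative) can be sketched with a string-permutations example. This is a minimal illustration, not the book's implementation:

```javascript
// Backtracking sketch: build one permutation character by character.
// After exploring a choice, undo it (backtrack) and try the next one.
function permutations(chars, current = [], results = []) {
  if (!chars.length) {
    results.push(current.join('')); // a complete candidate
    return results;
  }
  for (let i = 0; i < chars.length; i++) {
    current.push(chars[i]);                              // choose
    const rest = chars.slice(0, i).concat(chars.slice(i + 1));
    permutations(rest, current, results);                // explore
    current.pop();                                       // un-choose (backtrack)
  }
  return results;
}

// permutations(['a', 'b', 'c']) yields all 6 orderings, 'abc' first.
```

The choose/explore/un-choose pattern is the core of backtracking; a Sudoku solver applies the same loop with a validity check that prunes invalid branches early.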
book/content/part04/bubble-sort.asc

Lines changed: 4 additions & 4 deletions

@@ -1,7 +1,7 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
 
 ==== Bubble Sort
 (((Bubble Sort)))

book/content/part04/divide-and-conquer.asc

Lines changed: 5 additions & 0 deletions

@@ -1,3 +1,8 @@
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
+
 === Divide and Conquer
 
 (((Divide and Conquer)))

book/content/part04/dynamic-programming.asc

Lines changed: 4 additions & 0 deletions

@@ -1,3 +1,7 @@
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
 
 === Dynamic Programming
 

book/content/part04/greedy-algorithms.asc

Lines changed: 5 additions & 0 deletions

@@ -1,3 +1,8 @@
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
+
 === Greedy Algorithms
 
 (((Greedy Algorithms)))

book/content/part04/insertion-sort.asc

Lines changed: 5 additions & 0 deletions

@@ -1,3 +1,8 @@
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
+
 [[insertion-sort]]
 ==== Insertion Sort
 

book/content/part04/merge-sort.asc

Lines changed: 6 additions & 1 deletion

@@ -1,3 +1,8 @@
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
+
 [[merge-sort]]
 ==== Merge Sort
 
@@ -9,7 +14,7 @@ Merge Sort is an efficient sorting algorithm that uses <<Divide and Conquer, div
 indexterm:[Divide and Conquer]
 Merge sort algorithm splits the array into halves until 2 or fewer elements are left. It sorts these two elements and then merges back all halves until the whole collection is sorted.
 
-image:images/image11.png[Mergesort visualization,width=500,height=600]
+image:image11.png[Mergesort visualization,width=500,height=600]
 
 ===== Merge Sort Implementation
 

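The merge-sort description in the context lines above (split until 2 or fewer elements are left, then merge sorted halves back) can be sketched as follows. This is a minimal illustration, not the implementation from the book's `src/` directory:

```javascript
// Split phase: recurse until arrays of length 0 or 1, which are
// trivially sorted, then merge the sorted halves back together.
function mergeSort(array) {
  if (array.length <= 1) return array;
  const middle = Math.floor(array.length / 2);
  const left = mergeSort(array.slice(0, middle));
  const right = mergeSort(array.slice(middle));
  return merge(left, right);
}

// Merge phase: repeatedly take the smaller head of the two sorted
// halves; when one side runs out, append the remainder of the other.
function merge(left, right) {
  const merged = [];
  let i = 0;
  let j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}
```

Using `<=` in the merge keeps equal elements in their original order, which makes this sketch a stable sort.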
book/content/part04/quick-sort.asc

Lines changed: 4 additions & 4 deletions

@@ -1,7 +1,7 @@
-// ifndef::imagesdir[]
-// :imagesdir: ../images
-// :codedir: ../../src
-// endif::[]
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
 
 [[quicksort]]
 ==== Quicksort
‎book/content/part04/selection-sort.asc

Lines changed: 6 additions & 1 deletion
Original file line numberDiff line numberDiff line change
@@ -1,3 +1,8 @@
1+
ifndef::imagesdir[]
2+
:imagesdir: ../../
3+
:codedir: ../../../src
4+
endif::[]
5+
16
[[selection-sort]]
27
==== Selection Sort
38

@@ -10,7 +15,7 @@ The selection sort is a simple sorting algorithm. As its name indicates, it _sel
1015
. Find the minimum item in the rest of the array. If a new minimum is found swap them.
1116
. Repeat step #1 and #2 with the next element until the last one.
1217

13-
image:images/selection-sort.gif[]
18+
image:selection-sort.gif[]
1419

1520
===== Selection sort implementation
1621
For implementing the selection sort, we need two indexes.

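The two selection-sort steps quoted in the hunk above can be sketched with the two indexes the context line mentions. This is a minimal illustration with hypothetical names, not the book's `src/` implementation:

```javascript
// Selection sort sketch: the outer index marks the boundary of the
// sorted prefix; the inner index scans the unsorted rest for the minimum.
function selectionSort(array) {
  const arr = [...array]; // copy so the caller's array is not mutated
  for (let outer = 0; outer < arr.length; outer++) {
    let min = outer; // index of the smallest value seen so far
    for (let inner = outer + 1; inner < arr.length; inner++) {
      if (arr[inner] < arr[min]) min = inner;
    }
    // Swap the new minimum into place, then repeat with the next element.
    if (min !== outer) [arr[outer], arr[min]] = [arr[min], arr[outer]];
  }
  return arr;
}
```

Both loops always run to the end of the array, so this sketch is O(n^2) regardless of how sorted the input already is.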
book/content/part04/sorting-algorithms.asc

Lines changed: 5 additions & 0 deletions

@@ -1,3 +1,8 @@
+ifndef::imagesdir[]
+:imagesdir: ../../
+:codedir: ../../../src
+endif::[]
+
 === Sorting Algorithms
 
 Sorting is one of the most common solutions when we want to extract some insights about a collection of data.

book/dsajs.asc

Lines changed: 2 additions & 1 deletion

@@ -13,12 +13,13 @@ Adrian Mejia
 :toc:
 :toclevels: 2
 :pagenums:
-:front-cover-image: image:content/cover.png[width=1050,height=1600]
+:front-cover-image: image:cover.png[width=1050,height=1600]
 :icons: font
 //
 // custom variables (no blank lines)
 //
 :codedir: ../../../src
+:imagesdir: images
 //
 // highlighter
 :source-highlighter: rouge

book/images/cover.png (9.5 KB)

recursive-fibonacci-call-tree.png (51.5 KB)
