
Commit a36179f

authored Mar 23, 2020
Merge pull request amejiarosario#45 from amejiarosario/v-tank/fix-suggestions
v-tank/fix suggestions

12 files changed, +66 −66 lines changed


book/content/dedication.asc

Lines changed: 1 addition & 1 deletion

@@ -1,4 +1,4 @@
 [dedication]
 == Dedication
 
-_To my wife Nathalie that supported me in my long hours of writing and my baby girl Abigail._
+_To my wife Nathalie who supported me in my long hours of writing and my baby girl Abigail._

book/content/part01/algorithms-analysis.asc

Lines changed: 10 additions & 10 deletions
@@ -5,7 +5,7 @@ endif::[]
 
 === Fundamentals of Algorithms Analysis
 
-Probably you are reading this book because you want to write better and faster code.
+You are probably reading this book because you want to write better and faster code.
 How can you do that? Can you time how long it takes to run a program? Of course, you can!
 [big]#⏱#
 However, if you run the same program on a smartwatch, cellphone or desktop computer, it will take different times.
@@ -15,7 +15,7 @@ image::image3.png[image,width=528,height=137]
 Wouldn't it be great if we can compare algorithms regardless of the hardware where we run them?
 That's what *time complexity* is for!
 But, why stop with the running time?
-We could also compare the memory "used" by different algorithms, and we called that *space complexity*.
+We could also compare the memory "used" by different algorithms, and we call that *space complexity*.
 
 .In this chapter you will learn:
 - What’s the best way to measure the performance of your code regardless of what hardware you use.
@@ -59,16 +59,16 @@ To give you a clearer picture of how different algorithms perform as the input s
 |=============================================================================================
 |Input size -> |10 |100 |10k |100k |1M
 |Finding if a number is odd |< 1 sec. |< 1 sec. |< 1 sec. |< 1 sec. |< 1 sec.
-|Sorting elements in array with merge sort |< 1 sec. |< 1 sec. |< 1 sec. |few sec. |20 sec.
-|Sorting elements in array with Bubble Sort |< 1 sec. |< 1 sec. |2 minutes |3 hours |12 days
-|Finding all subsets of a given set |< 1 sec. |40,170 trillion years |> centillion years |∞ |∞
-|Find all permutations of a string |4 sec. |> vigintillion years |> centillion years |∞ |∞
+|Sorting array with merge sort |< 1 sec. |< 1 sec. |< 1 sec. |few sec. |20 sec.
+|Sorting array with Selection Sort |< 1 sec. |< 1 sec. |2 minutes |3 hours |12 days
+|Finding all subsets |< 1 sec. |40,170 trillion years |> centillion years |∞ |∞
+|Finding string permutations |4 sec. |> vigintillion years |> centillion years |∞ |∞
 |=============================================================================================
 
 Most algorithms are affected by the size of the input (`n`). Let's say you need to arrange numbers in ascending order. Sorting ten items will naturally take less time than sorting out 2 million. But, how much longer? As the input size grow, some algorithms take proportionally more time, we classify them as <<part01-algorithms-analysis#linear, linear>> runtime [or `O(n)`]. Others might take power two longer; we call them <<part01-algorithms-analysis#quadratic, quadratic>> running time [or `O(n^2^)`].
 
 From another perspective, if you keep the input size the same and run different algorithms implementations, you would notice the difference between an efficient algorithm and a slow one. For example, a good sorting algorithm is <<part04-algorithmic-toolbox#merge-sort>>, and an inefficient algorithm for large inputs is <<part04-algorithmic-toolbox#selection-sort>>.
-Organizing 1 million elements with merge sort takes 20 seconds while bubble sort takes 12 days, ouch!
+Organizing 1 million elements with merge sort takes 20 seconds while selection sort takes 12 days, ouch!
 The amazing thing is that both programs are solving the same problem with equal data and hardware; and yet, there's a big difference in time!
 
 After completing this book, you are going to _think algorithmically_.
@@ -135,7 +135,7 @@ There’s a notation called *Big O*, where `O` refers to the *order of the functi
 
 TIP: Big O = Big Order of a function.
 
-If you have a program which runtime is:
+If you have a program that has a runtime of:
 
 _7n^3^ + 3n^2^ + 5_
 
@@ -144,7 +144,7 @@ You can express it in Big O notation as _O(n^3^)_. The other terms (_3n^2^ + 5_)
 Big O notation, only cares about the “biggest” terms in the time/space complexity. So, it combines what we learn about time and space complexity, asymptotic analysis and adds a worst-case scenario.
 
 .All algorithms have three scenarios:
-* Best-case scenario: the most favorable input arrange where the program will take the least amount of operations to complete. E.g., array already sorted is beneficial for some sorting algorithms.
+* Best-case scenario: the most favorable input arrangement where the program will take the least amount of operations to complete. E.g., an array that's already sorted is beneficial for some sorting algorithms.
 * Average-case scenario: this is the most common case. E.g., array items in random order for a sorting algorithm.
 * Worst-case scenario: the inputs are arranged in such a way that causes the program to take the longest to complete. E.g., array items in reversed order for some sorting algorithm will take the longest to run.
 
@@ -154,7 +154,7 @@ TIP: Big O only cares about the highest order of the run time function and the w
 
 WARNING: Don't drop terms that are multiplying other terms. _O(n log n)_ is not equivalent to _O(n)_. However, _O(n + log n)_ is.
 
-There are many common notations like polynomial, _O(n^2^)_ like we saw in the `getMin` example; constant _O(1)_ and many more that we are going to explore in the next chapter.
+There are many common notations like polynomial, _O(n^2^)_ as we saw in the `getMin` example; constant _O(1)_ and many more that we are going to explore in the next chapter.
 
 Again, time complexity is not a direct measure of how long a program takes to execute, but rather how many operations it performs given the input size. Nevertheless, there’s a relationship between time complexity and clock time as we can see in the following table.
 (((Tables, Intro, Input size vs clock time by Big O)))
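As a side note on the hunk above, a quick sketch (hypothetical helper, not code from the book's repo) shows why Big O keeps only the dominant term of _7n^3^ + 3n^2^ + 5_:

```javascript
// Assumed helper (not from the book's repo): the operation count
// of a program whose runtime is 7n^3 + 3n^2 + 5.
function operations(n) {
  return 7 * n ** 3 + 3 * n ** 2 + 5;
}

// As n grows, the ratio to n^3 settles near the constant 7;
// Big O drops both that constant and the lower-order terms.
[10, 1000, 100000].forEach((n) => {
  console.log(n, operations(n) / n ** 3);
});
```

For `n = 100000` the ratio is already within a tiny fraction of 7, which is why only _O(n^3^)_ survives the simplification.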

book/content/part01/big-o-examples.asc

Lines changed: 11 additions & 11 deletions
@@ -7,7 +7,7 @@ endif::[]
 
 There are many kinds of algorithms. Most of them fall into one of the eight time complexities that we are going to explore in this chapter.
 
-.Eight Running Time complexity You Should Know
+.Eight Running Time Complexities You Should Know
 - Constant time: _O(1)_
 - Logarithmic time: _O(log n)_
 - Linear time: _O(n)_
@@ -17,7 +17,7 @@ There are many kinds of algorithms. Most of them fall into one of the eight time
 - Exponential time: _O(2^n^)_
 - Factorial time: _O(n!)_
 
-We a going to provide examples for each one of them.
+We are going to provide examples for each one of them.
 
 Before we dive in, here’s a plot with all of them.
 
@@ -30,7 +30,7 @@ The above chart shows how the running time of an algorithm is related to the amo
 ==== Constant
 (((Constant)))
 (((Runtime, Constant)))
-Represented as *O(1)*, it means that regardless of the input size the number of operations executed is always the same. Let’s see an example.
+Represented as *O(1)*, it means that regardless of the input size, the number of operations executed is always the same. Let’s see an example:
 
 [#constant-example]
 ===== Finding if an array is empty
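A minimal sketch of such a constant-time check (an assumption; the actual `01-is-empty.js` is included from the book's repo):

```javascript
// Assumed sketch of an O(1) emptiness check: one comparison,
// whether the array holds 10 elements or 10 million.
function isEmpty(array) {
  return array.length === 0;
}

console.log(isEmpty([]));        // true
console.log(isEmpty([1, 2, 3])); // false
```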
@@ -47,7 +47,7 @@ include::{codedir}/runtimes/01-is-empty.js[tag=isEmpty]
 
 Another more real life example is adding an element to the begining of a <<part02-linear-data-structures#linked-list>>. You can check out the implementation <<part02-linear-data-structures#linked-list-inserting-beginning, here>>.
 
-As you can see, in both examples (array and linked list) if the input is a collection of 10 elements or 10M it would take the same amount of time to execute. You can't get any more performant than this!
+As you can see in both examples (array and linked list), if the input is a collection of 10 elements or 10M, it would take the same amount of time to execute. You can't get any more performant than this!
 
 [[logarithmic]]
 ==== Logarithmic
@@ -68,7 +68,7 @@ The binary search only works for sorted lists. It starts searching for an elemen
 include::{codedir}/runtimes/02-binary-search.js[tag=binarySearchRecursive]
 ----
 
-This binary search implementation is a recursive algorithm, which means that the function `binarySearch` calls itself multiple times until the solution is found. The binary search splits the array in half every time.
+This binary search implementation is a recursive algorithm, which means that the function `binarySearchRecursive` calls itself multiple times until the solution is found. The binary search splits the array in half every time.
 
 Finding the runtime of recursive algorithms is not very obvious sometimes. It requires some tools like recursion trees or the https://adrianmejia.com/blog/2018/04/24/analysis-of-recursive-algorithms/[Master Theorem]. The `binarySearch` divides the input in half each time. As a rule of thumb, when you have an algorithm that divides the data in half on each call you are most likely in front of a logarithmic runtime: _O(log n)_.
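A recursive sketch consistent with that description (an assumption; the actual `02-binary-search.js` is included from the book's repo):

```javascript
// Assumed sketch of a recursive binary search on a sorted array.
// Each call discards half of the remaining elements: O(log n).
function binarySearchRecursive(array, target, offset = 0) {
  if (array.length === 0) return -1; // base case: not found
  const half = Math.floor(array.length / 2);
  if (array[half] === target) return offset + half;
  if (array[half] < target) {
    // the target can only be in the right half
    return binarySearchRecursive(array.slice(half + 1), target, offset + half + 1);
  }
  // otherwise it can only be in the left half
  return binarySearchRecursive(array.slice(0, half), target, offset);
}

const sorted = [2, 5, 8, 12, 16, 23, 38];
console.log(binarySearchRecursive(sorted, 23)); // 5
console.log(binarySearchRecursive(sorted, 99)); // -1
```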

@@ -92,8 +92,8 @@ include::{codedir}/runtimes/03-has-duplicates.js[tag=hasDuplicates]
 
 .`hasDuplicates` has multiple scenarios:
 * *Best-case scenario*: first two elements are duplicates. It only has to visit two elements.
-* *Worst-case scenario*: no duplicated or duplicated are the last two. In either case, it has to visit every item on the array.
-* *Average-case scenario*: duplicates are somewhere in the middle of the collection. Only, half of the array will be visited.
+* *Worst-case scenario*: no duplicates or duplicates are the last two. In either case, it has to visit every item in the array.
+* *Average-case scenario*: duplicates are somewhere in the middle of the collection. Only half of the array will be visited.
 
 As we learned before, the big O cares about the worst-case scenario, where we would have to visit every element on the array. So, we have an *O(n)* runtime.
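One common O(n) approach matching these scenarios (an assumption; the book's `03-has-duplicates.js` is included from its repo) tracks seen values in a `Set` and exits at the first repeat:

```javascript
// Assumed sketch of an O(n) hasDuplicates: a single pass with early exit.
function hasDuplicates(array) {
  const seen = new Set();
  for (const item of array) {
    if (seen.has(item)) return true; // best case: duplicate found early
    seen.add(item);
  }
  return false; // worst case: every item was visited
}

console.log(hasDuplicates([1, 2, 3, 1])); // true
console.log(hasDuplicates([1, 2, 3]));    // false
```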

@@ -147,19 +147,19 @@ Usually they have double-nested loops, where each one visits all or most element
 [[quadratic-example]]
 ===== Finding duplicates in an array (naïve approach)
 
-If you remember we have solved this problem more efficiently on the <<part01-algorithms-analysis#linear, Linear>> section. We solved this problem before using an _O(n)_, let’s solve it this time with an _O(n^2^)_:
+If you remember, we have solved this problem more efficiently in the <<part01-algorithms-analysis#linear, Linear>> section. We solved this problem before using an _O(n)_, let’s solve it this time with an _O(n^2^)_:
 
 // image:image12.png[image,width=527,height=389]
 
-.Naïve implementation of has duplicates function
+.Naïve implementation of hasDuplicates function
 [source, javascript]
 ----
 include::{codedir}/runtimes/05-has-duplicates-naive.js[tag=hasDuplicates]
 ----
 
 As you can see, we have two nested loops causing the running time to be quadratic. How much difference is there between a linear vs. quadratic algorithm?
 
-Let’s say you want to find a duplicated middle name in a phone directory book of a city of ~1 million people. If you use this quadratic solution you would have to wait for ~12 days to get an answer [big]#🐢#; while if you use the <<part01-algorithms-analysis#linear, linear solution>> you will get the answer in seconds! [big]#🚀#
+Let’s say you want to find a duplicated middle name in a phone directory book of a city of ~1 million people. If you use this quadratic solution, you would have to wait for ~12 days to get an answer [big]#🐢#; while if you use the <<part01-algorithms-analysis#linear, linear solution>>, you will get the answer in seconds! [big]#🚀#
 
 [[cubic]]
 ==== Cubic
@@ -186,7 +186,7 @@ include::{codedir}/runtimes/06-multi-variable-equation-solver.js[tag=findXYZ]
 
 WARNING: This is just an example, there are better ways to solve multi-variable equations.
 
-As you can see three nested loops usually translates to O(n^3^). If you have a four variable equation and four nested loops it would be O(n^4^) and so on when we have a runtime in the form of _O(n^c^)_, where _c > 1_, we refer to this as a *polynomial runtime*.
+As you can see three nested loops usually translates to O(n^3^). If you have a four variable equation and four nested loops it would be O(n^4^) and so on. When we have a runtime in the form of _O(n^c^)_, where _c > 1_, we refer to this as a *polynomial runtime*.
 
 [[exponential]]
 ==== Exponential

book/content/part02/array-vs-list-vs-queue-vs-stack.asc

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@ In this part of the book, we explored the most used linear data structures such
 * You want constant time to remove/add from extremes of the list.
 
 .Use a Queue when:
-* You need to access your data in a first-come, first served basis (FIFO).
+* You need to access your data on a first-come, first served basis (FIFO).
 * You need to implement a <<part03-graph-data-structures#bfs-tree, Breadth-First Search>>
 
 .Use a Stack when:

book/content/part02/array.asc

Lines changed: 9 additions & 9 deletions
@@ -17,7 +17,7 @@ TIP: Strings are a collection of Unicode characters and most of the array concep
 
 .Fixed vs. Dynamic Size Arrays
 ****
-Some programming languages have fixed size arrays like Java and C++. Fixed size arrays might be a hassle when your collection gets full, and you have to create a new one with a bigger size. For that, those programming languages also have built-in dynamic arrays: we have `vector` in C++ and `ArrayList` in Java. Dynamic programming languages like JavaScript, Ruby, Python use dynamic arrays by default.
+Some programming languages have fixed size arrays like Java and C++. Fixed size arrays might be a hassle when your collection gets full, and you have to create a new one with a bigger size. For that, those programming languages also have built-in dynamic arrays: we have `vector` in C++ and `ArrayList` in Java. Dynamic programming languages like JavaScript, Ruby, and Python use dynamic arrays by default.
 ****
 
 Arrays look like this:
@@ -29,7 +29,7 @@ Arrays are a sequential collection of elements that can be accessed randomly usi
 
 ==== Insertion
 
-Arrays are built-in into most languages. Inserting an element is simple; you can either add them on creation time or after initialization. Below you can find an example for both cases:
+Arrays are built-in into most languages. Inserting an element is simple; you can either add them at creation time or after initialization. Below you can find an example for both cases:
 
 .Inserting elements into an array
 [source, javascript]
@@ -44,7 +44,7 @@ array2[100] = 2;
 array2 // [empty × 3, 1, empty × 96, 2]
 ----
 
-Using the index, you can replace whatever value you want. Also, you don't have to add items next to each other. The size of the array will dynamically expand to accommodate the data. You can reference values in whatever index you like index 3 or even 100! In the `array2` we inserted 2 numbers, but the length is 101, and there are 99 empty spaces.
+Using the index, you can replace whatever value you want. Also, you don't have to add items next to each other. The size of the array will dynamically expand to accommodate the data. You can reference values at whatever index you like: index 3 or even 100! In `array2`, we inserted 2 numbers but the length is 101 and there are 99 empty spaces.
 
 [source, javascript]
 ----
@@ -87,7 +87,7 @@ const array = [2, 5, 1, 9, 6, 7];
 array.splice(1, 0, 111); // ↪️ [] <1>
 // array: [2, 111, 5, 1, 9, 6, 7]
 ----
-<1> at the position `1`, delete `0` elements and insert `111`.
+<1> at position `1`, delete `0` elements and insert `111`.
 
 The Big O for this operation would be *O(n)* since in worst case it would move most of the elements to the right.
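The `splice` call in the snippet above can be verified in any JavaScript console (standard `Array.prototype.splice`, not code from the book's repo):

```javascript
// Standard Array.prototype.splice: insert without deleting.
const numbers = [2, 5, 1, 9, 6, 7];
const removed = numbers.splice(1, 0, 111); // at position 1, delete 0, insert 111

console.log(removed); // [] (nothing was deleted)
console.log(numbers); // [2, 111, 5, 1, 9, 6, 7]
```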

@@ -132,7 +132,7 @@ const array = [2, 5, 1, 9, 6, 7];
 array[4]; // ↪️ 6
 ----
 
-Searching by index takes constant time, *O(1)*, to retrieve values out of the array. If we want to get fancier we can create a function:
+Searching by index takes constant time - *O(1)* - to retrieve values out of the array. If we want to get fancier, we can create a function:
 
 // image:image17.png[image,width=528,height=293]
 
@@ -184,7 +184,7 @@ We would have to loop through the whole array (worst case) or until we find it:
 
 ==== Deletion
 
-Deleting (similar to insertion) there are three possible scenarios, removing at the beginning, middle or end.
+There are three possible scenarios for deletion (similar to insertion): removing at the beginning, middle or end.
 
 ===== Deleting element from the beginning
 
@@ -223,7 +223,7 @@ array.splice(2, 1); // ↪️[2] <1>
 ----
 <1> delete 1 element at position 2
 
-Deleting from the middle might cause most the elements of the array to move back one position to fill in for the eliminated item. Thus, runtime: O(n).
+Deleting from the middle might cause most of the elements of the array to move up one position to fill in for the eliminated item. Thus, runtime: O(n).
 
 ===== Deleting element from the end
 
@@ -237,7 +237,7 @@ array.pop(); // ↪️111
 // array: [2, 5, 1, 9]
 ----
 
-No element other element has been shifted, so it’s an _O(1)_ runtime.
+No other element has been shifted, so it’s an _O(1)_ runtime.
 
 .JavaScript built-in `array.pop`
 ****
@@ -264,7 +264,7 @@ To sum up, the time complexity of an array is:
 (((Runtime, Constant)))
 (((Tables, Linear DS, JavaScript Array buit-in operations Complexities)))
 
-.Array Operations timex complexity
+.Array Operations time complexity
 |===
 | Operation | Time Complexity | Usage
 | push ^| O(1) | Insert element to the right side.
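The O(1) vs. O(n) distinction behind the operations table in this hunk can be demonstrated with a small sketch (illustrative only, not code from the book; exact timings vary by engine):

```javascript
// push appends with no shifting (amortized O(1)); unshift must
// shift every existing element one slot to the right (O(n) per call).
const n = 5000;

const pushed = [];
for (let i = 0; i < n; i += 1) pushed.push(i);

const unshifted = [];
for (let i = 0; i < n; i += 1) unshifted.unshift(i);

console.log(pushed[0], pushed[n - 1]);       // 0 4999
console.log(unshifted[0], unshifted[n - 1]); // 4999 0
```

Both arrays end up with `n` items, but building `unshifted` performs on the order of n^2^/2 element moves overall.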

book/content/part02/linked-list.asc

Lines changed: 20 additions & 20 deletions
@@ -12,18 +12,18 @@ A list (or Linked List) is a linear data structure where each node is "linked" t
 
 .Linked Lists can be:
 - Singly: every item has a pointer to the next node
-- Doubly: every node has a reference to the next and previous object
+- Doubly: every node has a reference to the next and previous node
 - Circular: the last element points to the first one.
 
 [[singly-linked-list]]
 ==== Singly Linked List
 
-Each element or node is *connected* to the next one by a reference. When a node only has one connection it's called *singly linked list*:
+Each element or node is *connected* to the next one by a reference. When a node only has one connection, it's called a *singly linked list*:
 
 .Singly Linked List Representation: each node has a reference (blue arrow) to the next one.
 image::image19.png[image,width=498,height=97]
 
-Usually, a Linked List is referenced by the first element in called *head* (or *root* node). For instance, if you want to get the `cat` element from the example above, then the only way to get there is using the `next` field on the head node. You would get `art` first, then use the next field recursively until you eventually get the `cat` element.
+Usually, a Linked List is referenced by the first element called *head* (or *root* node). For instance, if you want to get the `cat` element from the example above, then the only way to get there is using the `next` field on the head node. You would get `art` first, then use the next field recursively until you eventually get the `cat` element.
 
 [[doubly-linked-list]]
 ==== Doubly Linked List
@@ -33,7 +33,7 @@ When each node has a connection to the `next` item and also the `previous` one,
 .Doubly Linked List: each node has a reference to the next and previous element.
 image::image20.png[image,width=528,height=74]
 
-With a doubly list you can not only move forward but also backward. If you keep the reference to the last element (`cat`) you can step back and reach the middle part.
+With a doubly list, you can not only move forward but also backward. If you keep the reference to the last element (`cat`) you can step back and reach the middle part.
 
 If we implement the code for the `Node` elements, it would be something like this:
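A rough sketch of what that `Node` code likely contains (an assumption; the actual `node.js` is included from the book's repo):

```javascript
// Assumed sketch of a doubly linked list node: a value plus
// references to the next and previous nodes.
class Node {
  constructor(value) {
    this.value = value;
    this.next = null;
    this.previous = null;
  }
}

const art = new Node('art');
const dog = new Node('dog');
art.next = dog;     // forward link
dog.previous = art; // backward link
console.log(dog.previous.value); // 'art'
```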

@@ -47,13 +47,13 @@ include::{codedir}/data-structures/linked-lists/node.js[tag=snippet]
 
 ==== Linked List vs. Array
 
-Arrays allow you to access data anywhere in the collection using an index. However, Linked List visits nodes in sequential order. In the worst case scenario, it takes _O(n)_ to get an element from a Linked List. You might be wondering: Isn’t always an array more efficient with _O(1)_ access time? It depends.
+Arrays allow you to access data anywhere in the collection using an index. However, Linked List visits nodes in sequential order. In the worst case scenario, it takes _O(n)_ to get an element from a Linked List. You might be wondering: Isn’t an array always more efficient with _O(1)_ access time? It depends.
 
-We also have to understand the space complexity to see the trade-offs between arrays and linked lists. An array pre-allocates contiguous blocks of memory. When it is getting full, it has to create a bigger array (usually 2x) and copy all the elements. It takes _O(n)_ to copy all the items over. On the other hand, LinkedList’s nodes only reserve precisely the amount of memory it needs. They don’t have to be next to each other, nor large chunks of memory have to be booked beforehand like arrays. Linked List is more on a "grow as you go" basis.
+We also have to understand the space complexity to see the trade-offs between arrays and linked lists. An array pre-allocates contiguous blocks of memory. When it is getting full, it has to create a bigger array (usually 2x) and copy all the elements. It takes _O(n)_ to copy all the items over. On the other hand, LinkedList’s nodes only reserve precisely the amount of memory they need. They don’t have to be next to each other, nor large chunks of memory have to be booked beforehand like arrays. Linked List is more on a "grow as you go" basis.
 
 Another difference is that adding/deleting at the beginning on an array takes O(n); however, the linked list is a constant operation O(1) as we will implement later.
 
-A drawback of a linked list is that if you want to insert/delete an element at the end of the list, you would have to navigate the whole collection to find the last one O(n). However, this can be solved by keeping track of the last element in the list. We are going to implement that!
+A drawback of a linked list is that if you want to insert/delete an element at the end of the list, you would have to navigate the whole collection to find the last one: O(n). However, this can be solved by keeping track of the last element in the list. We are going to implement that!
 
 ==== Implementing a Linked List
 
@@ -74,7 +74,7 @@ In our constructor, we keep a reference of the `first` and also `last` node for
 
 ==== Searching by value
 
-Finding an element by value there’s no other way than iterating through the whole list.
+There’s no other way to find an element by value than iterating through the entire list.
 
 .Linked List's searching by values
 [source, javascript]
@@ -109,7 +109,7 @@ Searching by index is very similar, we iterate through the list until we find th
 include::{codedir}/data-structures/linked-lists/linked-list.js[tag=searchByIndex, indent=0]
 ----
 
-If there’s no match, we return `undefined` then. The runtime is _O(n)_. As you might notice the search by index and by position methods looks pretty similar. If you want to take a look at the whole implementation https://github.com/amejiarosario/dsa.js/blob/7694c20d13f6c53457ee24fbdfd3c0ac57139ff4/src/data-structures/linked-lists/linked-list.js#L8[click here].
+If there’s no match, we return `undefined`. The runtime is _O(n)_. As you might notice, the search by index and by position methods look pretty similar. If you want to take a look at the whole implementation, https://github.com/amejiarosario/dsa.js/blob/7694c20d13f6c53457ee24fbdfd3c0ac57139ff4/src/data-structures/linked-lists/linked-list.js#L8[click here].
 
 ==== Insertion
 
@@ -162,7 +162,7 @@ For inserting an element at the middle of the list, you would need to specify th
 . New node's next `previous`.
 
 
-Let’s do an example, with the following doubly linked list:
+Let’s do an example with the following doubly linked list:
 
 ----
 art <-> dog <-> cat
@@ -181,14 +181,14 @@ Take a look into the implementation of https://github.com/amejiarosario/dsa.js/b
 include::{codedir}/data-structures/linked-lists/linked-list.js[tag=addMiddle, indent=0]
 ----
 <1> If the new item goes to position 0, then we reuse the `addFirst` method, and we are done!
-<2> However, If we are adding to the last position, then we reuse the `addLast` method, and done!
+<2> However, if we are adding to the last position, then we reuse the `addLast` method, and done!
 <3> Adding `newNode` to the middle: First, create the `new` node only if the position exists. Take a look at <<Searching by index>> to see `get` implementation.
 <4> Set newNode `previous` reference.
 <5> Set newNode `next` link.
 <6> No other node in the list is pointing to `newNode`, so we have to make the prior element point to `newNode`.
 <7> Make the next element point to `newNode`.
 
-Take notice that we reused, `addFirst` and `addLast` methods. For all the other cases the insertion is in the middle. We use `current.previous.next` and `current.next` to update the surrounding elements and make them point to the new node. Inserting on the middle takes *O(n)* because we have to iterate through the list using the `get` method.
+Take notice that we reused `addFirst` and `addLast` methods. For all the other cases, the insertion is in the middle. We use `current.previous.next` and `current.next` to update the surrounding elements and make them point to the new node. Inserting in the middle takes *O(n)* because we have to iterate through the list using the `get` method.
 
 ==== Deletion
 
@@ -201,25 +201,25 @@ Deleting the first element (or head) is a matter of removing all references to i
 .Deleting an element from the head of the list
 image::image26.png[image,width=528,height=74]
 
-For instance, to remove the head (“art”) node, we change the variable `first` to point to the second node “dog”. We also remove the variable `previous` from the "dog" node, so it doesn't point to the “art” node. The garbage collector will get rid of the “art” node when it seems nothing is using it anymore.
+For instance, to remove the head (“art”) node, we change the variable `first` to point to the second node “dog”. We also remove the variable `previous` from the "dog" node, so it doesn't point to the “art” node. The garbage collector will get rid of the “art” node when it sees nothing is using it anymore.
 
 .Linked List's remove from the beginning of the list
 [source, javascript]
 ----
 include::{codedir}/data-structures/linked-lists/linked-list.js[tag=removeFirst, indent=0]
 ----
 
-As you can see, when we want to remove the first node we make the 2nd element the first one.
+As you can see, when we want to remove the first node, we make the 2nd element the first one.
 
 ===== Deleting element from the tail
 
-Removing the last element from the list would require to iterate from the head until we find the last one, that’s O(n). But, If we have a reference to the last element, which we do, We can do it in _O(1)_ instead!
+Removing the last element from the list would require iterating from the head until we find the last one, that’s O(n). But, if we have a reference to the last element, which we do, we can do it in _O(1)_ instead!
 
 .Removing last element from the list using the last reference.
 image::image27.png[image,width=528,height=221]
 
 
-For instance, if we want to remove the last node “cat”. We use the last pointer to avoid iterating through the whole list. We check `last.previous` to get the “dog” node and make it the new `last` and remove its next reference to “cat”. Since nothing is pointing to “cat” then is out of the list and eventually is deleted from memory by the garbage collector.
+For instance, if we want to remove the last node “cat”, we use the last pointer to avoid iterating through the whole list. We check `last.previous` to get the “dog” node and make it the new `last` and remove its next reference to “cat”. Since nothing is pointing to “cat”, it is out of the list and eventually is deleted from memory by the garbage collector.
 
 .Linked List's remove from the end of the list
 [source, javascript]
@@ -238,7 +238,7 @@ To remove a node from the middle, we make the surrounding nodes to bypass the on
 image::image28.png[image,width=528,height=259]
 
 
-In the illustration, we are removing the middle node “dog” by making art’s `next` variable to point to cat and cat’s `previous` to be “art” totally bypassing “dog”.
+In the illustration, we are removing the middle node “dog” by making art’s `next` variable to point to cat and cat’s `previous` to be “art”, totally bypassing “dog”.
 
 Let’s implement it:
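The re-linking described above can be sketched with plain objects (illustration only, not the book's `LinkedList` class):

```javascript
// Remove the middle node "dog" by making its neighbors bypass it.
const art = { value: 'art', previous: null, next: null };
const dog = { value: 'dog', previous: null, next: null };
const cat = { value: 'cat', previous: null, next: null };
art.next = dog; dog.previous = art;
dog.next = cat; cat.previous = dog;

art.next = cat;     // art's next now points to cat
cat.previous = art; // cat's previous now points back to art

console.log(art.next.value);     // 'cat'
console.log(cat.previous.value); // 'art'
```

Nothing references "dog" anymore, so the garbage collector can reclaim it.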

@@ -261,14 +261,14 @@ So far, we have seen two liner data structures with different use cases. Here’
261261
.2+.^s| Data Structure 2+^s| Searching By 3+^s| Inserting at the 3+^s| Deleting from .2+.^s| Space
262262
^|_Index/Key_ ^|_Value_ ^|_beginning_ ^|_middle_ ^|_end_ ^|_beginning_ ^|_middle_ ^|_end_
263263
| Array ^|O(1) ^|O(n) ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(n) ^|O(1) ^|O(n)
264-
| Linked List (singly) ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(1) ^|O(1) ^|O(n) ^|*O(n)* ^|O(n)
264+
| Linked List (singly) ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|*O(n)* ^|O(n)
265265
| Linked List (doubly) ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(1) ^|O(1) ^|O(n) ^|*O(1)* ^|O(n)
266266
|===
267267
// end::table[]
268268
(((Linear)))
269269
(((Runtime, Linear)))
270270

271-
If you compare the singly linked list vs. doubly linked list, you will notice that the main difference is deleting elements from the end. For a singly list is *O(n)*, while for a doubly list is *O(1)*.
271+
If you compare the singly linked list with the doubly linked list, you will notice that the main difference is inserting and deleting elements at the end. For a singly linked list it's *O(n)*, while for a doubly linked list it's *O(1)*.
272272

273273
Comparing an array with a doubly linked list, both have different use cases:
274274

@@ -284,4 +284,4 @@ Use a doubly linked list when:
284284
* You want to insert elements at the start and end of the list. The linked list has O(1) while array has O(n).
285285
* You want to save some memory when dealing with possibly large data sets. Arrays pre-allocate a large chunk of contiguous memory on initialization. Lists are more “grow as you go”.
286286

287-
For the next two linear data structures <<part02-linear-data-structures#stack>> and <<part02-linear-data-structures#queue>>, we are going to use a doubly linked list to implement them. We could use an array as well, but since inserting/deleting from the start perform better on linked-list, we are going use that.
287+
For the next two linear data structures <<part02-linear-data-structures#stack>> and <<part02-linear-data-structures#queue>>, we are going to use a doubly linked list to implement them. We could use an array as well, but since inserting/deleting from the start performs better with linked lists, we are going to use that.

‎book/content/part02/queue.asc

Lines changed: 4 additions & 4 deletions
@@ -24,15 +24,15 @@ We could use an array or a linked list to implement a Queue. However, it is reco
2424
[source, javascript]
2525
----
2626
include::{codedir}/data-structures/queues/queue.js[tag=constructor]
27-
// ... methods goes here ...
27+
// ... methods go here ...
2828
}
2929
----
3030

3131
We initialize the Queue by creating a linked list. Now, let’s add the `enqueue` and `dequeue` methods.
3232

3333
==== Insertion
3434
(((Enqueue)))
35-
For inserting elements on queue, also know as *enqueue*, we add items to the back of the list using `addLast`:
35+
For inserting elements into a queue, also known as *enqueue*, we add items to the back of the list using `addLast`:
3636

3737
.Queue's enqueue
3838
[source, javascript]
@@ -44,7 +44,7 @@ As discussed, this operation has a constant runtime.
4444

4545
==== Deletion
4646
(((Dequeue)))
47-
For removing elements from a queue, also know as *dequeue*, we remove elements from the front of the list using `removeFirst`:
47+
For removing elements from a queue, also known as *dequeue*, we remove elements from the front of the list using `removeFirst`:
4848

4949
.Queue's dequeue
5050
[source, javascript]
@@ -64,7 +64,7 @@ We can use our Queue class like follows:
6464
include::{codedir}/data-structures/queues/queue.js[tag=snippet, indent=0]
6565
----
6666

67-
You can see that the items are dequeue in the same order they were added, FIFO (first-in, first out).
67+
You can see that the items are dequeued in the same order they were added, FIFO (first-in, first out).
6868
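The enqueue/dequeue behavior described in this file can be sketched as follows. This is a simplified, array-backed stand-in (the book's actual `Queue` delegates to a linked list via `addLast`/`removeFirst` for O(1) operations; `Array#shift` here is O(n) and used only so the FIFO order is easy to see):

```javascript
// Simplified FIFO queue sketch; array-backed for illustration only.
class Queue {
  constructor() {
    this.items = [];
  }

  enqueue(item) {          // add to the back of the queue
    this.items.push(item);
    return this;
  }

  dequeue() {              // remove from the front (first-in, first-out)
    return this.items.shift();
  }

  get size() {
    return this.items.length;
  }
}

// Items come out in the same order they went in.
const queue = new Queue();
queue.enqueue('a').enqueue('b').enqueue('c');
console.log(queue.dequeue(), queue.dequeue(), queue.dequeue()); // a b c
```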

6969
==== Queue Complexity
7070

‎book/content/part02/stack.asc

Lines changed: 4 additions & 4 deletions
@@ -11,16 +11,16 @@ endif::[]
1111
(((LIFO)))
1212
The stack is a data structure that restricts the way you add and remove data. It only allows you to insert and retrieve in a *Last-In-First-Out* (LIFO) fashion.
1313

14-
An analogy is to think the stack is a rod and the data are discs. You can only take out the last one you put in.
14+
An analogy is to think that the stack is a rod and the data are discs. You can only take out the last one you put in.
1515

1616
.Stack data structure is like a stack of disks: the last element in is the first element out
1717
image::image29.png[image,width=240,height=238]
1818

1919
// #Change image from https://www.khanacademy.org/computing/computer-science/algorithms/towers-of-hanoi/a/towers-of-hanoi[Khan Academy]#
2020

21-
As you can see in the image above, If you insert the disks in the order `5`, `4`, `3`, `2`, `1`. Then you can remove them on `1`, `2`, `3`, `4`, `5`.
21+
As you can see in the image above, if you insert the disks in the order `5`, `4`, `3`, `2`, `1`, then you can remove them in the order `1`, `2`, `3`, `4`, `5`.
2222

23-
The stack inserts items to the end of the collection and also removes from the end. Both, an array and linked list would do it in constant time. However, since we don’t need the Array’s random access, a linked list makes more sense.
23+
The stack inserts items at the end of the collection and also removes from the end. Both an array and a linked list would do this in constant time. However, since we don’t need the Array’s random access, a linked list makes more sense.
2424

2525
.Stack's constructor
2626
[source, javascript]
@@ -84,4 +84,4 @@ Implementing the stack with an array and linked list would lead to the same time
8484
|===
8585
// end::table[]
8686

87-
It's not very common to search for values on a stack (other Data Structures are better suited for this). Stacks especially useful for implementing <<part03-graph-data-structures#dfs-tree, Depth-First Search>>.
87+
It's not very common to search for values on a stack (other Data Structures are better suited for this). Stacks are especially useful for implementing <<part03-graph-data-structures#dfs-tree, Depth-First Search>>.
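The disc analogy from this file can be sketched as follows. As with the queue sketch, this is a simplified array-backed stand-in (the book's actual `Stack` wraps a linked list); it only shows the LIFO ordering:

```javascript
// Simplified LIFO stack sketch; array-backed for illustration only.
class Stack {
  constructor() {
    this.items = [];
  }

  push(item) {            // add to the top of the stack
    this.items.push(item);
    return this;
  }

  pop() {                 // remove from the top (last-in, first-out)
    return this.items.pop();
  }
}

// Insert the disks in the order 5, 4, 3, 2, 1...
const stack = new Stack();
[5, 4, 3, 2, 1].forEach((disk) => stack.push(disk));

// ...and they come out in the order 1, 2, 3, 4, 5.
const out = [stack.pop(), stack.pop(), stack.pop(), stack.pop(), stack.pop()];
console.log(out.join(', ')); // 1, 2, 3, 4, 5
```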

‎book/content/preface.asc

Lines changed: 3 additions & 3 deletions
@@ -3,15 +3,15 @@
33

44
=== What is in this book?
55

6-
_{doctitle}_ is a book that can be read from cover to cover, where each section builds on top of the previous one. Also, it can be used as a reference manual where developers can refresh specific topics before an interview or looking for ideas to solve a problem optimally. (Check out the <<a-time-complexity-cheatsheet#,Time Complexity Cheatsheet>> and <<index#, topical index>>)
6+
_{doctitle}_ is a book that can be read from cover to cover, where each section builds on top of the previous one. Also, it can be used as a reference manual where developers can refresh specific topics before an interview or look for ideas to solve a problem optimally. (Check out the <<a-time-complexity-cheatsheet#,Time Complexity Cheatsheet>> and <<index#, topical index>>)
77

8-
This publication is designed to be concise, intending to serve software developers looking to get a firm conceptual understanding of data structures in a quick yet in-depth fashion. After reading this book, the reader should have a fundamental knowledge of algorithms, including when and where to apply it, what are the trade-offs of using one data structure over the other. The reader will then be able to make intelligent decisions about algorithms and data structures in their projects require.
8+
This publication is designed to be concise, intending to serve software developers looking to get a firm conceptual understanding of data structures in a quick yet in-depth fashion. After reading this book, the reader should have a fundamental knowledge of algorithms, including when and where to apply them and the trade-offs of using one data structure over another. The reader will then be able to make intelligent decisions about algorithms and data structures in their projects.
99

1010
=== Who this book is for
1111

1212
This book is for software developers familiar with JavaScript looking to improve their problem-solving skills or preparing for a job interview.
1313

14-
NOTE: You can apply the concepts in this book to any programming language. However, instead of doing examples in pseudo-code we are going to use JavaScript to implement the code examples.
14+
NOTE: You can apply the concepts in this book to any programming language. However, instead of doing examples in pseudo-code, we are going to use JavaScript to implement the code examples.
1515

1616
=== What you need for this book
1717

‎book/part02-linear-data-structures.asc

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@
33

44
Data structures come in many flavors. There’s no single one to rule them all. You have to know the trade-offs so you can choose the right one for the job.
55

6-
Even though in your day-to-day, you might not need to re-implementing them, knowing how they work internally would help you how when to use over the other or even tweak them to create a new one. We are going to explore the most common data structures time and space complexity.
6+
Even though in your day-to-day work you might not need to re-implement them, knowing how they work internally will help you know when to use one over the other or even tweak them to create a new one. We are going to explore the most common data structures' time and space complexity.
77

88
.In this part we are going to learn about the following linear data structures:
99
- <<part02-linear-data-structures#array>>

‎package.json

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
11
{
22
"name": "dsa.js",
3-
"version": "1.3.9",
3+
"version": "1.3.10",
44
"description": "Data Structures & Algorithms in JS",
55
"author": "Adrian Mejia <hi+dsajs@adrianmejia.com> (https://adrianmejia.com)",
66
"homepage": "https://github.com/amejiarosario/dsa.js",

‎src/data-structures/queues/queue.js

Lines changed: 1 addition & 1 deletion
@@ -2,7 +2,7 @@ const LinkedList = require('../linked-lists/linked-list');
22

33
// tag::constructor[]
44
/**
5-
* Data structure where add and remove elements in a first-in, first-out (FIFO)
5+
* Data structure where we add and remove elements in a first-in, first-out (FIFO) fashion
66
*/
77
class Queue {
88
constructor() {
