
Commit 32cf3af

Committed Mar 28, 2019
improve first part
1 parent 8f519dc commit 32cf3af

12 files changed with 133 additions and 128 deletions
 

‎book/chapters/algorithmic-toolbox.adoc

Lines changed: 13 additions & 11 deletions
@@ -3,28 +3,30 @@
 Have you ever been given a programming problem and frozen without knowing where to start?
 Well, in this section we are going to give some tips, so you don't get stuck while coding.
 
-TIP: Don't start coding right away. First, solve the problem, then write the code.
+TIP: TL;DR: Don't start coding right away. First, solve the problem, then write the code. Make it work first, make it better later.
 
 .Steps to solve algorithmic problems
 . *Understand* the requirements. Reframe them in your own words.
 . Draw a *simple example* (no edge cases yet)
-. Brainstorm
+. *Brainstorm* possible solutions
 .. How would you solve this problem *manually*? (without a computer) Is there any formula or theorem you can use?
 .. Are there any heuristics (largest, smallest, best ratio), or can you spot a pattern to solve this problem using a <<Greedy Algorithms, greedy algorithm>>?
-.. Can you address the simple base case and generalize for other cases using a *recursive solution*?
+.. Can you address the simple base case and generalize for other cases using a *recursive solution*? Can you divide the problem into subproblems? Try <<Divide and Conquer>>.
 .. Do you have to generate multiple solutions or try different paths? Try <<Backtracking>>.
+.. List all the data structures you know that might solve this problem.
 .. If everything else fails, how would you solve it the dumbest way possible (brute force)? We can optimize it later.
-. Optimize the solution.
+. *Test* your algorithm idea with multiple examples
+. *Optimize* the solution. Only optimize once you have something working; don't try to do both at the same time!
+.. Can you trade off space for speed? Use a <<HashMap>> to speed up results! (See the sketch after these steps.)
+.. Do you have a bunch of recursive and overlapping problems? Try <<Dynamic Programming>>.
 .. Re-read the requirements and see if you can take advantage of anything. E.g., is the array sorted?
-.. Do you have a bunch of overlapping problems? Try <<Dynamic Programming>>.
-.. Can you trade-off space for speed? Use a <<HashMap, Map>> to speed up results
-. Test your algorithm with multiple examples
-. *Code*, yes, now you can code.
+. *Write Code*, yes, now you can code.
 .. Modularize your code with functions (don't do it all in one giant function please 🙏)
-. Test your code.
+.. Comment down edge cases, but don't address them until the basic cases are working.
+. *Test* your code.
 .. Choose a typical input and test against your code.
-.. Brainstorm about edge cases (empty, null values, overflows,
-.. How would scale your code?
+.. Brainstorm about edge cases (empty, null values, overflows, largest supported inputs)
+.. How would you scale your code beyond the current boundaries?
 
 These steps should get you going even with the toughest algorithmic problems.
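To make the "brute force first, then trade space for speed" steps above concrete, here is a minimal sketch (a hypothetical example, not taken from the book's code samples) that checks whether any two numbers in an array add up to a target value, first the dumb way and then with a set:

.Sketch: brute force first, then trade space for speed (hypothetical example)
[source, javascript]
----
// Brute force: try every pair. Runtime: O(n^2), Space: O(1).
function hasPairWithSumBruteForce(numbers, target) {
  for (let i = 0; i < numbers.length; i++) {
    for (let j = i + 1; j < numbers.length; j++) {
      if (numbers[i] + numbers[j] === target) return true;
    }
  }
  return false;
}

// Trade space for speed: remember the values seen so far in a Set.
// Runtime: O(n), Space: O(n).
function hasPairWithSum(numbers, target) {
  const seen = new Set();
  for (const number of numbers) {
    if (seen.has(target - number)) return true;
    seen.add(number);
  }
  return false;
}

hasPairWithSumBruteForce([2, 5, 1, 9, 6, 7], 10); // true (1 + 9)
hasPairWithSum([2, 5, 1, 9, 6, 7], 3); // true (2 + 1)
----

Both functions answer the same question; the second one does it in a single pass by paying with extra memory.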

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
 [partintro]
 --
-In this section we are going to cover the basics about algorithms analysis. Also, we are going to discuss eight of the most commmon runtimes of algorithms and provide a code example for each one.
+In this part, we are going to cover the basics of algorithms analysis. Also, we are going to discuss the most common runtimes of algorithms and provide a code example for each one.
 --

‎book/chapters/algorithms-analysis.adoc

Lines changed: 14 additions & 13 deletions
@@ -13,7 +13,7 @@ But why stop with the running time?
 We could also compare the memory "used" by different algorithms, and we call that *space complexity*.
 
 .In this chapter you will learn:
-- What’s the best way to measure your code performance.
+- What’s the best way to measure the performance of your code regardless of what hardware you use.
 - Learn how to use Big O notation to compare algorithms.
 - How to use algorithms analysis to improve your program's speed.
@@ -24,15 +24,17 @@ Before going deeper, into space and time complexity, let's cover the basics real
 Algorithms (as you might know) are steps of how to do some task. When you cook, you follow a recipe (or an algorithm) to prepare a dish. Let's say you want to make a pizza.
 
 .Example of an algorithm
-// [source, js] // undefined functions
+[source, javascript]
 ----
-function bakePizza(dough, toppins = ['cheese']) {
-  const heatedOven = heatOvenTo(550);
-  punchDown(dough);
-  rollOut(dough);
-  applyToppings(dough, toppings);
-  const pizza = heatedOven.bakePizza(dough)
-  return pizza;
+import { punchDown, rollOut, applyToppings, Oven } from '../pizza-utils';
+
+function makePizza(dough, toppings = ['cheese']) {
+  const oven = new Oven(450);
+  const punchedDough = punchDown(dough);
+  const rolledDough = rollOut(punchedDough);
+  const rawPizza = applyToppings(rolledDough, toppings);
+  const pizzaPromise = oven.bake(rawPizza, { minutes: 20 });
+  return pizzaPromise;
 }
 ----
 
@@ -44,8 +46,6 @@ TIP: Algorithms are instructions on how to perform a task.
 
 Not all algorithms are created equal. There are “good” and “bad” algorithms. The good ones are fast; the bad ones are slow. Slow algorithms cost more money to run. Inefficient algorithms could make some calculations impossible in our lifespan!
 
-Most algorithms are affected by the size of the input. Let's say you need to arrange numbers in ascending order. Sorting ten digits will naturally take much less time than sorting 2 million of them.
-
 To give you a clearer picture of how different algorithms perform as the input size grows, take a look at the following table.
 
 .Relationship between algorithm input size and time taken to complete
@@ -59,8 +59,9 @@ To give you a clearer picture of how different algorithms perform as the input size grows
 |Find all permutations of a string |4 sec. |> vigintillion years |> centillion years |∞ |∞
 |=============================================================================================
 
-indexterm:(((Permutations)))
-However, if you keep the input size constant, you can notice the difference between an efficient algorithm and a slow one. An excellent sorting algorithm is `mergesort` for instance, and inefficient algorithm for large inputs is `bubble sort` .
+Most algorithms are affected by the size of the input (`n`). Let's say you need to arrange numbers in ascending order. Sorting ten digits will naturally take much less time than sorting 2 million of them. But how much longer? Some algorithms take proportionally more time as the input size grows; we classify them as <<Linear, linear>> runtime [or `O(n)`]. Others might take quadratically longer; we call them <<Quadratic, quadratic>> running time [or `O(n^2^)`].
+
+If you keep the input size the same and run different algorithm implementations, you will notice the difference between an efficient algorithm and a slow one. An excellent sorting algorithm is `mergesort`, for instance, while an inefficient algorithm for large inputs is `bubble sort`.
 Organizing 1 million elements with merge sort takes 20 seconds while bubble sort takes 12 days, ouch!
 The amazing thing is that both programs are measured on the same hardware with the same data!
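To see the difference between a linear and a quadratic algorithm in code, compare these two small functions (hypothetical examples, not from the book's repository): both answer a question about an array of `n` elements, but the second one re-scans the array for every element.

.Sketch: linear vs. quadratic runtime (hypothetical example)
[source, javascript]
----
// Linear, O(n): one pass over the input.
function findMax(numbers) {
  let max = -Infinity;
  for (const n of numbers) {
    if (n > max) max = n;
  }
  return max;
}

// Quadratic, O(n^2): for every element we scan the rest of the array again.
function hasDuplicates(numbers) {
  for (let i = 0; i < numbers.length; i++) {
    for (let j = i + 1; j < numbers.length; j++) {
      if (numbers[i] === numbers[j]) return true;
    }
  }
  return false;
}

findMax([2, 5, 1, 9, 6, 7]); // 9
hasDuplicates([2, 5, 1, 9, 6, 7]); // false
----

Doubling the input roughly doubles the work of `findMax`, but roughly quadruples the work of `hasDuplicates`.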

‎book/chapters/array.adoc

Lines changed: 62 additions & 25 deletions
@@ -1,11 +1,13 @@
 = Array
 (((Array)))
 (((Data Structures, Linear, Array)))
-Arrays are one of the most used data structures. You probably have used it a lot but are you aware of the runtimes of `splice`, `shift` and other operations? In this chapter, we are going deeper into the most common operations and their runtimes.
+Arrays are one of the most used data structures. You have probably used them a lot, but are you aware of the runtimes of `splice`, `shift`, `indexOf` and other operations? In this chapter, we are going deeper into the most common operations and their runtimes.
 
 == Array Basics
 
-An array is a collection of things (strings, characters, numbers, objects, etc.). They can be many or zero. Strings are a collection of Unicode characters and most of the array concepts apply to them.
+An array is a collection of things (strings, characters, numbers, objects, etc.). They can be many or zero.
+
+TIP: Strings are a collection of Unicode characters and most of the array concepts apply to them.
 
 .Fixed vs. Dynamic Size Arrays
 ****
@@ -34,7 +36,7 @@ const array0 = [];
 array0[2] = 1;
 ----
 
-Using the index, you can replace whatever value you want.
+Using the index, you can replace whatever value you want. The runtime is constant: _O(1)_.
 
 === Inserting at the beginning of the array
 
@@ -43,14 +45,18 @@ What if you want to insert a new element at the beginning of the array? You would
 .Insert to head
 [source, javascript]
 ----
-array.unshift(0); //=> [0, 2, 5, 1, 9, 6, 7]
+const array = [2, 5, 1, 9, 6, 7];
+array.unshift(0); // ↪️ 7
+// array: [0, 2, 5, 1, 9, 6, 7]
 ----
 
 As you can see, `2` was at index 0; now it was pushed to index 1, and everything else moved one place. `unshift` takes *O(n)* since it affects all the elements in the array.
 
 .JavaScript built-in `array.unshift`
 ****
-The `unshift()` method adds one or more elements to the beginning of an array and returns the new length of the array. Runtime: O(n).
+The `unshift()` method adds one or more elements to the beginning of an array and returns the new length of the array.
+
+Runtime: O(n).
 ****
 
 === Inserting at the middle of the array
@@ -60,15 +66,19 @@ Inserting a new element in the middle involves moving part of the array but not
 .Inserting element in the middle
 [source, javascript]
 ----
-array.splice(1, 0, 111); // <1>
+const array = [2, 5, 1, 9, 6, 7];
+array.splice(1, 0, 111); // ↪️ [] <1>
+// array: [2, 111, 5, 1, 9, 6, 7]
 ----
-<1> at the position 1, delete 0 elements and insert 111. The array would be `[2, 111, 5, 1, 9, 6, 7]`
+<1> at position `1`, delete `0` elements and insert `111`.
 
 The Big O for this operation would be *O(n)* since in the worst case it would move most of the elements to the right.
 
 .JavaScript built-in `array.splice`
 ****
-The `splice()` method changes the contents of an array by removing existing elements and/or adding new elements. Runtime: O(n).
+The `splice()` method changes the contents of an array by removing existing elements and/or adding new elements. It returns an array containing the deleted elements.
+
+Runtime: O(n).
 ****
 
 === Inserting at the end of the array
@@ -79,15 +89,18 @@ We can push new values to the end of the array like this:
 [source, javascript]
 ----
 const array = [2, 5, 1, 9, 6, 7];
-array.push(4); // <1>
+array.push(4); // ↪️ 7 <1>
+// array: [2, 5, 1, 9, 6, 7, 4]
 ----
-<1> The `4` element would be pushed to the end `[2, 5, 1, 9, 6, 7, 4]`.
+<1> The `4` element would be pushed to the end of the array. Notice that `push` returns the new length of the array.
 
 Adding to the tail of the array doesn’t change other indexes. E.g., element 2 is still at index 0. So, this is a constant time operation *O(1)*.
 
 .JavaScript built-in `array.push`
 ****
-The `push()` method adds one or more elements to the end of an array and returns the new length of the array. Runtime: O(1).
+The `push()` method adds one or more elements to the end of an array and returns the new length of the array.
+
+Runtime: O(1).
 ****
 
 == Searching by value and index
@@ -98,7 +111,7 @@ Searching by index is very easy using the `[]` operator:
 [source, javascript]
 ----
 const array = [2, 5, 1, 9, 6, 7];
-array[4]; //↪️ 6
+array[4]; // ↪️ 6
 ----
 
 Searching by index takes a constant time, *O(1)*, to retrieve values out of the array. If we want to get fancier, we can create a function:
@@ -111,7 +124,7 @@ Searching by index takes a constant time, *O(1)*, to retrieve values out of the
 include::{codedir}/data-structures/arrays/array.js[tag=searchByIndex]
 ----
 
-Finding out if an element is in the array or not is a different story.
+Finding out if a value is in the array or not is a different story.
 
 // image:image18.png[image,width=528,height=338]
 
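The search-by-value implementation referenced by the book lives in the included `array.js` file and is not visible in this diff. As a rough sketch of the idea (an assumption, not the book's exact code), searching by value means scanning the array until a match is found, which is _O(n)_:

.Sketch: searching an array by value (assumed implementation)
[source, javascript]
----
// Linear search by value: in the worst case we visit every element, O(n).
function searchByValue(array, value) {
  for (let index = 0; index < array.length; index++) {
    if (array[index] === value) return index;
  }
  return -1; // not found
}

const array = [2, 5, 1, 9, 6, 7];
searchByValue(array, 9); // ↪️ 3
array.indexOf(9); // ↪️ 3 (the built-in does the same linear scan)
----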
@@ -129,32 +142,40 @@ Deleting (similar to insertion) there are three possible scenarios, removing at
 
 === Deleting element from the beginning
 
-Deleting from the beginning can be done using the `splice` function and also the `shift`. Let’s use the `shift` since it’s simpler.
+Deleting from the beginning can be done using the `splice` function and also the `shift`. For simplicity, we will use the latter.
 
 .Deleting from the beginning of the array.
 [source, javascript]
 ----
-array.shift(); //=> [5, 1, 9, 6, 7]
+const array = [2, 111, 5, 1, 9, 6, 7];
+// Deleting from the beginning of the array.
+array.shift(); // ↪️ 2
+array.shift(); // ↪️ 111
+// array: [5, 1, 9, 6, 7]
 ----
 
-As expected, this will make every index to change, so this takes *O(n)*.
+As expected, this will change every index, so this takes *O(n)*.
 
 .JavaScript built-in array.shift
 ****
-The `shift()` method removes the first element from an array and returns that removed element. This method changes the length of the array. Runtime: O(n).
-****
+The `shift()` method shifts all elements to the left. In turn, it removes the first element from the array and returns that removed element. This method changes the length of the array.
 
+Runtime: O(n).
+****
 
 === Deleting element from the middle
 
-We can use the splice operator for this.
+We can use the `splice` method for deleting an item from the middle of an array.
 
 .Deleting from the middle
 [source, javascript]
 ----
-array.splice(2, 1); // delete 1 element at position 2
-// => array: [2, 5, 9, 6, 7]
+const array = [0, 1, 2, 3, 4];
+// Deleting from the middle
+array.splice(2, 1); // ↪️ [2] <1>
+// array: [0, 1, 3, 4]
 ----
+<1> delete 1 element at position 2
 
 Deleting from the middle might cause most of the elements of the array to move back one position to fill in for the eliminated item. Thus, runtime: O(n).
 
@@ -165,25 +186,41 @@ Removing the last element is very straightforward:
 .Deleting last element from the array
 [source, javascript]
 ----
-array.pop(); // => array: [2, 5, 1, 9, 6]
+const array = [2, 5, 1, 9, 111];
+array.pop(); // ↪️ 111
+// array: [2, 5, 1, 9]
 ----
 
 No other element has been shifted, so it’s an _O(1)_ runtime.
 
 .JavaScript built-in `array.pop`
 ****
-The `pop()` method removes the last element from an array and returns that element. This method changes the length of the array. Runtime: O(1).
+The `pop()` method removes the last element from an array and returns that element. This method changes the length of the array.
+
+Runtime: O(1).
 ****
 
 == Array Complexity
 
 To sum up, the time complexity of an array is:
 
-.Time complexity for the array operations
+.Time/Space complexity for the array operations
 |===
-.2+.^s| Data Structure 2+^s| Searching By 3+^s| Inserting at the 3+^s| Deleting from .2+.^s| Space Complexity
+.2+.^s| Data Structure 2+^s| Searching By 3+^s| Inserting at the 3+^s| Deleting from .2+.^s| Space
 ^|_Index/Key_ ^|_Value_ ^|_beginning_ ^|_middle_ ^|_end_ ^|_beginning_ ^|_middle_ ^|_end_
 | Array ^|O(1) ^|O(n) ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(n) ^|O(1) ^|O(n)
 |===
 (((Linear)))
 (((Runtime, Linear)))
+(((Constant)))
+(((Runtime, Constant)))
+
+.Array Operations
+|===
+| Operation | Time Complexity | Usage
+| push ^| O(1) | Insert element to the right side.
+| pop ^| O(1) | Remove the rightmost element.
+| unshift ^| O(n) | Insert element to the left side.
+| shift ^| O(n) | Remove leftmost element.
+| splice ^| O(n) | Insert and remove from anywhere.
+|===

‎book/chapters/chapter2.adoc

Lines changed: 4 additions & 0 deletions
@@ -10,15 +10,19 @@ include::linear-data-structures-intro.adoc[]
 include::array.adoc[]
 
 // (g)
+<<<
 include::linked-list.adoc[]
 
 // (g)
+<<<
 include::stack.adoc[]
 
 // (g)
+<<<
 include::queue.adoc[]
 
 // (g)
+<<<
 include::linear-data-structures-outro.adoc[]
 
 :leveloffset: -1

‎book/chapters/chapter3.adoc

Lines changed: 1 addition & 1 deletion
@@ -59,6 +59,6 @@ include::graph-search.adoc[]
 // Graph summary
 = Summary
 
-In this section, we learned about Graphs applications, properties and how we can implement them. We mention that you can represent a graph as a matrix or as a list of adjacencies. We went for implementing the later since it's more space efficient. We cover the basic graph operations like adding and removing nodes and edges. In the algorithms section, we are going to cover searching values in the graph.
+In this section, we learned about graph applications and properties and how we can create them. We mentioned that you can represent a graph as a matrix or as a list of adjacencies. We went for implementing the latter since it's more space efficient. We covered the basic graph operations, like adding and removing nodes and edges. In the algorithms section, we are going to cover searching for values in the graph.
 
 :leveloffset: -1

‎book/chapters/linear-data-structures-intro.adoc

Lines changed: 8 additions & 6 deletions
@@ -1,14 +1,16 @@
 [partintro]
 --
-Data Structures comes in many flavors. There’s no one to rule them all. There are tradeoffs for each one of them. Even thought in your day-to-day, you might not need to re-implementing them, knowing how they work internally would help you choose the right tool for the job. We are going to explore the most common data structures time and space complexity.
+Data Structures come in many flavors. There’s no one to rule them all. You have to know the tradeoffs so you can choose the right one for the job.
+
+Even though in your day-to-day work you might not need to re-implement them, knowing how they work internally will help you know when to use one over the other, or even tweak them to create a new one. We are going to explore the most common data structures' time and space complexity.
 
 .In this part we are going to learn about the following linear data structures:
-- Array
-- Linked List
-- Stack
-- Queue
+- <<Array>>
+- <<Linked List>>
+- <<Stack>>
+- <<Queue>>
 
-Later, in the next part we are going to explore non-linear data structures like Graphs and Trees.
+Later, in the next part, we are going to explore non-linear data structures like <<Graph, Graphs>> and <<Tree, Trees>>.
 
 ifdef::backend-html5[]
 If you want to have a general overview of each one, take a look at the following interactive diagram:

‎book/chapters/linear-data-structures-outro.adoc

Lines changed: 3 additions & 6 deletions
@@ -1,9 +1,6 @@
-
 = Array vs. Linked List & Queue vs. Stack
 
-In this chapter, we explored the most used linear data structures such as Arrays, Linked Lists, Stacks and Queues. We implemented them and discussed the runtime of their operations.
-
-To sum up,
+In this part of the book, we explored the most used linear data structures such as Arrays, Linked Lists, Stacks and Queues. We implemented them and discussed the runtime of their operations.
 
 .Use Arrays when…
 * You need to access data in random order fast (using an index).
@@ -22,9 +19,9 @@ To sum up,
 * You need to access your data as last-in, first-out (LIFO).
 * You need to implement a <<Depth-First Search for Binary Tree, Depth-First Search>>
 
-.Time Complexity of Linear Data Structures (Array, LinkedList, Stack & Queues)
+.Time/Space Complexity of Linear Data Structures (Array, LinkedList, Stack & Queues)
 |===
-.2+.^s| Data Structure 2+^s| Searching By 3+^s| Inserting at the 3+^s| Deleting from .2+.^s| Space Complexity
+.2+.^s| Data Structure 2+^s| Searching By 3+^s| Inserting at the 3+^s| Deleting from .2+.^s| Space
 ^|_Index/Key_ ^|_Value_ ^|_beginning_ ^|_middle_ ^|_end_ ^|_beginning_ ^|_middle_ ^|_end_
 | <<Array>> ^|O(1) ^|O(n) ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(n) ^|O(1) ^|O(n)
 | <<Singly Linked List>> ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(1) ^|O(1) ^|O(n) ^|*O(n)* ^|O(n)

‎book/chapters/linked-list.adoc

Lines changed: 15 additions & 22 deletions
@@ -2,48 +2,49 @@
 (((Linked List)))
 (((List)))
 (((Data Structures, Linear, Linked List)))
-A list (or Linked List) is a linear data structure where each node is linked to another one.
+A list (or Linked List) is a linear data structure where each node is "linked" to the next.
 
-Linked Lists can be:
+.Linked Lists can be:
 - Singly: every item has a pointer to the next node
 - Doubly: every node has a reference to the next and previous object
 - Circular: the last element points to the first one.
-We are going to explore the first two in the next sections.
+
 
 == Singly Linked List
 
-Each element or node is *linked* to the next one by a reference. When a node only has the reference to the next element, it's called *singly linked list*:
+Each element or node is *connected* to the next one by a reference. When a node only has one connection, it's called a *singly linked list*:
 
 .Singly Linked List Representation: each node has a reference (blue arrow) to the next one.
 image:image19.png[image,width=498,height=97]
 
-
-Usually, a Linked List is referenced by the first element in called *head* (or *root* node). For instance, if you want to get the `cat` element from the example above, then the only way to get there is using the next field on the head node. You would get `art` first, then use the next field recursively until you eventually get the `cat` element.
+Usually, a Linked List is referenced by its first element, called the *head* (or *root*) node. For instance, if you want to get the `cat` element from the example above, then the only way to get there is using the `next` field on the head node. You would get `art` first, then use the next field recursively until you eventually get the `cat` element.
 
 == Doubly Linked List
 
-When each node has a reference to the next item and also the previous one, then we have a *doubly linked list*.
+When each node has a connection to the `next` item and also the `previous` one, then we have a *doubly linked list*.
 
 .Doubly Linked List: each node has a reference to the next and previous element.
 image:image20.png[image,width=528,height=74]
 
+With a doubly linked list, you can not only move forward but also backward. If you keep a reference to the last element (`cat`), you can step back and reach the middle part.
+
 If we implement the code for the `Node` elements, it would be something like this:
 
 // image:image21.png[image,width=528,height=285]
 
-.Linked List Node
+.Linked List Node Implementation
 [source, javascript]
 ----
 include::{codedir}/data-structures/linked-lists/node.js[tag=snippet]
 ----
 
 == Linked List vs. Array
 
-Arrays allow you to access data anywhere in the collection using an index. However, Linked List visits nodes in sequential order. In the worst case scenario, it takes _O(n)_ to get an element from a Linked List. You might be wondering: Isn’t always an array more efficient with O(1) access time? It depends.
+Arrays allow you to access data anywhere in the collection using an index. However, a Linked List visits nodes in sequential order. In the worst case scenario, it takes _O(n)_ to get an element from a Linked List. You might be wondering: isn’t an array always more efficient with _O(1)_ access time? It depends.
 
-We also have to understand the space complexity to see the trade-offs between arrays and linked lists. An array pre-allocates contiguous blocks of memory. When the array is getting full, it has to copy all the elements over to a new space usually 2x bigger. It takes _O(n)_ to copy all the items over. On the other hand, LinkedList’s nodes only reserve precisely the amount of memory it needs. They don’t have to be next to each other, nor large chunks of memory have to be booked beforehand like arrays. Linked List is more on a "grow as you go" basis.
+We also have to understand the space complexity to see the trade-offs between arrays and linked lists. An array pre-allocates contiguous blocks of memory. When it is getting full, it has to create a bigger array (usually 2x) and copy all the elements. It takes _O(n)_ to copy all the items over. On the other hand, a Linked List's nodes only reserve precisely the amount of memory they need. They don’t have to be next to each other, nor do large chunks of memory have to be booked beforehand like arrays. A Linked List is more on a "grow as you go" basis.
 
-Another difference is that adding/deleting at the beginning on an array takes O(n), however, in the linked list is a constant operation O(1) as we will implement later.
+Another difference is that adding/deleting at the beginning of an array takes O(n); however, in the linked list it is a constant operation O(1), as we will implement later.
 
 A drawback of a linked list is that if you want to insert/delete an element at the end of the list, you would have to navigate the whole collection to find the last one O(n). However, this can be solved by keeping track of the last element in the list. We are going to implement that!
 
@@ -62,7 +63,7 @@ include::{codedir}/data-structures/linked-lists/linked-list.js[tag=constructor]
 }
 ----
 
-In our constructor, we keep a reference of the first (and last node for performance reasons).
+In our constructor, we keep a reference to the `first` and also the `last` node for performance reasons.
 
 == Searching by value
 
@@ -77,6 +78,7 @@ include::{codedir}/data-structures/linked-lists/linked-list.js[tag=searchByValue]
 If we find the element, we will return the index; otherwise, `undefined`. The runtime for locating an item by value is _O(n)_.
 
 For finding elements by value or position we are using the following helper function:
+
 .Find elements using a callback
 [source, javascript]
 ----
@@ -134,9 +136,6 @@ Appending an element at the end of the list can be done very effectively if we have
 .Add element to the end of the linked list
 image:image24.png[image,width=498,height=208]
 
-
-In code:
-
 .Linked List's add to the end of the list implementation
 [source, javascript]
 ----
@@ -155,8 +154,6 @@ Let’s do an example, with a doubly linked list. We want to insert the `new` node
 .Inserting node in the middle of a doubly linked list.
 image:image25.png[image,width=528,height=358]
 
-Let’s work in the code to do this:
-
 .Linked List's add to the middle of the list
 [source, javascript]
 ----
@@ -186,8 +183,6 @@ image:image26.png[image,width=528,height=74]
 
 For instance, to remove the head (“art”) node, we change the variable `first` to point to the second node “dog”. We also remove the variable `previous` from the "dog" node, so it doesn't point to the “art” node. The garbage collector will get rid of the “art” node when it sees nothing is using it anymore.
 
-In code, it looks like this:
-
 .Linked List's remove from the beginning of the list
 [source, javascript]
 ----
@@ -206,8 +201,6 @@ image:image27.png[image,width=528,height=221]
 
 For instance, if we want to remove the last node “cat”, we use the `last` pointer to avoid iterating through the whole list. We check `last.previous` to get the “dog” node, make it the new `last`, and remove its next reference to “cat”. Since nothing is pointing to “cat”, it is out of the list and eventually deleted from memory by the garbage collector.
 
-Let’s code this up like this:
-
 .Linked List's remove from the end of the list
 [source, javascript]
 ----
@@ -244,7 +237,7 @@ So far, we have seen two linear data structures with different use cases. Here’s
 
 .Big O cheat sheet for Linked List and Array
 |===
-.2+.^s| Data Structure 2+^s| Searching By 3+^s| Inserting at the 3+^s| Deleting from .2+.^s| Space Complexity
+.2+.^s| Data Structure 2+^s| Searching By 3+^s| Inserting at the 3+^s| Deleting from .2+.^s| Space
 ^|_Index/Key_ ^|_Value_ ^|_beginning_ ^|_middle_ ^|_end_ ^|_beginning_ ^|_middle_ ^|_end_
 | Array ^|O(1) ^|O(n) ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(n) ^|O(1) ^|O(n)
 | Linked List (singly) ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(1) ^|O(1) ^|O(n) ^|*O(n)* ^|O(n)
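As a sketch of the remove-from-the-end logic described above (assuming a doubly linked list that keeps `first` and `last` references, as in this chapter; the actual code in the included `linked-list.js` may differ), the whole trick is that `last.previous` gives us the new tail in constant time:

.Sketch: removing the last node of a doubly linked list (assumed shape)
[source, javascript]
----
// Remove the last node in O(1) by using the `last` reference.
function removeLast(list) {
  const removed = list.last;
  if (!removed) return undefined; // empty list

  if (removed.previous) {
    list.last = removed.previous; // "dog" becomes the new last
    list.last.next = null; // nothing points to "cat" anymore
    removed.previous = null;
  } else {
    list.first = null; // the list had a single node
    list.last = null;
  }
  return removed.value;
}

// art <-> dog <-> cat, built with plain objects so it runs standalone
const art = { value: 'art', previous: null, next: null };
const dog = { value: 'dog', previous: art, next: null };
const cat = { value: 'cat', previous: dog, next: null };
art.next = dog;
dog.next = cat;
const list = { first: art, last: cat };

removeLast(list); // ↪️ 'cat'
list.last.value; // ↪️ 'dog'
----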

‎book/chapters/queue.adoc

Lines changed: 7 additions & 40 deletions
@@ -1,12 +1,14 @@
 = Queue
 (((Queue)))
 (((Data Structures, Linear, Queue)))
+(((First-In First-out)))
+(((FIFO)))
 A queue is a linear data structure where the data flows in a *First-In-First-Out* (FIFO) manner.
 
 .Queue data structure is like a line of people: the First-in, is the First-out
 image:image30.png[image,width=528,height=171]
 
-A queue is like a line of people at the bank, the person that arrived first is the first to go out as well.
+A queue is like a line of people at the bank; the person that arrived first is the first to go out as well.
 
 Similar to the stack, we only have two operations (insert and remove). In a Queue, we add elements to the back of the list and remove them from the front.
 
@@ -23,7 +25,7 @@ include::{codedir}/data-structures/queues/queue.js[tag=constructor]
 We initialize the Queue by creating a linked list. Now, let’s add the `enqueue` and `dequeue` methods.
 
 == Insertion
-
+(((Enqueue)))
 For inserting elements into a queue, also known as *enqueue*, we add items to the back of the list using `addLast`:
 
 .Queue's enqueue
@@ -35,7 +37,7 @@ include::{codedir}/data-structures/queues/queue.js[tag=enqueue, indent=0]
 As discussed, this operation has a constant runtime.
 
 == Deletion
-
+(((Dequeue)))
 For removing elements from a queue, also known as *dequeue*, we remove elements from the front of the list using `removeFirst`:
 
 .Queue's dequeue
@@ -63,46 +65,11 @@ You can see that the items are dequeued in the same order they were added, FIFO (first in, first out).
 As an experiment, we can see in the following table that if we had implemented the Queue using an array, its enqueue time would be _O(n)_ instead of _O(1)_. Check it out:
 
 
-.Time complexity for queue operations
+.Time/Space complexity for queue operations
 |===
-.2+.^s| Data Structure 2+^s| Searching By 3+^s| Inserting at the 3+^s| Deleting from .2+.^s| Space Complexity
+.2+.^s| Data Structure 2+^s| Searching By 3+^s| Inserting at the 3+^s| Deleting from .2+.^s| Space
 ^|_Index/Key_ ^|_Value_ ^|_beginning_ ^|_middle_ ^|_end_ ^|_beginning_ ^|_middle_ ^|_end_
 | Queue (w/array) ^|- ^|- ^|- ^|- ^|*O(n)* ^|- ^|- ^|O(1) ^|O(n)
 | Queue (w/list) ^|- ^|- ^|- ^|- ^|O(1) ^|- ^|- ^|O(1) ^|O(n)
 |===
 indexterm:[Runtime, Linear]
-
-= Summary
-
-In this chapter, we explored the most used linear data structures such as Arrays, Linked Lists, Stacks and Queues. We implemented them and discussed the runtime of their operations.
-
-To sum up,
-
-.Use Arrays when…
-* You need to access data in random order fast (using an index).
-* Your data is multi-dimensional (e.g., matrix, tensor).
-
-.Use Linked Lists when:
-* You will access your data sequentially.
-* You want to save memory and only allocate memory as you need it.
-* You want constant time to remove/add from extremes of the list.
-
-.Use a Queue when:
-* You need to access your data in a first-come, first served basis (FIFO).
-* You need to implement a <<Breadth-First Search for Binary Tree, Breadth-First Search>>
-
-.Use a Stack when:
-* You need to access your data as last-in, first-out (LIFO).
-* You need to implement a <<Depth-First Search for Binary Tree, Depth-First Search>>
-
-.Time Complexity of Linear Data Structures (Array, LinkedList, Stack & Queues)
-|===
-.2+.^s| Data Structure 2+^s| Searching By 3+^s| Inserting at the 3+^s| Deleting from .2+.^s| Space Complexity
-^|_Index/Key_ ^|_Value_ ^|_beginning_ ^|_middle_ ^|_end_ ^|_beginning_ ^|_middle_ ^|_end_
-| <<Array>> ^|O(1) ^|O(n) ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(n) ^|O(1) ^|O(n)
-| <<Singly Linked List>> ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(1) ^|O(1) ^|O(n) ^|*O(n)* ^|O(n)
-| <<Doubly Linked List>> ^|O(n) ^|O(n) ^|O(1) ^|O(n) ^|O(1) ^|O(1) ^|O(n) ^|*O(1)* ^|O(n)
-| <<Stack>> ^|- ^|- ^|- ^|- ^|O(1) ^|- ^|- ^|O(1) ^|O(n)
-| Queue (w/array) ^|- ^|- ^|- ^|- ^|*O(n)* ^|- ^|- ^|O(1) ^|O(n)
-| <<Queue>> (w/list) ^|- ^|- ^|- ^|- ^|O(1) ^|- ^|- ^|O(1) ^|O(n)
-|===
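The chapter's Queue wraps the LinkedList's `addLast` and `removeFirst`, whose code lives in the included `queue.js` and is not shown here. The sketch below illustrates the same idea with a tiny inlined node-based list so it runs standalone (a hypothetical example; the repository's implementation may differ):

.Sketch: a queue with O(1) enqueue/dequeue (hypothetical, self-contained example)
[source, javascript]
----
// A queue backed by a singly linked list with head/tail references,
// so both enqueue (add to the back) and dequeue (remove from the front) are O(1).
class Queue {
  constructor() {
    this.first = null; // next node to dequeue
    this.last = null; // most recently enqueued node
  }

  enqueue(value) {
    const node = { value, next: null };
    if (this.last) this.last.next = node;
    else this.first = node;
    this.last = node;
    return this;
  }

  dequeue() {
    if (!this.first) return undefined;
    const { value } = this.first;
    this.first = this.first.next;
    if (!this.first) this.last = null;
    return value;
  }
}

const queue = new Queue();
queue.enqueue('a').enqueue('b').enqueue('c');
queue.dequeue(); // ↪️ 'a' (first in, first out)
queue.dequeue(); // ↪️ 'b'
----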

‎book/chapters/stack.adoc

Lines changed: 4 additions & 2 deletions
@@ -1,6 +1,8 @@
 = Stack
 (((Stack)))
 (((Data Structures, Linear, Stack)))
+(((Last-In First-out)))
+(((LIFO)))
 The stack is a data structure that restricts the way you add and remove data. It only allows you to insert and retrieve in a *Last-In-First-Out* (LIFO) fashion.
 
 An analogy is to think of the stack as a rod and the data as discs. You can only take out the last one you put in.
@@ -66,9 +68,9 @@ As you can see, if we add new items they will be the first to go out to honor LIFO.
 
 Implementing the stack with an array and a linked list would lead to the same time complexity:
 
-.Time complexity for the stack operations
+.Time/Space complexity for the stack operations
 |===
-.2+.^s| Data Structure 2+^s| Searching By 3+^s| Inserting at the 3+^s| Deleting from .2+.^s| Space Complexity
+.2+.^s| Data Structure 2+^s| Searching By 3+^s| Inserting at the 3+^s| Deleting from .2+.^s| Space
 ^|_Index/Key_ ^|_Value_ ^|_beginning_ ^|_middle_ ^|_end_ ^|_beginning_ ^|_middle_ ^|_end_
 | Stack ^|- ^|- ^|- ^|- ^|O(1) ^|- ^|- ^|O(1) ^|O(n)
 |===
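Because a stack only touches one end, even an array-backed version reaches the O(1) insertion and removal shown in the table. Here is a minimal sketch (a hypothetical example; the book's own stack code, not shown in this diff, may differ):

.Sketch: an array-backed stack (hypothetical example)
[source, javascript]
----
// Array-backed stack: push/pop only touch the end of the array, O(1).
class Stack {
  constructor() {
    this.items = [];
  }

  push(item) { // insert on top
    this.items.push(item);
    return this;
  }

  pop() { // remove the top item (last in, first out)
    return this.items.pop();
  }

  get size() {
    return this.items.length;
  }
}

const stack = new Stack();
stack.push('a').push('b').push('c');
stack.pop(); // ↪️ 'c' (the last one in is the first one out)
stack.size; // ↪️ 2
----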

‎src/data-structures/linked-lists/linked-list.js

Lines changed: 1 addition & 1 deletion
@@ -148,7 +148,7 @@ class LinkedList {
 
   // tag::find[]
   /**
-   * Iterate through the list until callback returns thruthy
+   * Iterate through the list until callback returns a truthy value
   * @example see #get and #indexOf
   * @param {Function} callback evaluates current node and index.
   * If any value other than undefined is returned, it will stop the search.
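For context, the callback contract documented above (called with the current node and index; the walk stops as soon as something other than `undefined` is returned) can be illustrated with a standalone sketch. This is an assumed, simplified version of the idea, not the `find` method in `linked-list.js`:

.Sketch: the find-with-callback idea (assumed, simplified version)
[source, javascript]
----
// Walk the list, call the callback with (node, index), and return the first
// defined value the callback produces.
function find(first, callback) {
  for (let node = first, index = 0; node; node = node.next, index++) {
    const result = callback(node, index);
    if (result !== undefined) return result;
  }
  return undefined;
}

// art -> dog -> cat
const cat = { value: 'cat', next: null };
const dog = { value: 'dog', next: cat };
const art = { value: 'art', next: dog };

// indexOf-style usage: return the index when the value matches.
find(art, (node, index) => (node.value === 'dog' ? index : undefined)); // ↪️ 1
// get-style usage: return the value at a given position.
find(art, (node, index) => (index === 2 ? node.value : undefined)); // ↪️ 'cat'
----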

0 commit comments
