In this section, we are going to cover the basics of algorithm analysis. We are also going to discuss eight of the most common runtimes of algorithms.
Chances are you are reading this book because you want to write better and faster code.
How can you do that? Can you time how long it takes to run a program? Of course, you can!
[big]#⏱#
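One simple way to do it is with the timing utilities built into JavaScript. Below is a minimal sketch using `console.time`; the `findMin` label and the random array are placeholders for your own code, not examples from this book.

[source, js]
----
// A quick-and-dirty way to time a piece of code in Node.js or the browser.
const numbers = Array.from({ length: 1e6 }, () => Math.random());

console.time('findMin');        // start a labeled timer
let min = numbers[0];
for (const n of numbers) {
  if (n < min) min = n;
}
console.timeEnd('findMin');     // prints something like "findMin: 4.81ms"
----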
However, if you run the same program on a smartwatch, a cellphone, or a desktop computer, it will give you very different times.
image:image3.png[image,width=528,height=137]
Wouldn't it be great if we could compare algorithms regardless of the hardware we run them on?
That's what *time complexity* is for!
But why stop at the running time?
We could also compare the memory used by different algorithms, and we call that *space complexity*.
.In this chapter you will learn:
- The best way to measure your code's performance.
- How to use Big O notation to compare algorithms.
- How to use algorithm analysis to improve your program's speed.
Before going deeper into space and time complexity, let's quickly cover the basics.
== What are Algorithms?
Algorithms (as you might know) are the steps to accomplish a task. When you cook, you follow a recipe (or an algorithm) to prepare a dish. Let's say you want to make a pizza.
.Example of an algorithm
//[source, js]
----
function bakePizza(dough, toppings = []) {
  // ... steps omitted here: stretch the dough, add the sauce and toppings, bake ...
}

bakePizza(new Dough, ['ham', 'cheese']);
----
If you play a game, you are devising strategies (or algorithms) to help you win. Likewise, algorithms in computers are a set of instructions used to solve a problem.
TIP: Algorithms are instructions on how to perform a task.
== Comparing Algorithms
Not all algorithms are created equal. There are “good” and “bad” algorithms. The good ones are fast; the bad ones are slow. Slow algorithms cost more money to run, and inefficient ones could even make some calculations impossible to finish in our lifetime!
Most algorithms are affected by the size of the input. Let's say you need to arrange numbers in ascending order. Sorting ten numbers will naturally take much less time than sorting two million of them.
To give you a clearer picture of how different algorithms perform as the input size grows, take a look at the following table.
.Relationship between algorithm input size and time taken to complete
[cols=",,,,,",options="header",]
|===
// ... (header row and smaller input sizes omitted in this excerpt) ...
|Find all permutations of a string |4 sec. |> vigintillion years |> centillion years |∞ |∞
|===
However, for the same input size, you can notice the difference between an efficient algorithm and a slow one. An excellent sorting algorithm is `merge sort`, for instance, while `bubble sort` is inefficient for large inputs.
Organizing 1 million elements with merge sort takes 20 seconds while bubble sort takes 12 days, ouch!
The amazing thing is that both programs are measured on the same hardware with the same data!
After completing this book, you are going to *think differently*.
You will be able to scale your programs while you are designing them.
The first step to improving your code's performance is to measure it. As H. J. Harrington said:
[quote, H. J. Harrington]
Measurement is the first step that leads to control and eventually to improvement. If you can’t measure something, you can’t understand it. If you can’t understand it, you can’t control it. If you can’t control it, you can’t improve it.
In this section, we are going to learn the basics of measuring our current code's performance and comparing it with other algorithms.
=== Calculating Time Complexity
Time complexity, in computer science, is a function that describes the number of operations a program will execute given the size of the input `n`.
How do we get a function that gives us the number of operations that will be executed? Well, we count line by line, paying special attention to code inside loops. Let's work through an example: suppose we have a function called `getMin` that finds the minimum value in an array.
.Translating lines of code to an approximate number of operations
image:image4.png[Operations per line]
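In case the image doesn't render, here is a sketch of what such a `getMin` function might look like, with rough operation counts per line. This is an assumed implementation for illustration; the book's actual code (shown in the image above) may differ slightly.

[source, js]
----
function getMin(array) {
  let min = array[0];                        // 1 operation
  for (let i = 1; i < array.length; i++) {   // 1 initialization + ~n loop checks
    if (array[i] < min) {                    // ~n comparisons
      min = array[i];                        // up to ~n assignments (worst case)
    }
  }
  return min;                                // 1 operation
}
// Adding those up gives roughly 3n + 3 operations for an array of size n.
----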
Assuming that each line of code is an operation, we get the following:
_3n + 3_
`n` = input size.
That means that if we give it an array of 3 elements, e.g. `getMin([3, 2, 9])`, it will execute around _3(3) + 3 = 12_ operations. Of course, this is not exact. For instance, line 12 is only executed if the condition on line 11 is met. As you will learn in the next section, we want to get the big picture and get rid of the smaller terms so that comparing algorithms is easier.
== Space Complexity
Space complexity is similar to time complexity. However, instead of counting the operations executed, it accounts for the amount of memory used in addition to the input.
To calculate the *space complexity*, we keep track of the “variables” and memory used. In the `getMin` example, we just create a single variable called `min`, so the space complexity is 1. In other algorithms, if we have to use an auxiliary array as big as the input, then the space complexity would be `n`.
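To make the contrast concrete, here is a small sketch (an assumed example, not code from the book) of two functions: one that uses constant extra memory and one that copies the input into an auxiliary array.

[source, js]
----
// O(1) extra space: only one extra variable, regardless of the input size.
function sum(array) {
  let total = 0;               // 1 extra variable
  for (const value of array) {
    total += value;
  }
  return total;
}

// O(n) extra space: builds an auxiliary array as big as the input.
function doubleAll(array) {
  const doubled = [];          // grows to n elements
  for (const value of array) {
    doubled.push(value * 2);
  }
  return doubled;
}
----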
=== Simplifying Complexity with Asymptotic Analysis
When we compare algorithms, we don't want to deal with complex expressions. What would you prefer: comparing two algorithms as "3n^2^ + 7n" vs. "1000n + 2000", or as "n^2^" vs. "n"? Well, that's when asymptotic analysis comes to the rescue.
Asymptotic analysis describes the behavior of functions as their inputs approach infinity.
In the previous example, we analyzed `getMin` with an array of size 3. What happens when the size is 10, 10k, or a million?
.Operations performed by an algorithm with a time complexity of 3n+3
[cols=",,",options="header",]
|===========================
// ... (smaller input sizes omitted in this excerpt) ...
|1M |3(1M)+3 |3,000,003
|===========================
As the input size `n` grows bigger and bigger, the expression _3n + 3_ can be approximated as _3n_, or even just _n_, without losing much. Dropping terms might look like a stretch at first, but you will see that what matters most is the highest-order term of the function rather than the lesser terms and constants. There's a notation called *Big O*, where O refers to the *order of the function*.
If you have a program whose run time is
_7n^3^ + 3n^2^ + 5_
You can safely say that its run time is _n^3^_. The other terms will become less and less significant as the input grows bigger.
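For example, when _n = 1,000_, the leading term _7n^3^_ accounts for 7,000,000,000 operations, while _3n^2^ + 5_ adds only 3,000,005 more, less than 0.05% of the total. The bigger `n` gets, the more negligible the smaller terms become.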
=== What is Big O Notation anyways?
Big O notation only cares about the “biggest” terms in the time/space complexity. It combines what we learned about time and space complexity with asymptotic analysis, and it adds a worst-case scenario.
.All algorithms have three scenarios:
* Best-case scenario: the most favorable input arrangement, where the program takes the fewest operations to complete. E.g., an already sorted array is beneficial for some sorting algorithms.
* Average-case scenario: the most common kind of input. E.g., array items in random order for a sorting algorithm.
* Worst-case scenario: the inputs are arranged in a way that causes the program to take the longest to complete. E.g., array items in reverse order make some sorting algorithms take the longest to run. The sketch after this list illustrates the three scenarios.
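As a concrete illustration (an assumed example, not code from the book), consider a simple linear search. Its best, average, and worst cases depend entirely on where, or whether, the target appears in the array.

[source, js]
----
// Linear search: scan the array until the value is found.
function indexOf(array, value) {
  for (let i = 0; i < array.length; i++) {
    if (array[i] === value) return i; // best case: found at index 0 => 1 comparison
  }
  return -1; // worst case: not present => n comparisons
}

indexOf([3, 2, 9], 3); // best case: 1 comparison
indexOf([3, 2, 9], 7); // worst case: 3 comparisons, then -1
// On average (value equally likely anywhere), about n/2 comparisons.
----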
To sum up:
TIP: Big O only cares about the highest order of the run time function and the worst-case scenario.
WARNING: Don't drop terms that are multiplying other terms. _O(n log n)_ is not equivalent to _O(n)_. However, _O(n + log n)_ is.
There are many common notations: linear _O(n)_, as we saw in the `getMin` example; polynomial _O(n^2^)_; constant _O(1)_; and many more that we are going to explore in the next chapter.
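As a quick preview (assumed examples, not the next chapter's code), here are tiny functions that exhibit two of those run times.

[source, js]
----
// O(1), constant: the work doesn't depend on the input size.
const getFirst = array => array[0];

// O(n^2), quadratic: nested loops over the same input.
function hasDuplicates(array) {
  for (let i = 0; i < array.length; i++) {
    for (let j = i + 1; j < array.length; j++) {
      if (array[i] === array[j]) return true;
    }
  }
  return false;
}
----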
Again, time complexity is not a direct measure of how long a program takes to execute, but rather of how many operations it performs given the input size. Nevertheless, there's a relationship between time complexity and clock time, as we can see in the following table.
.How long an algorithm takes to run based on its time complexity and input size
[cols=",,,,,,",options="header",]
|===
// ... (smaller input sizes omitted in this excerpt) ...
|1M |< 1 sec. |1 second |20 seconds |12 days |∞ |∞
|===
This is just an illustration; on different hardware, the times will be slightly different.
NOTE: These times assume a 1 GHz CPU that can execute, on average, one instruction per nanosecond (it usually takes more time). Also, bear in mind that each line might translate into dozens of CPU instructions, depending on the programming language. Regardless, bad algorithms would perform poorly even on a supercomputer.
== Summary
In this chapter, we learned how you can measure your algorithm's performance using time complexity. Rather than timing how long your program takes to run, you can approximate the number of operations it will perform based on the input size.
We learned about time and space complexity and how they can be translated to Big O notation. Big O refers to the *order* of the function.
In the next section, we are going to provide examples of each of the most common time complexities!
The AVL tree builds on top of a <<Binary Search Tree>> and keeps it balanced on insertions. It prevents the BST worst-case scenario: when the tree is totally unbalanced to one side (similar to a linked list), it takes O(n) to find an element instead of O(log n).
// book/chapters/colophon.adoc
All rights reserved.
For online information and ordering of this and other books, please visit https://adrianmejia.com. The publisher offers discounts on this book when ordered in quantity; for more information contact sales@adrianmejia.com.
No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, or otherwise, without the prior written permission of the publisher.
While every precaution has been taken in the preparation of this book, the publisher and author assume no responsibility for errors or omissions, or for damages resulting from the use of the information contained herein.
// book/chapters/preface.adoc
[preface]
= Preface
This book is intended for programmers who want to go deeper into understanding the most common data structures and algorithms.
Even though you can use them without knowing how they work, it's handy to know when to use one over the other. This book gives you tools for analyzing trade-offs. When something is slow, you will know how to analyze the code and improve its performance.
The concepts in this book can be applied to any programming language. However, instead of writing the examples in pseudocode, we are going to use JavaScript to implement them. JavaScript is the lingua franca of the web, and nowadays its usage is growing in the backend, IoT, and other areas.
The following admonitions are used to highlight content:
IMPORTANT: Rewords essential concepts. Good for memorizing, tweeting, and sharing.