
Commit 90c713a

committed: update html
1 parent 86356e4 commit 90c713a

File tree

319 files changed (+10564, -14873 lines)


docs/.doctrees/environment.pickle (1.83 KB): Binary file not shown.

docs/.doctrees/intro.doctree (42 Bytes): Binary file not shown.

docs/.doctrees/nbsphinx/Module3_IntroducingNumpy/AutoDiff.ipynb

Lines changed: 67 additions & 67 deletions
@@ -2,7 +2,7 @@
 "cells": [
 {
 "cell_type": "raw",
- "id": "5f3c51d6",
+ "id": "d29bfcfc",
 "metadata": {
 "raw_mimetype": "text/restructuredtext"
 },
@@ -13,7 +13,7 @@
 },
 {
 "cell_type": "markdown",
- "id": "fa519b0e",
+ "id": "8371f87e",
 "metadata": {},
 "source": [
 "<div class=\"alert alert-info\">\n",
@@ -27,15 +27,15 @@
 },
 {
 "cell_type": "markdown",
- "id": "b36fff16",
+ "id": "18124dc8",
 "metadata": {},
 "source": [
 "# Automatic Differentiation"
 ]
 },
 {
 "cell_type": "markdown",
- "id": "6a5e2c13",
+ "id": "52b02dcc",
 "metadata": {
 "lines_to_next_cell": 2
 },
@@ -73,15 +73,15 @@
 },
 {
 "cell_type": "markdown",
- "id": "f49ecb98",
+ "id": "391e0110",
 "metadata": {},
 "source": [
 "## Introduction to MyGrad"
 ]
 },
 {
 "cell_type": "markdown",
- "id": "208bef8f",
+ "id": "a402adf5",
 "metadata": {},
 "source": [
 "\n",
@@ -146,23 +146,23 @@
 },
 {
 "cell_type": "markdown",
- "id": "9159288c",
+ "id": "3989dfa1",
 "metadata": {},
 "source": [
 "It is important to reiterate that MyGrad *never gives us the actual function* $\\frac{\\mathrm{d}f}{\\mathrm{d}x}$; it only computes the derivative evaluated at a specific input $x=10$."
 ]
 },
 {
 "cell_type": "markdown",
- "id": "8e505337",
+ "id": "0eb133b2",
 "metadata": {},
 "source": [
 "### MyGrad Adds \"Drop-In\" AutoDiff to NumPy\n"
 ]
 },
 {
 "cell_type": "markdown",
- "id": "175e840c",
+ "id": "0abe8175",
 "metadata": {},
 "source": [
 "\n",
@@ -186,7 +186,7 @@
 },
 {
 "cell_type": "markdown",
- "id": "e397f16c",
+ "id": "7f737971",
 "metadata": {
 "lines_to_next_cell": 2
 },
@@ -218,15 +218,15 @@
 },
 {
 "cell_type": "markdown",
- "id": "fdaebad6",
+ "id": "7352acee",
 "metadata": {},
 "source": [
 "## Vectorized Auto-Differentiation"
 ]
 },
 {
 "cell_type": "markdown",
- "id": "ac75b5a9",
+ "id": "f2614315",
 "metadata": {},
 "source": [
 "Like NumPy's array, MyGrad's tensor supports [vectorized operations](https://www.pythonlikeyoumeanit.com/Module3_IntroducingNumpy/VectorizedOperations.html), allowing us to evaluate the derivative of a function at multiple points simultaneously.\n",
@@ -273,15 +273,15 @@
 },
 {
 "cell_type": "markdown",
- "id": "4ba915d9",
+ "id": "da65df83",
 "metadata": {},
 "source": [
 "<div class=\"alert alert-info\">"
 ]
 },
 {
 "cell_type": "markdown",
- "id": "769f3243",
+ "id": "0a704f14",
 "metadata": {},
 "source": [
 "## Visualizing the Derivative\n",
@@ -296,13 +296,13 @@
 {
 "cell_type": "code",
 "execution_count": 1,
- "id": "ea11710e",
+ "id": "3962a57c",
 "metadata": {
 "execution": {
- "iopub.execute_input": "2022-01-30T01:36:28.789638Z",
- "iopub.status.busy": "2022-01-30T01:36:28.787641Z",
- "iopub.status.idle": "2022-01-30T01:36:30.944058Z",
- "shell.execute_reply": "2022-01-30T01:36:30.944058Z"
+ "iopub.execute_input": "2022-01-30T17:58:48.993757Z",
+ "iopub.status.busy": "2022-01-30T17:58:48.990755Z",
+ "iopub.status.idle": "2022-01-30T17:58:51.023711Z",
+ "shell.execute_reply": "2022-01-30T17:58:51.022685Z"
 }
 },
 "outputs": [
@@ -353,15 +353,15 @@
 },
 {
 "cell_type": "markdown",
- "id": "32f3b8ef",
+ "id": "d9b0203f",
 "metadata": {},
 "source": [
 "## Seek and Derive"
 ]
 },
 {
 "cell_type": "markdown",
- "id": "e0fa9dee",
+ "id": "5811b985",
 "metadata": {},
 "source": [
 "Computers equipped with automatic differentiation libraries can make short work of derivatives that are well-beyond the reach of mere mortals.\n",
@@ -397,7 +397,7 @@
 },
 {
 "cell_type": "markdown",
- "id": "74da8249",
+ "id": "549bf18c",
 "metadata": {},
 "source": [
 "<div class=\"alert alert-info\">\n",
@@ -418,7 +418,7 @@
 },
 {
 "cell_type": "markdown",
- "id": "9af7bf0d",
+ "id": "b00caa03",
 "metadata": {
 "lines_to_next_cell": 0
 },
@@ -428,7 +428,7 @@
 },
 {
 "cell_type": "markdown",
- "id": "e4cd5906",
+ "id": "184b9ffc",
 "metadata": {},
 "source": [
 "\n",
@@ -447,7 +447,7 @@
 "Let's take a simple example.\n",
 "We'll choose the function $f(x) = (x-8)^2$ and the starting point $x=-1.5$.\n",
 "As we search for $x_\mathrm{min}$ we don't want to make our updates to $x_o$\n",
- "too big, so we will scale our updates by a factor of $3/10$ (which is somewhat haphazardly here).\n",
+ "too big, so we will scale our updates by a factor of $3/10$ (the value of which is chosen somewhat haphazardly here).\n",
 "\n",
 "```python\n",
 "# Performing gradient descent on f(x) = (x - 8) ** 2\n",
@@ -491,15 +491,15 @@
 },
 {
 "cell_type": "markdown",
- "id": "44c65efd",
+ "id": "3188cd27",
 "metadata": {},
 "source": [
 "## Reading Comprehension Exercise Solutions"
 ]
 },
 {
 "cell_type": "markdown",
- "id": "e2167a40",
+ "id": "92d9cc26",
 "metadata": {},
 "source": [
 "**Auto-differentiation: Solution**"
@@ -508,13 +508,13 @@
 {
 "cell_type": "code",
 "execution_count": 2,
- "id": "15448e32",
+ "id": "4ec64b05",
 "metadata": {
 "execution": {
- "iopub.execute_input": "2022-01-30T01:36:30.952052Z",
- "iopub.status.busy": "2022-01-30T01:36:30.951053Z",
- "iopub.status.idle": "2022-01-30T01:36:30.960051Z",
- "shell.execute_reply": "2022-01-30T01:36:30.960051Z"
+ "iopub.execute_input": "2022-01-30T17:58:51.032677Z",
+ "iopub.status.busy": "2022-01-30T17:58:51.031680Z",
+ "iopub.status.idle": "2022-01-30T17:58:51.054848Z",
+ "shell.execute_reply": "2022-01-30T17:58:51.053809Z"
 }
 },
 "outputs": [
@@ -548,13 +548,13 @@
 {
 "cell_type": "code",
 "execution_count": 3,
- "id": "79c175e2",
+ "id": "a150ad66",
 "metadata": {
 "execution": {
- "iopub.execute_input": "2022-01-30T01:36:31.019052Z",
- "iopub.status.busy": "2022-01-30T01:36:30.968050Z",
- "iopub.status.idle": "2022-01-30T01:36:31.213057Z",
- "shell.execute_reply": "2022-01-30T01:36:31.214051Z"
+ "iopub.execute_input": "2022-01-30T17:58:51.128812Z",
+ "iopub.status.busy": "2022-01-30T17:58:51.125813Z",
+ "iopub.status.idle": "2022-01-30T17:58:51.310845Z",
+ "shell.execute_reply": "2022-01-30T17:58:51.311817Z"
 }
 },
 "outputs": [
@@ -578,13 +578,13 @@
 {
 "cell_type": "code",
 "execution_count": 4,
- "id": "d6e7e5a1",
+ "id": "602820b0",
 "metadata": {
 "execution": {
- "iopub.execute_input": "2022-01-30T01:36:31.225052Z",
- "iopub.status.busy": "2022-01-30T01:36:31.221052Z",
- "iopub.status.idle": "2022-01-30T01:36:31.229053Z",
- "shell.execute_reply": "2022-01-30T01:36:31.230053Z"
+ "iopub.execute_input": "2022-01-30T17:58:51.318815Z",
+ "iopub.status.busy": "2022-01-30T17:58:51.317817Z",
+ "iopub.status.idle": "2022-01-30T17:58:51.326845Z",
+ "shell.execute_reply": "2022-01-30T17:58:51.327816Z"
 }
 },
 "outputs": [
@@ -618,13 +618,13 @@
 {
 "cell_type": "code",
 "execution_count": 5,
- "id": "d3e2ab46",
+ "id": "eaa3d402",
 "metadata": {
 "execution": {
- "iopub.execute_input": "2022-01-30T01:36:31.236056Z",
- "iopub.status.busy": "2022-01-30T01:36:31.235085Z",
- "iopub.status.idle": "2022-01-30T01:36:31.455059Z",
- "shell.execute_reply": "2022-01-30T01:36:31.453052Z"
+ "iopub.execute_input": "2022-01-30T17:58:51.334817Z",
+ "iopub.status.busy": "2022-01-30T17:58:51.332813Z",
+ "iopub.status.idle": "2022-01-30T17:58:51.567820Z",
+ "shell.execute_reply": "2022-01-30T17:58:51.570825Z"
 }
 },
 "outputs": [
@@ -648,13 +648,13 @@
 {
 "cell_type": "code",
 "execution_count": 6,
- "id": "8de8439e",
+ "id": "887cd4b1",
 "metadata": {
 "execution": {
- "iopub.execute_input": "2022-01-30T01:36:31.468053Z",
- "iopub.status.busy": "2022-01-30T01:36:31.467052Z",
- "iopub.status.idle": "2022-01-30T01:36:31.482078Z",
- "shell.execute_reply": "2022-01-30T01:36:31.483106Z"
+ "iopub.execute_input": "2022-01-30T17:58:51.588820Z",
+ "iopub.status.busy": "2022-01-30T17:58:51.587824Z",
+ "iopub.status.idle": "2022-01-30T17:58:51.621816Z",
+ "shell.execute_reply": "2022-01-30T17:58:51.623839Z"
 }
 },
 "outputs": [
@@ -688,13 +688,13 @@
 {
 "cell_type": "code",
 "execution_count": 7,
- "id": "c3eec605",
+ "id": "693a1ee1",
 "metadata": {
 "execution": {
- "iopub.execute_input": "2022-01-30T01:36:31.511078Z",
- "iopub.status.busy": "2022-01-30T01:36:31.510073Z",
- "iopub.status.idle": "2022-01-30T01:36:31.703106Z",
- "shell.execute_reply": "2022-01-30T01:36:31.703106Z"
+ "iopub.execute_input": "2022-01-30T17:58:51.649818Z",
+ "iopub.status.busy": "2022-01-30T17:58:51.646839Z",
+ "iopub.status.idle": "2022-01-30T17:58:52.483426Z",
+ "shell.execute_reply": "2022-01-30T17:58:52.484422Z"
 }
 },
 "outputs": [
@@ -718,13 +718,13 @@
 {
 "cell_type": "code",
 "execution_count": 8,
- "id": "2ce01084",
+ "id": "8859b9f9",
 "metadata": {
 "execution": {
- "iopub.execute_input": "2022-01-30T01:36:31.710075Z",
- "iopub.status.busy": "2022-01-30T01:36:31.709073Z",
- "iopub.status.idle": "2022-01-30T01:36:31.719076Z",
- "shell.execute_reply": "2022-01-30T01:36:31.719076Z"
+ "iopub.execute_input": "2022-01-30T17:58:52.495446Z",
+ "iopub.status.busy": "2022-01-30T17:58:52.492440Z",
+ "iopub.status.idle": "2022-01-30T17:58:52.516422Z",
+ "shell.execute_reply": "2022-01-30T17:58:52.515423Z"
 }
 },
 "outputs": [
@@ -758,13 +758,13 @@
 {
 "cell_type": "code",
 "execution_count": 9,
- "id": "701c3f29",
+ "id": "6eb1cf09",
 "metadata": {
 "execution": {
- "iopub.execute_input": "2022-01-30T01:36:31.734075Z",
- "iopub.status.busy": "2022-01-30T01:36:31.733075Z",
- "iopub.status.idle": "2022-01-30T01:36:31.961828Z",
- "shell.execute_reply": "2022-01-30T01:36:31.962830Z"
+ "iopub.execute_input": "2022-01-30T17:58:52.528432Z",
+ "iopub.status.busy": "2022-01-30T17:58:52.526426Z",
+ "iopub.status.idle": "2022-01-30T17:58:53.033072Z",
+ "shell.execute_reply": "2022-01-30T17:58:53.032071Z"
 }
 },
 "outputs": [

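Note: the notebook diff above is mechanical (regenerated cell IDs and execution timestamps), but the cells it touches document MyGrad's point-wise autodiff, e.g. the unchanged line stating that the derivative is only ever evaluated at a specific input (x = 10). A minimal sketch of that usage, assuming MyGrad's Tensor/backward/grad API and a hypothetical choice of f(x) = x**2 made only for this illustration:

# Sketch (not part of this commit): evaluating df/dx at a single input with MyGrad.
import mygrad as mg

x = mg.Tensor(10.0)   # the specific input at which df/dx is evaluated
f = x ** 2            # forward pass records the computational graph
f.backward()          # back-propagation fills in x.grad
print(x.grad)         # -> 20.0, i.e. df/dx = 2x evaluated at x = 10

Calling backward() on f populates .grad on each tensor in f's graph with the derivative of f with respect to that tensor at its current value; no symbolic expression for df/dx is ever produced.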
docs/Module3_IntroducingNumpy/AutoDiff.html

Lines changed: 1 addition & 1 deletion
@@ -594,7 +594,7 @@ <h2>Applying Automatic Differentiation: Solving Optimization Problems<a class="h
 <p>How does automatic differentiation help us to solve such a problem? The derivative of a function evaluated at some <span class="math notranslate nohighlight">\(x_o\)</span> tells us the slope of the function – whether it is decreasing or increasing – at <span class="math notranslate nohighlight">\(x_o\)</span>. This is certainly useful information for helping us search for <span class="math notranslate nohighlight">\(x_\mathrm{min}\)</span>: always look in the direction of decreasing slope, until the slope goes to <span class="math notranslate nohighlight">\(0\)</span>.</p>
 <p>We start our search for <span class="math notranslate nohighlight">\(x_{\mathrm{min}}\)</span> by picking a random starting for value for <span class="math notranslate nohighlight">\(x_o\)</span>, use the autodiff library to compute <span class="math notranslate nohighlight">\(\frac{\mathrm{d}f}{\mathrm{d}x}\big|_{x=x_{o}}\)</span> and then use that information to “step” <span class="math notranslate nohighlight">\(x_o\)</span> in the direction that “descends” <span class="math notranslate nohighlight">\(f(x)\)</span>. We repeat this process until we see that <span class="math notranslate nohighlight">\(\frac{\mathrm{d}f}{\mathrm{d}x}\big|_{x=x_{o}} \approx 0\)</span>. It must be noted that this approach towards finding <span class="math notranslate nohighlight">\(x_\mathrm{min}\)</span> is highly limited;
 saddle-points can stop us in our tracks, and we will only be able to find <em>local</em> minima with this strategy. Nonetheless, it is still very useful!</p>
- <p>Let’s take a simple example. We’ll choose the function <span class="math notranslate nohighlight">\(f(x) = (x-8)^2\)</span> and the starting point <span class="math notranslate nohighlight">\(x=-1.5\)</span>. As we search for <span class="math notranslate nohighlight">\(x_\mathrm{min}\)</span> we don’t want to make our updates to <span class="math notranslate nohighlight">\(x_o\)</span> too big, so we will scale our updates by a factor of <span class="math notranslate nohighlight">\(3/10\)</span> (which is somewhat haphazardly here).</p>
+ <p>Let’s take a simple example. We’ll choose the function <span class="math notranslate nohighlight">\(f(x) = (x-8)^2\)</span> and the starting point <span class="math notranslate nohighlight">\(x=-1.5\)</span>. As we search for <span class="math notranslate nohighlight">\(x_\mathrm{min}\)</span> we don’t want to make our updates to <span class="math notranslate nohighlight">\(x_o\)</span> too big, so we will scale our updates by a factor of <span class="math notranslate nohighlight">\(3/10\)</span> (the value of which is chosen somewhat haphazardly here).</p>
 <div class="highlight-python notranslate"><div class="highlight"><pre><span></span><span class="c1"># Performing gradient descent on f(x) = (x - 8) ** 2</span>
 <span class="n">x</span> <span class="o">=</span> <span class="n">mg</span><span class="o">.</span><span class="n">Tensor</span><span class="p">(</span><span class="o">-</span><span class="mf">1.5</span><span class="p">)</span>
 <span class="n">step_scale</span> <span class="o">=</span> <span class="mf">0.3</span>
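The only wording change in this commit is the sentence about the 3/10 step scale, and it introduces the gradient-descent snippet whose first lines (x = mg.Tensor(-1.5) and step_scale = 0.3) appear as context at the end of the diff above. A sketch of how such a loop plausibly proceeds; only those first two lines come from the diff, while the loop body and the choice of ten steps are illustrative assumptions, not part of the commit:

# Sketch of the gradient-descent example that the edited sentence introduces.
import mygrad as mg

x = mg.Tensor(-1.5)        # starting point x_o
step_scale = 0.3           # the 3/10 factor discussed in the changed sentence
num_steps = 10             # assumed number of descent steps

for _ in range(num_steps):
    f = (x - 8.0) ** 2     # evaluate f at the current x_o
    f.backward()           # compute df/dx at x_o
    x = x - step_scale * x.grad   # step x_o against the slope
print(x)                   # should approach x_min = 8

Each iteration re-evaluates f at the current x, back-propagates to obtain df/dx there, and nudges x opposite the slope, so x drifts toward the minimum at x = 8.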
