@@ -7,20 +7,20 @@
 
 <meta name="twitter:card" content="summary"/>
 <meta name="twitter:image:src" content="https://avatars1.githubusercontent.com/u/64068543?s=400&v=4"/>
-<meta name="twitter:title" content="Distilling the Knowledge in a Neural Network)"/>
+<meta name="twitter:title" content="Distilling the Knowledge in a Neural Network"/>
 <meta name="twitter:description" content=""/>
 <meta name="twitter:site" content="@labmlai"/>
 <meta name="twitter:creator" content="@labmlai"/>
 
 <meta property="og:url" content="https://nn.labml.ai/distillation/readme.html"/>
-<meta property="og:title" content="Distilling the Knowledge in a Neural Network)"/>
+<meta property="og:title" content="Distilling the Knowledge in a Neural Network"/>
 <meta property="og:image" content="https://avatars1.githubusercontent.com/u/64068543?s=400&v=4"/>
 <meta property="og:site_name" content="LabML Neural Networks"/>
 <meta property="og:type" content="object"/>
-<meta property="og:title" content="Distilling the Knowledge in a Neural Network)"/>
+<meta property="og:title" content="Distilling the Knowledge in a Neural Network"/>
 <meta property="og:description" content=""/>
 
-<title>Distilling the Knowledge in a Neural Network)</title>
+<title>Distilling the Knowledge in a Neural Network</title>
 <link rel="shortcut icon" href="/icon.png"/>
 <link rel="stylesheet" href="../pylit.css">
 <link rel="canonical" href="https://nn.labml.ai/distillation/readme.html"/>
@@ -66,7 +66,7 @@
 <div class='section-link'>
 <a href='#section-0'>#</a>
 </div>
-<h1><a href="(https://nn.labml.ai/distillation/index.html)">Distilling the Knowledge in a Neural Network</a></h1>
+<h1><a href="https://nn.labml.ai/distillation/index.html">Distilling the Knowledge in a Neural Network</a></h1>
 <p>This is a <a href="https://pytorch.org">PyTorch</a> implementation/tutorial of the paper
 <a href="https://papers.labml.ai/paper/1503.02531">Distilling the Knowledge in a Neural Network</a>.</p>
 <p>It’s a way of training a small network using the knowledge in a trained larger network;
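The last context line in the diff describes knowledge distillation: training a small student network using the knowledge of a trained larger teacher. A minimal sketch of the soft-target loss from the Hinton et al. (2015) paper the page links to, in plain Python (illustrative only, not code from this repository; the function names and temperature value are my own):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution,
    # exposing more of the teacher's "dark knowledge" about wrong classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 so gradients stay comparable across temperatures,
    # as suggested in the paper.
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student soft predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * T * T
```

In practice this term is combined with the ordinary cross-entropy on the true labels; the loss is zero when the student exactly matches the teacher's logits.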