Commit 3174dac

Added my conclusions for now
1 parent 4dc4700 commit 3174dac

File tree

1 file changed: +18 -0 lines changed


deep-learning/Models_Algorithms_More/Which Optimizer Really Kicks Ass - Adam, SGD, RMSprop or Momentum.ipynb

@@ -1579,6 +1579,24 @@
     "plt.show()"
    ]
   },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## My observations:\n",
+    "\n",
+    "- Adam and Nadam have the lowest training loss, but SGD and RMSprop come a close second!\n",
+    "- SGD + Nesterov performs rather poorly above, though!\n",
+    "- Adam still wins hands down on the validation loss as well.\n",
+    "\n",
+    "**Concluding (for now)**\n",
+    "There's still a lot to explore, but we can see that Adam is kicking ass.\n",
+    "\n",
+    "We basically used a simple FFN (feed-forward neural network) with just three hidden layers; in the next experiment I'll try six hidden layers.\n",
+    "\n",
+    "So, stay tuned!"
+   ]
+  },
   {
    "cell_type": "code",
    "execution_count": null,
