From 70dcea0030d433f236479ac248f5db132c5a478c Mon Sep 17 00:00:00 2001
From: RichardScottOZ <72196131+RichardScottOZ@users.noreply.github.com>
Date: Thu, 11 Mar 2021 19:49:35 +1030
Subject: [PATCH] Update vae-talk.ipynb

minor edits
---
 vae-talk.ipynb | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/vae-talk.ipynb b/vae-talk.ipynb
index 74b23d5..2335919 100644
--- a/vae-talk.ipynb
+++ b/vae-talk.ipynb
@@ -11,7 +11,7 @@ "cell_type": "markdown",
   "metadata": {},
   "source": [
-   "What are autoecnoders?\n",
+   "What are autoencoders?\n",
    "![title](img/autoencoder_schema.jpg)\n",
    "\"Autoencoding\" is a data compression algorithm where the compression and decompression functions are 1) data-specific, 2) lossy, and 3) learned automatically from examples rather than engineered by a human. Additionally, in almost all contexts where the term \"autoencoder\" is used, the compression and decompression functions are implemented with neural networks."
  ]
 },
@@ -57,7 +57,7 @@
   "1. data denoising, \n",
   "2. non linear dimensionality reduction for data visualization. With appropriate dimensionality and sparsity constraints,autoencoders can learn data projections that are more interesting than PCA or other basic techniques.\n",
   "3. Reinforcement Learning\n",
-   "4. Seemi-Supervised Learning, we have lots of unlabeled data and some labeled data. "
+   "4. Semi-Supervised Learning, we have lots of unlabeled data and some labeled data. "
  ]
 },
 {
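The patched cell describes autoencoding as compression and decompression functions that are data-specific, lossy, and learned from examples. That idea can be sketched as a minimal linear autoencoder in NumPy; this is illustrative only (it is not code from vae-talk.ipynb, and all variable names here are hypothetical):

```python
import numpy as np

# Minimal linear autoencoder sketch: 4-D inputs are compressed to a
# 2-D code (lossy) and decompressed back.  The encoder/decoder weights
# are learned automatically from examples (data-specific) by gradient
# descent on the reconstruction error.
rng = np.random.default_rng(0)

X = rng.normal(size=(256, 4))                 # toy training examples
W_enc = rng.normal(scale=0.1, size=(4, 2))    # compression: 4 -> 2
W_dec = rng.normal(scale=0.1, size=(2, 4))    # decompression: 2 -> 4

def reconstruction_loss(X, W_enc, W_dec):
    X_hat = X @ W_enc @ W_dec                 # encode, then decode
    return np.mean((X - X_hat) ** 2)          # mean squared error

initial = reconstruction_loss(X, W_enc, W_dec)
lr = 0.05
for _ in range(500):
    code = X @ W_enc                          # compressed representation
    X_hat = code @ W_dec                      # reconstruction
    err = X_hat - X                           # error signal
    # Gradients of the squared error w.r.t. each weight matrix
    # (up to a constant factor absorbed into the learning rate).
    grad_dec = code.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_enc -= lr * grad_enc
    W_dec -= lr * grad_dec

final = reconstruction_loss(X, W_enc, W_dec)
print(initial, final)
```

Because the 2-D code cannot carry all the information in a 4-D input, the reconstruction stays imperfect — the compression is lossy, exactly as the notebook text states; training only drives the error down, not to zero. Replacing the linear maps with nonlinear neural-network layers gives the autoencoders the notebook discusses.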