
Minor correction in readme directions

Jason Antic 6 years ago
parent commit e4a64b16ab
1 changed file with 1 addition and 1 deletion
README.md +1 -1

@@ -149,7 +149,7 @@ Except the generator is a **pretrained U-Net**, and I've just modified it to hav
 This is also very straightforward – it's just one to one generator/critic iterations and higher critic learning rate. This is modified to incorporate a "threshold" critic loss that makes sure that the critic is "caught up" before moving on to generator training.  This is particularly useful for the "NoGAN" method described below.
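To make the described schedule concrete, here is a minimal PyTorch-style sketch of one-to-one generator/critic iterations with a "threshold" critic loss. All names (`generator`, `critic`, `gen_opt`, `critic_opt`, the loss functions, and the threshold value) are hypothetical placeholders, not DeOldify's actual code or hyperparameters.

```python
import torch

# Illustrative values only; not the repo's real settings.
CRITIC_LR_MULT = 5.0          # critic_opt is assumed to be built with a higher LR,
                              # e.g. Adam(critic.parameters(), lr=CRITIC_LR_MULT * base_lr)
CRITIC_LOSS_THRESHOLD = 0.7   # keep training the critic until it is "caught up"

def train_gan(generator, critic, data_loader, gen_opt, critic_opt,
              gen_loss_fn, critic_loss_fn, epochs=1):
    for _ in range(epochs):
        for real_color, grayscale in data_loader:
            # Critic phase: repeat critic updates until its loss drops
            # under the threshold, so it is "caught up" with the generator.
            while True:
                critic_opt.zero_grad()
                with torch.no_grad():
                    fake_color = generator(grayscale)
                c_loss = critic_loss_fn(critic(real_color), critic(fake_color))
                c_loss.backward()
                critic_opt.step()
                if c_loss.item() < CRITIC_LOSS_THRESHOLD:
                    break
            # Generator phase: a single generator step per critic catch-up phase
            # (the "one to one" iteration scheme described above).
            gen_opt.zero_grad()
            fake_color = generator(grayscale)
            g_loss = gen_loss_fn(fake_color, real_color, critic(fake_color))
            g_loss.backward()
            gen_opt.step()
```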
 
 #### **NoGAN**
-There's no paper here! This is a new type of GAN training that I've developed to solve some key problems in the previous DeOldify model. The gist is that you get the benefits of GAN training while spending minimal time doing direct GAN training.  More details are at the bottom of the readme (it's a doozy).
+There's no paper here! This is a new type of GAN training that I've developed to solve some key problems in the previous DeOldify model. The gist is that you get the benefits of GAN training while spending minimal time doing direct GAN training.  More details are in the "What is NoGAN???" section of the readme (it's a doozy).
 
#### **Generator Loss**
 Loss during NoGAN learning is two parts:  One is a basic Perceptual Loss (or Feature Loss) based on VGG16 – this just biases the generator model to replicate the input image.  The second is the loss score from the critic.  For the curious – Perceptual Loss isn't sufficient by itself to produce good results.  It tends to just encourage a bunch of brown/green/blue – you know, cheating to the test, basically, which neural networks are really good at doing!  Key thing to realize here is that GANs essentially are learning the loss function for you – which is really one big step closer toward the ideal that we're shooting for in machine learning.  And of course you generally get much better results when you get the machine to learn something you were previously hand coding.  That's certainly the case here.
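The following is a rough sketch of that two-part generator loss, assuming PyTorch/torchvision. The VGG16 layer cut-off, the L1 feature comparison, the WGAN-style critic score, and the `adv_weight` value are all illustrative assumptions, not DeOldify's actual implementation.

```python
import torch
import torch.nn.functional as F
from torchvision.models import vgg16

# Frozen VGG16 feature extractor (up to relu4_3) used for the perceptual loss.
_vgg_features = vgg16(pretrained=True).features[:23].eval()
for p in _vgg_features.parameters():
    p.requires_grad_(False)

def perceptual_loss(fake, real):
    # Feature (perceptual) loss: compare VGG16 activations of the
    # generator output against the target color image.
    return F.l1_loss(_vgg_features(fake), _vgg_features(real))

def generator_loss(fake, real, critic_score_on_fake, adv_weight=0.1):
    # Part 1: perceptual loss biases the generator toward replicating the input.
    # Part 2: the critic's score pushes the output toward realistic color
    # (assumes a WGAN-style critic where a higher score means "more real").
    adv_loss = -critic_score_on_fake.mean()
    return perceptual_loss(fake, real) + adv_weight * adv_loss
```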
 Loss during NoGAN learning is two parts:  One is a basic Perceptual Loss (or Feature Loss) based on VGG16 – this just biases the generator model to replicate the input image.  The second is the loss score from the critic.  For the curious – Perceptual Loss isn't sufficient by itself to produce good results.  It tends to just encourage a bunch of brown/green/blue – you know, cheating to the test, basically, which neural networks are really good at doing!  Key thing to realize here is that GANs essentially are learning the loss function for you – which is really one big step closer to toward the ideal that we're shooting for in machine learning.  And of course you generally get much better results when you get the machine to learn something you were previously hand coding.  That's certainly the case here.