GAN Lecture 2


Conditional Generation by GAN

李宏毅 (Hung-yi Lee) GAN Study Notes (02)

Algorithm

In each training iteration:

Learning D

  • Sample m positive examples $\{(c^1, x^1), (c^2, x^2), \dots, (c^m, x^m)\}$ from the database
  • Sample m noise samples $\{z^1, z^2, \dots, z^m\}$ from a distribution
  • Obtain generated data $\{\tilde{x}^1, \tilde{x}^2, \dots, \tilde{x}^m\}$, where $\tilde{x}^i = G(c^i, z^i)$
  • Sample m objects $\{\hat{x}^1, \hat{x}^2, \dots, \hat{x}^m\}$ from the database (real images, but mismatched with the conditions)
  • Update discriminator parameters $\theta_d$ to maximize
    • $\tilde{V} = \frac{1}{m}\sum_{i=1}^m \log D(c^i, x^i) + \frac{1}{m}\sum_{i=1}^m \log\big(1 - D(c^i, \tilde{x}^i)\big) + \frac{1}{m}\sum_{i=1}^m \log\big(1 - D(c^i, \hat{x}^i)\big)$
    • $\theta_d \leftarrow \theta_d + \eta \nabla \tilde{V}(\theta_d)$

Learning G

  • Sample m noise samples $\{z^1, z^2, \dots, z^m\}$ from a distribution
  • Sample m conditions $\{c^1, c^2, \dots, c^m\}$ from a database
  • Update generator parameters $\theta_g$ to maximize
    • $\tilde{V} = \frac{1}{m}\sum_{i=1}^m \log\big(D(G(c^i, z^i))\big)$, $\theta_g \leftarrow \theta_g + \eta \nabla \tilde{V}(\theta_g)$
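The two update steps above can be sketched as one training iteration in PyTorch. This is a minimal sketch only: the MLP architectures, dimensions, and learning rate are assumptions for illustration, not taken from the lecture.

```python
# Minimal sketch of one conditional-GAN training iteration.
# All architectures and dimensions here are toy assumptions.
import torch
import torch.nn as nn

m, c_dim, z_dim, x_dim = 8, 4, 16, 32  # batch size and dims (assumed)

# G maps (c, z) -> x; D maps (c, x) -> probability in (0, 1)
G = nn.Sequential(nn.Linear(c_dim + z_dim, 64), nn.ReLU(), nn.Linear(64, x_dim))
D = nn.Sequential(nn.Linear(c_dim + x_dim, 64), nn.ReLU(), nn.Linear(64, 1), nn.Sigmoid())

opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
eps = 1e-8  # numerical safety inside log

# --- Learning D ---
c, x = torch.randn(m, c_dim), torch.randn(m, x_dim)  # positive pairs (c^i, x^i)
z = torch.randn(m, z_dim)                            # noise samples z^i
x_tilde = G(torch.cat([c, z], dim=1))                # generated x~^i = G(c^i, z^i)
x_hat = torch.randn(m, x_dim)                        # real objects x^^i, mismatched with c

V_d = (torch.log(D(torch.cat([c, x], dim=1)) + eps).mean()
       + torch.log(1 - D(torch.cat([c, x_tilde.detach()], dim=1)) + eps).mean()
       + torch.log(1 - D(torch.cat([c, x_hat], dim=1)) + eps).mean())
opt_d.zero_grad()
(-V_d).backward()   # gradient *ascent* on V~, so minimize -V~
opt_d.step()

# --- Learning G ---
z = torch.randn(m, z_dim)
V_g = torch.log(D(torch.cat([c, G(torch.cat([c, z], dim=1))], dim=1)) + eps).mean()
opt_g.zero_grad()
(-V_g).backward()
opt_g.step()
```

Note the `detach()` in the D step: the generator's output is treated as a constant there, so only $\theta_d$ receives gradients; the G step then backpropagates through both networks but only updates $\theta_g$.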


Of the two discriminator architectures shown in the lecture, the second is recommended: instead of feeding one network the concatenation of c and x directly, it first embeds x on its own, so the discriminator can separately judge whether x is realistic and whether x matches the condition c.
Reference: StackGAN
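A conditional discriminator of the recommended kind, which embeds x first and only then combines it with the condition c, can be sketched as follows. The class name, layer sizes, and dimensions are assumptions for illustration, not from StackGAN.

```python
# Hedged sketch of a conditional discriminator: x is embedded on its own,
# and the condition c is combined only afterwards. Dims are assumptions.
import torch
import torch.nn as nn

class CondD(nn.Module):
    def __init__(self, x_dim=32, c_dim=4, h=64):
        super().__init__()
        self.embed_x = nn.Sequential(nn.Linear(x_dim, h), nn.ReLU())   # x -> feature
        self.score = nn.Sequential(nn.Linear(h + c_dim, 1), nn.Sigmoid())

    def forward(self, c, x):
        # D(c, x) in (0, 1): the score depends on the x-feature and on c jointly
        return self.score(torch.cat([self.embed_x(x), c], dim=1))

d = CondD()
out = d(torch.randn(8, 4), torch.randn(8, 32))  # one score per (c, x) pair
```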


Reference: PatchGAN
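The PatchGAN idea is that the discriminator is fully convolutional and outputs a grid of scores, each judging one local patch of the image, rather than a single score for the whole image. A hedged sketch (layer sizes and channel counts are assumptions, not from the paper):

```python
# PatchGAN-style discriminator sketch: a small fully convolutional net
# that outputs one logit per receptive-field patch. Sizes are assumed.
import torch
import torch.nn as nn

patch_d = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=4, stride=2, padding=1), nn.LeakyReLU(0.2),
    nn.Conv2d(32, 64, kernel_size=4, stride=2, padding=1), nn.LeakyReLU(0.2),
    nn.Conv2d(64, 1, kernel_size=4, stride=1, padding=1),  # one logit per patch
)

scores = patch_d(torch.randn(1, 3, 64, 64))  # a 2D grid of patch logits
```

Because each output cell only sees a local patch, this pushes the generator toward locally sharp textures and is cheaper than a full-image discriminator on large images.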


Example: GitHub
