GAN Lecture 2
Conditional Generation by GAN

Algorithm
In each training iteration:

Learning D:
- Sample m positive examples {(c^1, x^1), (c^2, x^2), …, (c^m, x^m)} from the database
- Sample m noise samples {z^1, z^2, …, z^m} from a distribution
- Obtain generated data {x̃^1, x̃^2, …, x̃^m}, where x̃^i = G(c^i, z^i)
- Sample m objects {x̂^1, x̂^2, …, x̂^m} from the database
- Update discriminator parameters θ_d to maximize

$$\tilde{V} = \frac{1}{m}\sum_{i=1}^{m} \log D(c^i, x^i) + \frac{1}{m}\sum_{i=1}^{m} \log\left(1 - D(c^i, \tilde{x}^i)\right) + \frac{1}{m}\sum_{i=1}^{m} \log\left(1 - D(c^i, \hat{x}^i)\right)$$

$$\theta_d \leftarrow \theta_d + \eta \nabla \tilde{V}(\theta_d)$$
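The three-term discriminator objective above can be sketched in code with toy stand-ins for D and G (the sigmoid scorer and linear generator below are hypothetical placeholders, not the lecture's networks):

```python
# Minimal sketch of the conditional-GAN discriminator objective,
# assuming toy scalar stand-ins for D and G.
import math

def D(c, x):
    # Hypothetical discriminator: a sigmoid over a linear score of
    # (condition, object). A real D would be a learned network.
    return 1.0 / (1.0 + math.exp(-(0.5 * c + 0.3 * x)))

def G(c, z):
    # Hypothetical generator: maps (condition, noise) to an object.
    return 0.8 * c + z

def discriminator_objective(pairs, noise, mismatched):
    """Ṽ = (1/m)Σ log D(c,x) + (1/m)Σ log(1−D(c,x̃)) + (1/m)Σ log(1−D(c,x̂))."""
    m = len(pairs)
    fake = [G(c, z) for (c, _), z in zip(pairs, noise)]
    t1 = sum(math.log(D(c, x)) for c, x in pairs) / m
    t2 = sum(math.log(1 - D(c, xt)) for (c, _), xt in zip(pairs, fake)) / m
    t3 = sum(math.log(1 - D(c, xh)) for (c, _), xh in zip(pairs, mismatched)) / m
    return t1 + t2 + t3
```

The first term rewards correctly matched (condition, object) pairs; the second penalizes generated objects; the third penalizes real objects paired with the wrong condition, which is what forces D to check the condition at all.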
Learning G:
- Sample m noise samples {z^1, z^2, …, z^m} from a distribution
- Sample m conditions {c^1, c^2, …, c^m} from the database
- Update generator parameters θ_g to maximize

$$\tilde{V} = \frac{1}{m}\sum_{i=1}^{m} \log D\left(c^i, G(c^i, z^i)\right), \qquad \theta_g \leftarrow \theta_g + \eta \nabla \tilde{V}(\theta_g)$$
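The generator step can be sketched the same way, assuming a hypothetical one-parameter generator G(c, z) = θ_g·c + z and the same toy sigmoid discriminator; the gradient is estimated by finite differences here purely for illustration (a real implementation would backpropagate):

```python
# Minimal sketch of the conditional-GAN generator update
# θg ← θg + η∇Ṽ(θg), with toy scalar stand-ins for D and G.
import math

def D(c, x):
    # Hypothetical discriminator (placeholder for a learned network).
    return 1.0 / (1.0 + math.exp(-(0.5 * c + 0.3 * x)))

def generator_objective(theta_g, conds, noise):
    """Ṽ = (1/m)Σ log D(c^i, G(c^i, z^i)), with G(c, z) = theta_g*c + z."""
    m = len(conds)
    return sum(math.log(D(c, theta_g * c + z)) for c, z in zip(conds, noise)) / m

def generator_step(theta_g, conds, noise, eta=0.1, eps=1e-5):
    # Gradient ascent on Ṽ; the gradient is a central finite difference.
    grad = (generator_objective(theta_g + eps, conds, noise)
            - generator_objective(theta_g - eps, conds, noise)) / (2 * eps)
    return theta_g + eta * grad
```

Note that G only sees conditions and noise in this phase; the real pairs are used exclusively for the discriminator update.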

The second network architecture is the recommended one.
Reference: StackGAN

Reference: PatchGAN
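PatchGAN's core idea is that the discriminator scores each local patch of the image rather than emitting a single real/fake score for the whole image. A minimal sketch of that idea (the patch size and scoring function below are placeholders, not the paper's convolutional architecture):

```python
# Sketch of the PatchGAN idea: score an image patch by patch, then
# aggregate the per-patch scores. A real PatchGAN produces the patch
# scores with a convolutional network; score_fn is a stand-in.
def patch_scores(image, patch, score_fn):
    """Slide a non-overlapping patch×patch window over a 2-D list of
    pixels and score each patch with score_fn."""
    h, w = len(image), len(image[0])
    scores = []
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            block = [row[j:j + patch] for row in image[i:i + patch]]
            scores.append(score_fn(block))
    return scores

def mean_patch_score(image, patch, score_fn):
    # Aggregate the per-patch judgments into one discriminator output.
    s = patch_scores(image, patch, score_fn)
    return sum(s) / len(s)
```

Because each score depends only on a local patch, the discriminator has far fewer parameters and pushes the generator toward locally realistic texture.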


Example: GitHub