

Andrew Ng DeepLearning.ai Generative Adversarial Networks (GANs) Specialization 〔Andre

2023-02-22 22:28 Author: 聽聽我的腦洞

Generative models

  • Variational Autoencoders: Encoder → Latent Space → Decoder
  • Generative Adversarial Networks:
  • Generator: learns to produce realistic examples
  • Discriminator: learns to distinguish between fake and real

1.6 discriminator P6 - 00:19

Discriminator

Basically, a neural network classifier.

Compare the prediction Y^\hat with the label Y.

It models P(Y|X), the probability of the class given the features.
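The classifier view above can be sketched as a tiny NumPy toy. The single linear layer, the weight names `W`, `b`, and the batch shapes are all hypothetical choices for illustration, not the course's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def discriminator(X, W, b):
    """Return Y^hat = P(real | x) for each row of X (sigmoid of a linear score)."""
    logits = X @ W + b
    return 1.0 / (1.0 + np.exp(-logits))  # sigmoid squashes scores into (0, 1)

X = rng.normal(size=(4, 8))   # batch of 4 examples with 8 features each
W = rng.normal(size=(8,))
b = 0.0
y_hat = discriminator(X, W, b)
# Each entry of y_hat is a probability, to be compared against the label Y.
```

Each output is a probability of "real", which the BCE loss later compares with the true label.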



1.7 generator P7 - 00:01

Generator

Feed noise into the generator to get X^\hat, then pass X^\hat into the discriminator to get Y^\hat_d.

Use the difference between Y^\hat_d and the real label to update the parameters of the generator.

Then save the parameters.
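The noise-to-verdict pipeline above, as a minimal sketch. The one-layer `generator` and `discriminator` and their weight shapes are hypothetical stand-ins for real networks:

```python
import numpy as np

rng = np.random.default_rng(0)

def generator(z, W_g):
    # Hypothetical one-layer generator: noise z -> fake features X^hat.
    return np.tanh(z @ W_g)

def discriminator(x, W_d):
    # Sigmoid classifier: P(real | x).
    return 1.0 / (1.0 + np.exp(-(x @ W_d)))

z = rng.normal(size=(4, 16))          # batch of noise vectors
W_g = rng.normal(size=(16, 8)) * 0.1  # generator parameters
W_d = rng.normal(size=(8,)) * 0.1     # discriminator parameters

x_hat = generator(z, W_g)             # fake examples X^hat
y_hat_d = discriminator(x_hat, W_d)   # discriminator's verdict Y^hat_d
# The gap between y_hat_d and the "real" label (1) is what the
# generator's update would shrink; afterwards the parameters are saved.
```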





The generator models P(X|Y), the probability of the features given class Y.

Since we only care about one specific class at a time, Y is fixed, so P(X|Y) = P(X) here and we can ignore Y.

1.8 bce-cost-function P8 - 00:02

BCE cost function

BCE stands for Binary Cross Entropy. The loss, broken into parts:

J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} [y^{(i)} \log h(x^{(i)}, \theta) + (1 - y^{(i)}) \log(1 - h(x^{(i)}, \theta))]

The summation from i = 1 to m averages over the entire batch, where

h is the prediction,

y is the label,

\theta are the parameters,

x are the features.
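A direct NumPy transcription of the formula. The `eps` clipping is an added guard against log(0), not part of the notes:

```python
import numpy as np

def bce_loss(y, h, eps=1e-12):
    """Binary cross entropy averaged over a batch of m examples:
    J = -(1/m) * sum_i [ y_i*log(h_i) + (1 - y_i)*log(1 - h_i) ]"""
    h = np.clip(h, eps, 1 - eps)  # keep log() finite at h = 0 or 1
    return -np.mean(y * np.log(h) + (1 - y) * np.log(1 - h))

y = np.array([1.0, 0.0, 1.0, 0.0])   # labels: real = 1, fake = 0
h = np.array([0.9, 0.1, 0.8, 0.2])   # confident, mostly-correct predictions
loss = bce_loss(y, h)                # small loss for good predictions
```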



1.8 bce-cost-function P8 - 01:54


First term: y^{(i)} \log h(x^{(i)}, \theta).

If the true y is fake, the value of y is 0, so no matter what the prediction is, the first term is 0.

If the true y is real and the prediction assigns a high probability to real, say 0.99, the first term \log(0.99) is close to 0.

However, if the prediction is close to 0 while the true y is real, the first term goes to negative infinity.

Hence, negative infinity here indicates a bad result: if the prediction is good, the term stays near 0; if the prediction is bad, it goes to -\infinity.

Second term: (1 - y^{(i)}) \log(1 - h(x^{(i)}, \theta)).

If the true y is real, the factor (1 - y) is 0 and the second term vanishes. If the true y is fake but the prediction is really bad (close to 1), \log(1 - h) goes to negative infinity.

Similarly, negative infinity indicates a bad prediction.
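A quick numeric check of both terms; the probability values are arbitrary examples:

```python
import numpy as np

y = 1.0                                # true label: real
for h in [0.99, 0.5, 1e-6]:
    print(h, y * np.log(h))            # first term: ~0 if good, -> -inf if bad

y = 0.0                                # true label: fake
for h in [0.01, 0.5, 0.999999]:
    print(h, (1 - y) * np.log(1 - h))  # second term: ~0 if good, -> -inf if bad
```

A good prediction keeps its term near 0; as the prediction moves to the wrong extreme, the corresponding log term blows up toward negative infinity.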



1.9 putting-it-all-together P9 - 00:15


For the discriminator: pass the fake X^\hat and the real X into the discriminator, then compute the BCE loss.

Update \theta_d (the parameters of the discriminator).

The discriminator wants to tell the difference between fake and real; the generator wants the fake examples to look as real as possible.
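Putting the alternating updates together as a toy 1-D GAN in NumPy. This is a sketch under strong simplifying assumptions — a linear generator a*z + b, a logistic discriminator, hand-derived BCE gradients, and a made-up "real" data distribution N(2, 0.5) — not the course's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Generator g(z) = a*z + b (theta_g); discriminator d(x) = sigmoid(w*x + c) (theta_d).
a, b = 1.0, 0.0
w, c = 0.1, 0.0
lr, m = 0.05, 64

for step in range(200):
    # --- discriminator step: BCE on real (label 1) and fake (label 0) batches
    x_real = rng.normal(2.0, 0.5, size=m)
    z = rng.normal(size=m)
    x_fake = a * z + b
    d_real, d_fake = sigmoid(w * x_real + c), sigmoid(w * x_fake + c)
    # gradient of BCE w.r.t. the discriminator logit is (prediction - label)
    gw = np.mean((d_real - 1) * x_real) + np.mean(d_fake * x_fake)
    gc = np.mean(d_real - 1) + np.mean(d_fake)
    w, c = w - lr * gw, c - lr * gc

    # --- generator step: push d(x_fake) toward the "real" label 1
    z = rng.normal(size=m)
    x_fake = a * z + b
    d_fake = sigmoid(w * x_fake + c)
    ga = np.mean((d_fake - 1) * w * z)   # chain rule through d and g
    gb = np.mean((d_fake - 1) * w)
    a, b = a - lr * ga, b - lr * gb      # "save" the updated generator parameters
```

Each iteration first updates \theta_d with both real and fake batches, then updates \theta_g against the freshly updated discriminator — the alternating scheme the notes describe.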



2.2 activations-basic-properties P12 - 00:07

Activations

Must be non-linear and differentiable.


2.3 common-activation-functions P13 - 00:25


ReLU -- the dying ReLU problem: when the input is negative, the output is always 0, so information is lost.

Leaky ReLU solves the problem:

max(az, z), with e.g. a = 0.1,

so negative inputs are not clamped to 0 but kept at a small multiple of themselves.

Sigmoid/Tanh -- vanishing gradient and saturation problems.
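The max(az, z) definition next to plain ReLU, to show what the leak preserves:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)        # dying ReLU: every negative input -> 0

def leaky_relu(z, a=0.1):
    return np.maximum(a * z, z)      # max(az, z): small negative slope survives

z = np.array([-3.0, -0.5, 0.0, 2.0])
out_relu = relu(z)          # negatives collapse to 0, information lost
out_leaky = leaky_relu(z)   # negatives scaled by a = 0.1 instead of lost
```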



2.4 batch-normalization-explained P14 - 04:11

Batch normalization reduces internal covariate shift.

It makes the network easier to train and speeds up the training process.

2.5 batch-normalization-procedure P15 - 02:43

During training, normalize with the statistics of the current batch.

At test time, normalize with fixed (running) statistics.
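A sketch of the two procedures. The momentum value and the exponential running-average update follow common practice and are assumptions here, not details from the notes:

```python
import numpy as np

def batch_norm_train(x, running_mean, running_var, gamma, beta,
                     momentum=0.9, eps=1e-5):
    # Training: normalize with the current batch's statistics and
    # update the running averages that will be frozen for test time.
    mu, var = x.mean(axis=0), x.var(axis=0)
    running_mean = momentum * running_mean + (1 - momentum) * mu
    running_var = momentum * running_var + (1 - momentum) * var
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta, running_mean, running_var

def batch_norm_test(x, running_mean, running_var, gamma, beta, eps=1e-5):
    # Test: use the fixed running statistics, not the batch's own.
    x_hat = (x - running_mean) / np.sqrt(running_var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=(32, 4))
gamma, beta = np.ones(4), np.zeros(4)
out, rm, rv = batch_norm_train(x, np.zeros(4), np.ones(4), gamma, beta)
# The normalized batch has ~zero mean and ~unit variance per feature.
```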


3.2 mode-collapse P21 - 00:40




With 10 modes for the digit classes, the generator can converge to just 1 mode. That's the problem: mode collapse.



3.3 problem-with-bce-loss P22 - 03:06

Vanishing gradients: as the discriminator gets too good, the BCE loss saturates and the generator receives almost no gradient to learn from.


3.4 earth-movers-distance P23 - 01:11



3.5 wasserstein-loss P24 - 00:03





3.6 condition-on-wasserstein-critic P25 - 00:13




3.7 1-lipschitz-continuity-enforcemen P26 - 00:19






4.2 conditional-generation-intuition P28 - 02:05



4.3 conditional-generation-inputs P29 - 02:02



4.4 controllable-generation P30 - 00:19

Controllable generation: control some of the features of the generated output ...


4.5 vector-algebra-in-the-z-space P31 - 00:49




4.6 challenges-with-controllable-gener P32 - 01:19



4.7 classifier-gradients P33 - 01:05


Take advantage of a pre-trained classifier.

4.8 disentanglement P34 - 02:24


4.8 disentanglement P34 - 04:22


