
PyTorch Tutorial 05 - Gradient Descen...

2023-02-15 10:18 · Author: Mr-南喬

Tutorial code in Python (NumPy version):


import numpy as np

# f = w * x  (no bias term here)

# target function: f = 2 * x
X = np.array([1, 2, 3, 4], dtype=np.float32)
Y = np.array([2, 4, 6, 8], dtype=np.float32)

# initialize the weight
w = 0.0

# model prediction (forward pass)
def forward(x):
    return w * x

# loss = MSE (Mean Squared Error)
def loss(y, y_predicted):
    return ((y_predicted - y)**2).mean()

# gradient of the loss, derived by hand:
# MSE = 1/N * (w*x - y)**2
# dJ/dw = 1/N * 2x * (w*x - y)
def gradient(x, y, y_predicted):
    return (2*x * (y_predicted - y)).mean()
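As a sanity check (my addition, not part of the tutorial), the hand-derived gradient can be compared against a central finite difference of the loss at an arbitrary test point:

```python
import numpy as np

X = np.array([1, 2, 3, 4], dtype=np.float64)  # float64 for a cleaner check
Y = np.array([2, 4, 6, 8], dtype=np.float64)
w = 0.5  # arbitrary test point

# analytic gradient of J(w) = mean((w*x - y)**2)
analytic = (2 * X * (w * X - Y)).mean()

# central finite difference: (J(w+h) - J(w-h)) / (2h)
J = lambda w_: ((w_ * X - Y) ** 2).mean()
h = 1e-5
numeric = (J(w + h) - J(w - h)) / (2 * h)

print(analytic, numeric)  # the two values agree closely
```

Because the loss is quadratic in w, the central difference is exact up to floating-point error, so any visible gap would indicate a mistake in the derivation.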


print(f'Prediction before training: f(5) = {forward(5):.3f}')


# Training
learning_rate = 0.01  # learning rate (step size)
n_iters = 20  # number of iterations
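An aside not in the tutorial: the choice of learning rate matters. With an averaged MSE gradient on this data, the update is w ← w − lr · 15 · (w − 2) (since mean(X²) = 7.5), which contracts toward w = 2 only when lr < 2/15 ≈ 0.13. A quick sketch comparing two step sizes:

```python
import numpy as np

X = np.array([1, 2, 3, 4], dtype=np.float32)
Y = np.array([2, 4, 6, 8], dtype=np.float32)

def run_gd(lr, n_iters=20):
    """Run gradient descent on f = w * x with an averaged MSE gradient."""
    w = 0.0
    for _ in range(n_iters):
        dw = (2 * X * (w * X - Y)).mean()
        w -= lr * dw
    return w

w_small = run_gd(0.01)  # small step: converges toward 2
w_large = run_gd(0.2)   # step past the 2/15 threshold: overshoots and diverges
print(w_small, w_large)
```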


for epoch in range(n_iters):
    # prediction = forward pass
    y_pred = forward(X)

    # loss
    l = loss(Y, y_pred)

    # gradients
    dw = gradient(X, Y, y_pred)

    # update rule: weight = weight - learning_rate * gradient
    w -= learning_rate * dw

    # print every epoch
    if epoch % 1 == 0:
        print(f'epoch {epoch+1}: w = {w:.3f}, loss = {l:.8f}')


print(f'Prediction after training: f(5) = {forward(5):.3f}')
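For this one-parameter model there is also a closed-form answer to compare against (my addition, not in the tutorial): setting dJ/dw = 0 gives w* = (X·Y)/(X·X), the least-squares solution that gradient descent is approaching.

```python
import numpy as np

X = np.array([1, 2, 3, 4], dtype=np.float32)
Y = np.array([2, 4, 6, 8], dtype=np.float32)

# dJ/dw = 0  =>  w* = (X . Y) / (X . X)
w_star = np.dot(X, Y) / np.dot(X, X)
print(w_star)      # 2.0
print(5 * w_star)  # f(5) = 10.0, the prediction the training loop approaches
```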
