PyTorch Tutorial 08 - Logistic Regression

2023-02-16 09:44 | Author: Mr-南喬

The tutorial's Python code is as follows:


# 1) Design model (input, output size, forward pass)

# 2) Construct loss and optimizer

# 3) Training loop
#    - forward pass: compute prediction
#    - backward pass: compute gradients
#    - update weights

import torch

import torch.nn as nn  # neural network module

import numpy as np  # for data conversion

from sklearn import datasets  # to load the binary classification dataset

from sklearn.preprocessing import StandardScaler  # to scale the features

from sklearn.model_selection import train_test_split  # to split the data into training and test sets


"""二進(jìn)制分類問題,根據(jù)輸入數(shù)據(jù)特征,預(yù)測乳腺癌"""

# 0) prepare data

bc = datasets.load_breast_cancer()  # breast cancer dataset

X, y = bc.data, bc.target


n_samples, n_features = X.shape

#print(n_samples, n_features)

"""train_test_split函數(shù)用于將矩陣隨機(jī)劃分為訓(xùn)練子集和測試子集,并返回劃分好的訓(xùn)練集測試集樣本和訓(xùn)練集測試集標(biāo)簽"""

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1234)


# scale the features (zero mean, unit variance)

sc = StandardScaler()

X_train = sc.fit_transform(X_train)

X_test = sc.transform(X_test)


X_train = torch.from_numpy(X_train.astype(np.float32))

X_test = torch.from_numpy(X_test.astype(np.float32))

y_train = torch.from_numpy(y_train.astype(np.float32))

y_test = torch.from_numpy(y_test.astype(np.float32))


# reshape y_train and y_test into column vectors

y_train = y_train.view(y_train.shape[0], 1)  # shape (n_samples, 1), matching the model output

y_test = y_test.view(y_test.shape[0], 1)
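
The reshape above matters because nn.BCELoss expects the target tensor to have the same shape as the model output, i.e. (n_samples, 1) rather than a flat (n_samples,) vector. A quick shape check (an illustrative sketch; the exact sample count depends on the 80/20 split of the 569-sample dataset):

# sanity-check the tensor shapes before training (illustrative sketch)
print(X_train.shape)  # e.g. torch.Size([455, 30])
print(y_train.shape)  # e.g. torch.Size([455, 1]) -- a column vector matching the model output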


# 1) model

"""f = wx + b, sigmoid at the end"""

class LogisticRegression(nn.Module):

    def __init__(self, n_input_features):
        super(LogisticRegression, self).__init__()
        self.linear = nn.Linear(n_input_features, 1)  # a single output value

    # forward pass: linear layer followed by the sigmoid activation
    def forward(self, x):
        y_predicted = torch.sigmoid(self.linear(x))
        return y_predicted


model = LogisticRegression(n_features)
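
Even before training, the model maps each row of 30 scaled features to a value in (0, 1) because of the sigmoid. A minimal sketch (not part of the original tutorial) to confirm the output shape and range:

# illustrative check: the untrained model already outputs probabilities in (0, 1)
with torch.no_grad():
    sample_prob = model(X_train[:1])  # shape (1, 1)
    print(sample_prob.item())         # some value strictly between 0 and 1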


# 2) loss and optimizer

learning_rate = 0.01

criterion = nn.BCELoss()  # binary cross-entropy loss

optimizer = torch.optim.SGD(model.parameters(), lr=learning_rate)
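
nn.BCELoss is the right criterion here because the model already applies torch.sigmoid in forward and therefore outputs probabilities. A common alternative (a sketch, not what this tutorial does) is to return the raw linear output and use nn.BCEWithLogitsLoss, which applies the sigmoid internally and is more numerically stable:

# alternative sketch: if forward() returned self.linear(x) directly (raw logits),
# the matching loss would be:
criterion_logits = nn.BCEWithLogitsLoss()  # sigmoid + BCE in one step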


# 3) training loop

num_epoch = 500

for epoch in range(num_epoch):
    # forward pass and loss
    y_predicted = model(X_train)
    loss = criterion(y_predicted, y_train)

    # backward pass: compute gradients
    loss.backward()

    # update weights
    optimizer.step()

    # reset gradients to zero before the next iteration
    optimizer.zero_grad()

    if (epoch + 1) % 10 == 0:
        print(f'epoch: {epoch+1}, loss = {loss.item():.4f}')
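
The call to optimizer.zero_grad() is needed on every iteration because PyTorch accumulates gradients in the .grad attribute across backward passes. A standalone illustration (not part of the tutorial):

# without zeroing, gradients from successive backward passes add up
w = torch.ones(1, requires_grad=True)
(2 * w).sum().backward()
(2 * w).sum().backward()
print(w.grad)  # tensor([4.]) -- two gradients of 2 accumulated instead of being reset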


# evaluate the model on the test set

with torch.no_grad():
    y_predicted = model(X_test)
    y_predicted_cls = y_predicted.round()  # threshold the probabilities at 0.5
    acc = y_predicted_cls.eq(y_test).sum() / float(y_test.shape[0])
    print(f'accuracy: {acc:.4f}')
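
Finally, a small inference sketch (an addition, not from the original post) showing how the trained model would be used on a single sample; it reuses the already-scaled test tensor and the same 0.5 threshold implied by round():

# predict the class of one held-out sample (illustrative sketch)
with torch.no_grad():
    prob = model(X_test[:1]).item()  # predicted probability of the positive class
    label = 1 if prob >= 0.5 else 0  # same 0.5 cut-off as round() above
    print(f'probability: {prob:.4f}, predicted class: {label}')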
