

Implementing multiple linear regression with gradient descent in Python

Views: 2 | Date: 2022-08-01 13:00:20

This article shows how to implement multiple linear regression with gradient descent in Python, for your reference; the details are as follows.
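Before training, the script below z-score normalizes every column of the dataset. As a minimal standalone sketch of just that step (the DataFrame and its values here are invented for illustration; the real data comes from the author's CSV with columns AT, V, AP, RH, PE):

```python
import pandas as pd

# Hypothetical miniature stand-in for the CSV used below; values are made up.
df = pd.DataFrame({'AT': [10.0, 20.0, 30.0, 40.0],
                   'PE': [480.0, 460.0, 440.0, 420.0]})

# Z-score normalization, column by column: (x - mean) / std
for col in df.columns:
    df[col] = (df[col] - df[col].mean()) / df[col].std()

# Each column now has mean 0 and (sample) standard deviation 1
print(df.AT.mean(), df.AT.std())
```

Normalizing all features to a common scale is what lets a single learning rate work for every parameter in the descent loop below.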

Illustration: (two figures omitted; their captions repeat the article title)

import pandas as pd
import matplotlib.pylab as plt
import numpy as np

# Read data from csv (the backslashes in the original path were lost in
# publishing; this split of the path is a reconstruction)
pga = pd.read_csv(r'D:\python3\data\Test.csv')

# Normalize each column: (x - mean) / std
pga.AT = (pga.AT - pga.AT.mean()) / pga.AT.std()
pga.V = (pga.V - pga.V.mean()) / pga.V.std()
pga.AP = (pga.AP - pga.AP.mean()) / pga.AP.std()
pga.RH = (pga.RH - pga.RH.mean()) / pga.RH.std()
pga.PE = (pga.PE - pga.PE.mean()) / pga.PE.std()

def cost(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y):
    # Initialize cost
    J = 0
    # The number of observations
    m = len(x1)
    # Loop through each observation
    for i in range(m):
        # Compute the hypothesis
        h = theta0 + x1[i]*theta1 + x2[i]*theta2 + x3[i]*theta3 + x4[i]*theta4
        # Add to cost
        J += (h - y[i])**2
    # Average and normalize cost
    J /= (2*m)
    return J

# Partial derivative of cost with respect to theta4
def partial_cost_theta4(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y):
    h = theta0 + x1*theta1 + x2*theta2 + x3*theta3 + x4*theta4
    diff = (h - y) * x4
    partial = diff.sum() / x2.shape[0]
    return partial

# Partial derivative of cost with respect to theta3
def partial_cost_theta3(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y):
    h = theta0 + x1*theta1 + x2*theta2 + x3*theta3 + x4*theta4
    diff = (h - y) * x3
    partial = diff.sum() / x2.shape[0]
    return partial

# Partial derivative of cost with respect to theta2
def partial_cost_theta2(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y):
    h = theta0 + x1*theta1 + x2*theta2 + x3*theta3 + x4*theta4
    diff = (h - y) * x2
    partial = diff.sum() / x2.shape[0]
    return partial

# Partial derivative of cost with respect to theta1
def partial_cost_theta1(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y):
    h = theta0 + x1*theta1 + x2*theta2 + x3*theta3 + x4*theta4
    diff = (h - y) * x1
    partial = diff.sum() / x2.shape[0]
    return partial

# Partial derivative of cost with respect to theta0
def partial_cost_theta0(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y):
    h = theta0 + x1*theta1 + x2*theta2 + x3*theta3 + x4*theta4
    diff = (h - y)
    partial = diff.sum() / x2.shape[0]
    return partial

def gradient_descent(x1, x2, x3, x4, y, alpha=0.1,
                     theta0=0, theta1=0, theta2=0, theta3=0, theta4=0):
    max_epochs = 1000  # Maximum number of iterations
    counter = 0        # Current iteration
    # Initial cost
    c = cost(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y)
    costs = [c]  # Record the cost after every update
    # Set a convergence threshold: when the difference between the previous
    # cost and the current cost falls below it, the parameters have converged
    # and we can stop
    convergence_thres = 0.000001
    cprev = c + 10
    theta0s = [theta0]
    theta1s = [theta1]
    theta2s = [theta2]
    theta3s = [theta3]
    theta4s = [theta4]
    # Stop updating once the costs converge or we hit the iteration limit
    while (np.abs(cprev - c) > convergence_thres) and (counter < max_epochs):
        cprev = c
        # Alpha times the partial derivative is the update step
        update0 = alpha * partial_cost_theta0(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y)
        update1 = alpha * partial_cost_theta1(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y)
        update2 = alpha * partial_cost_theta2(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y)
        update3 = alpha * partial_cost_theta3(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y)
        update4 = alpha * partial_cost_theta4(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y)
        # Update all thetas at the same time: we want the slopes computed at
        # the same set of hypothesised parameters, so apply the updates only
        # after all partial derivatives are known
        # -= is gradient descent, += would be gradient ascent
        theta0 -= update0
        theta1 -= update1
        theta2 -= update2
        theta3 -= update3
        theta4 -= update4
        # Store thetas
        theta0s.append(theta0)
        theta1s.append(theta1)
        theta2s.append(theta2)
        theta3s.append(theta3)
        theta4s.append(theta4)
        # Compute the new cost with the updated parameters
        c = cost(theta0, theta1, theta2, theta3, theta4, x1, x2, x3, x4, y)
        # Record the current cost
        costs.append(c)
        counter += 1  # Count
    # Return the recorded costs (the thetas could be returned as well):
    # return {'theta0': theta0, 'theta1': theta1, 'theta2': theta2,
    #         'theta3': theta3, 'theta4': theta4, 'costs': costs}
    return {'costs': costs}

print('costs =', gradient_descent(pga.AT, pga.V, pga.AP, pga.RH, pga.PE)['costs'])
descend = gradient_descent(pga.AT, pga.V, pga.AP, pga.RH, pga.PE, alpha=.01)
plt.scatter(range(len(descend['costs'])), descend['costs'])
plt.show()

Plot of the loss function against the number of iterations: (figure omitted)

That is all for this article. I hope it helps with your studies, and please continue to support 好吧啦網(wǎng).

Tags: Python, programming