

Shapes not aligned in gradient descent

牛魔王的故事 2022-06-02 11:59:57
I'm trying to compute gradient descent through a function, and my code raises an error.

def gradient_descent(X, y, theta, alpha, num_iters):
    m = len(y)
    cost_history = np.zeros(num_iters)
    theta_history = np.zeros((num_iters, 2))
    for i in range(num_iters):
        prediction = np.reshape(np.dot(np.transpose(theta), X), 97)
        theta = theta - (1/m) * alpha * (X.T.dot((prediction - y)))
        theta_history[i, :] = theta.T
        J_history[i] = cal_cost(theta, X, y)
    return theta, J_history
    """
    Args
    ----
    X (numpy mxn array) - The example inputs, first column is expected
        to be all 1's.
    y (numpy m array) - A vector of the correct outputs of length m
    theta (numpy nx1 array) - An array of the set of theta parameters
        to evaluate
    alpha (float) - The learning rate to use for the iterative gradient
        descent
    num_iters (int) - The number of gradient descent iterations to perform

    Returns
    -------
    theta (numpy nx1 array) - The final theta parameters discovered after
        our gradient descent.
    J_history (numpy num_iters x 1 array) - A history of the calculated
        cost for each iteration of our descent.
    """

These are the arguments and variables I pass to the function:

theta = np.zeros((2, 1))
iterations = 1500
alpha = 0.01

theta, J = gradient_descent(X, y, theta, alpha, iterations)

The error message is:

ValueError: shapes (97,2) and (97,) not aligned: 2 (dim 1) != 97 (dim 0)
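The shapes in that message point to a dot product between a (97, 2) array and a (97,) vector; for reference, the same mismatch can be reproduced in isolation with placeholder arrays (not the actual data from the question):

import numpy as np

# Placeholder arrays with the shapes named in the error message.
A = np.zeros((97, 2))
v = np.zeros(97)

# np.dot needs A's last dimension (2) to match v's length (97), so this raises:
# ValueError: shapes (97,2) and (97,) not aligned: 2 (dim 1) != 97 (dim 0)
A.dot(v)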

1 Answer

大話西游666


I'm not sure where exactly you're getting the ValueError, but an ndarray with shape (97,) needs np.expand_dims applied to it, like this:

np.expand_dims(vector, axis=-1)

That gives the vector shape (97, 1), so it should be aligned / able to be broadcast.
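Following that suggestion, here is a minimal sketch of the loop with every array kept two-dimensional, assuming X has shape (m, 2) with a leading column of ones, y starts out with shape (m,), and theta is (2, 1). The example data and the inline squared-error cost (standing in for the cal_cost helper, which isn't shown in the question) are hypothetical:

import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    # Assumes X is (m, 2) with a leading column of ones and theta is (2, 1).
    m = len(y)
    y = np.expand_dims(y, axis=-1)        # (m,) -> (m, 1), as suggested above
    J_history = np.zeros(num_iters)
    theta_history = np.zeros((num_iters, 2))
    for i in range(num_iters):
        prediction = X.dot(theta)                                # (m, 2) @ (2, 1) -> (m, 1)
        theta = theta - (alpha / m) * X.T.dot(prediction - y)    # (2, m) @ (m, 1) -> (2, 1)
        theta_history[i, :] = theta.ravel()
        # Inline squared-error cost, used here in place of the question's cal_cost helper.
        J_history[i] = (1 / (2 * m)) * np.sum((X.dot(theta) - y) ** 2)
    return theta, J_history

# Hypothetical data, only to show the shapes lining up.
m = 97
X = np.column_stack([np.ones(m), np.linspace(0, 10, m)])   # (97, 2)
y = 3.0 + 2.0 * X[:, 1]                                    # (97,)
theta, J = gradient_descent(X, y, np.zeros((2, 1)), 0.01, 1500)

With every intermediate kept as a column vector, no (97,) array ever reaches np.dot, so the "shapes not aligned" error cannot occur.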

