

Implement gradient_check_n(): the result doesn't match the expected output

# GRADED FUNCTION: gradient_check_n
# Note: dictionary_to_vector, vector_to_dictionary, gradients_to_vector and
# forward_propagation_n are helpers provided by the assignment notebook.
import numpy as np

def gradient_check_n(parameters, gradients, X, Y, epsilon=1e-7):
    """
    Checks whether backward_propagation_n correctly computes the gradient of the cost
    output by forward_propagation_n

    Arguments:
    parameters -- python dictionary containing your parameters "W1", "b1", "W2", "b2", "W3", "b3"
    gradients -- output of backward_propagation_n, contains gradients of the cost with respect to the parameters
    X -- input datapoint, of shape (input size, 1)
    Y -- true "label"
    epsilon -- tiny shift to the input to compute the approximated gradient with formula (1)

    Returns:
    difference -- difference (2) between the approximated gradient and the backward propagation gradient
    """

    # Set-up variables
    parameters_values, _ = dictionary_to_vector(parameters)
    grad = gradients_to_vector(gradients)
    num_parameters = parameters_values.shape[0]
    J_plus = np.zeros((num_parameters, 1))
    J_minus = np.zeros((num_parameters, 1))
    gradapprox = np.zeros((num_parameters, 1))

    # Compute gradapprox
    for i in range(num_parameters):

        # Compute J_plus[i]. Inputs: "parameters_values, epsilon". Output = "J_plus[i]".
        # "_" is used because the function you call outputs two values but we only care about the first one
        ### START CODE HERE ### (approx. 3 lines)
        thetaplus = np.copy(parameters_values)                                        # Step 1
        thetaplus[i][0] = thetaplus[i][0] + epsilon                                   # Step 2
        J_plus[i], _ = forward_propagation_n(X, Y, vector_to_dictionary(thetaplus))   # Step 3
        ### END CODE HERE ###

        # Compute J_minus[i]. Inputs: "parameters_values, epsilon". Output = "J_minus[i]".
        ### START CODE HERE ### (approx. 3 lines)
        thetaminus = np.copy(parameters_values)                                       # Step 1
        thetaminus[i][0] = thetaplus[i][0] - epsilon                                  # Step 2
        J_minus[i], _ = forward_propagation_n(X, Y, vector_to_dictionary(thetaminus)) # Step 3
        ### END CODE HERE ###

        # Compute gradapprox[i]
        ### START CODE HERE ### (approx. 1 line)
        gradapprox[i] = (J_plus[i] - J_minus[i]) / (2 * epsilon)
        ### END CODE HERE ###

    # Compare gradapprox to backward propagation gradients by computing the difference.
    ### START CODE HERE ### (approx. 1 line)
    numerator = np.linalg.norm(gradapprox - grad)                                     # Step 1'
    denominator = np.linalg.norm(grad) + np.linalg.norm(gradapprox)                   # Step 2'
    difference = numerator / denominator                                              # Step 3'
    ### END CODE HERE ###

    if difference > 1e-7:
        print("\033[93m" + "There is a mistake in the backward propagation! difference = " + str(difference) + "\033[0m")
    else:
        print("\033[92m" + "Your backward propagation works perfectly fine! difference = " + str(difference) + "\033[0m")

    return difference
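The unexpected difference almost certainly comes from Step 2 of the J_minus block above: thetaminus[i][0] is derived from thetaplus[i][0], which at that point already holds θᵢ + ε, so thetaminus[i][0] ends up equal to the original θᵢ rather than θᵢ − ε. The loop therefore computes (J(θ+ε) − J(θ)) / (2ε), a one-sided difference over half the intended interval, which is roughly half the true derivative, so gradapprox disagrees with grad and the check reports a mistake. A corrected sketch of that block, using the same names as the code above:

    # Corrected J_minus block: perturb the ORIGINAL parameter vector,
    # not the already-shifted thetaplus.
    thetaminus = np.copy(parameters_values)                                       # Step 1
    thetaminus[i][0] = thetaminus[i][0] - epsilon                                 # Step 2 (was: thetaplus[i][0] - epsilon)
    J_minus[i], _ = forward_propagation_n(X, Y, vector_to_dictionary(thetaminus)) # Step 3

To see the effect in isolation, here is a minimal self-contained sketch on a hypothetical toy cost function (not part of the assignment) comparing the correct centered difference against the buggy one:

    import numpy as np

    # Toy cost J(theta) = theta ** 2, whose true derivative at theta = 3 is 6.
    theta, epsilon = 3.0, 1e-7

    # Correct centered difference: shift the original theta in both directions.
    thetaplus = theta + epsilon
    thetaminus = theta - epsilon
    print((thetaplus ** 2 - thetaminus ** 2) / (2 * epsilon))        # ~6.0

    # The posted code's bug: thetaminus derived from thetaplus,
    # so thetaminus == (theta + epsilon) - epsilon == theta.
    thetaminus_buggy = thetaplus - epsilon
    print((thetaplus ** 2 - thetaminus_buggy ** 2) / (2 * epsilon))  # ~3.0, half the true derivative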

1 Answer

慕移動1217771

Has contributed 1 post and received 0 upvotes

Hi, are you still around? I wanted to ask whether you ever solved this. I've just reached this exercise and got the same result as you, and no matter how I look I can't find where the problem is.

Replied 2019-02-05
  • 1 answer
  • 0 followers
  • 2,586 views