The "reduction" argument in tf.keras.losses

回首憶惘然 2023-07-05 15:44:19
According to the documentation, the Reduction argument has 3 values: SUM_OVER_BATCH_SIZE, SUM and NONE.

y_true = [[0., 2.], [0., 0.]]
y_pred = [[3., 1.], [2., 5.]]

mae = tf.keras.losses.MeanAbsoluteError(reduction=tf.keras.losses.Reduction.SUM)
mae(y_true, y_pred).numpy()
> 5.5

mae = tf.keras.losses.MeanAbsoluteError()
mae(y_true, y_pred).numpy()
> 2.75

After some experimenting, the computation I can infer is:

With REDUCTION = SUM:
Loss = sum over all samples of (sum of absolute differences between the y_pred and y_true vectors of a sample / number of elements in that sample's y_true)
     = { (abs(3-0) + abs(1-2))/2 } + { (abs(2-0) + abs(5-0))/2 }
     = {4/2} + {7/2}
     = 5.5

With REDUCTION = SUM_OVER_BATCH_SIZE:
Loss = [sum over all samples of (sum of absolute differences between the y_pred and y_true vectors of a sample / number of elements in that sample's y_true)] / batch size (i.e. number of samples)
     = [ { (abs(3-0) + abs(1-2))/2 } + { (abs(2-0) + abs(5-0))/2 } ] / 2
     = [ {4/2} + {7/2} ] / 2
     = 5.5 / 2
     = 2.75

So SUM_OVER_BATCH_SIZE is nothing but SUM / batch_size. Then why is it called SUM_OVER_BATCH_SIZE? In effect, SUM adds up the losses over the whole batch, while SUM_OVER_BATCH_SIZE computes the average loss of the batch. Is my assumption about how SUM and SUM_OVER_BATCH_SIZE work correct?
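For reference, a minimal sketch (assuming TensorFlow 2.x eager execution) of what the third value, Reduction.NONE, returns for the same data; it exposes the per-sample averages 4/2 and 7/2 used in the arithmetic above:

import tensorflow as tf

y_true = [[0., 2.], [0., 0.]]
y_pred = [[3., 1.], [2., 5.]]

# Reduction.NONE keeps one loss value per sample (the mean over the last axis),
# so the per-sample MAEs appear directly instead of being summed or averaged.
mae_none = tf.keras.losses.MeanAbsoluteError(
    reduction=tf.keras.losses.Reduction.NONE)
mae_none(y_true, y_pred).numpy()
> [2.  3.5]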

1 Answer

犯罪嫌疑人X


As far as I understand, your assumption is correct.


If you check lines 260-269 of [keras/losses_utils.py][1] on GitHub, you will see that it does exactly what you expect: SUM sums the losses over the batch dimension, and SUM_OVER_BATCH_SIZE divides that SUM by the total number of losses (the batch size).


def reduce_weighted_loss(weighted_losses,
                         reduction=ReductionV2.SUM_OVER_BATCH_SIZE):
  if reduction == ReductionV2.NONE:
    loss = weighted_losses
  else:
    loss = tf.reduce_sum(weighted_losses)
    if reduction == ReductionV2.SUM_OVER_BATCH_SIZE:
      loss = _safe_mean(loss, _num_elements(weighted_losses))
  return loss
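To connect that excerpt to the numbers in the question, here is a small standalone sketch of the same two reductions applied to the per-sample MAE values (2.0 and 3.5). It mirrors the logic above with plain TensorFlow ops rather than calling the internal Keras helpers:

import tensorflow as tf

# Per-sample MAE values for the question's example
# ([[0,2],[0,0]] vs [[3,1],[2,5]]), i.e. what the reduction step receives.
per_sample_losses = tf.constant([2.0, 3.5])

loss_sum = tf.reduce_sum(per_sample_losses)            # SUM                 -> 5.5
num = tf.cast(tf.size(per_sample_losses), tf.float32)  # number of losses    -> 2.0
loss_mean = loss_sum / num                             # SUM_OVER_BATCH_SIZE -> 2.75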

You can do a simple check of the previous example just by adding one more pair of outputs whose loss is zero.


y_true = [[0., 2.], [0., 0.], [1., 1.]]
y_pred = [[3., 1.], [2., 5.], [1., 1.]]

mae = tf.keras.losses.MeanAbsoluteError(reduction=tf.keras.losses.Reduction.SUM)
mae(y_true, y_pred).numpy()
> 5.5

mae = tf.keras.losses.MeanAbsoluteError()
mae(y_true, y_pred).numpy()
> 1.8333

So, your assumption is correct.

[1]: https://github.com/keras-team/keras/blob/v2.7.0/keras/utils/losses_utils.py#L25-L84
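As a final sanity check of the "SUM / batch_size" relationship, a minimal sketch (again assuming TF 2.x) comparing the two built-in reductions on the three-sample example above:

import tensorflow as tf

y_true = [[0., 2.], [0., 0.], [1., 1.]]
y_pred = [[3., 1.], [2., 5.], [1., 1.]]

mae_sum = tf.keras.losses.MeanAbsoluteError(
    reduction=tf.keras.losses.Reduction.SUM)
mae_mean = tf.keras.losses.MeanAbsoluteError()  # default: SUM_OVER_BATCH_SIZE

total = mae_sum(y_true, y_pred).numpy()   # 5.5 (the zero-loss sample adds nothing)
mean = mae_mean(y_true, y_pred).numpy()   # 5.5 / 3 = 1.8333...

# SUM_OVER_BATCH_SIZE is just SUM divided by the number of samples.
assert abs(total / len(y_true) - mean) < 1e-6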

