
Understanding the introductory Transformer example in Trax

HUWWW 2023-06-06 17:36:07
我的目標(biāo)是理解 Trax 中變壓器的介紹性示例,import trax# Create a Transformer model.# Pre-trained model config in gs://trax-ml/models/translation/ende_wmt32k.ginmodel = trax.models.Transformer(? ? input_vocab_size=33300,? ? d_model=512, d_ff=2048,? ? n_heads=8, n_encoder_layers=6, n_decoder_layers=6,? ? max_len=2048, mode='predict')# Initialize using pre-trained weights.model.init_from_file('gs://trax-ml/models/translation/ende_wmt32k.pkl.gz',? ? ? ? ? ? ? ? ? ? ?weights_only=True)# Tokenize a sentence.sentence = 'It is nice to learn new things today!'tokenized = list(trax.data.tokenize(iter([sentence]),? # Operates on streams.? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? vocab_dir='gs://trax-ml/vocabs/',? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? vocab_file='ende_32k.subword'))[0]# Decode from the Transformer.tokenized = tokenized[None, :]? # Add batch dimension.tokenized_translation = trax.supervised.decoding.autoregressive_sample(? ? model, tokenized, temperature=0.0)? # Higher temperature: more diverse results.# De-tokenize,tokenized_translation = tokenized_translation[0][:-1]? # Remove batch and EOS.translation = trax.data.detokenize(tokenized_translation,? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ?vocab_dir='gs://trax-ml/vocabs/',? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ?vocab_file='ende_32k.subword')print(translation)這個(gè)例子工作得很好。但是,當(dāng)我嘗試使用初始化模型翻譯另一個(gè)示例時(shí),例如sentence = 'I would like to try another example.'tokenized = list(trax.data.tokenize(iter([sentence]),? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? vocab_dir='gs://trax-ml/vocabs/',? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? ? vocab_file='ende_32k.subword'))[0]tokenized = tokenized[None, :]!我在本地機(jī)器和 Google Colab 上都得到了輸出。其他示例也會(huì)發(fā)生同樣的情況。當(dāng)我構(gòu)建并初始化一個(gè)新模型時(shí),一切正常。這是一個(gè)錯(cuò)誤嗎?如果不是,這里發(fā)生了什么,我怎樣才能避免/修復(fù)這種行為?Tokenization 和 detokenization 似乎運(yùn)行良好,我對(duì)此進(jìn)行了調(diào)試。. 中的事情似乎出了問(wèn)題/出乎意料trax.supervised.decoding.autoregressive_sample。

1 Answer

猛跑小豬

TA貢獻(xiàn)1858條經(jīng)驗(yàn) 獲得超8個(gè)贊

我自己發(fā)現(xiàn)的……需要重置模型的state. 所以下面的代碼對(duì)我有用:


import trax


def translate(model, sentence, vocab_dir, vocab_file):
    empty_state = model.state  # save empty state
    tokenized_sentence = next(trax.data.tokenize(iter([sentence]), vocab_dir=vocab_dir,
                                                 vocab_file=vocab_file))
    tokenized_translation = trax.supervised.decoding.autoregressive_sample(
        model, tokenized_sentence[None, :], temperature=0.0)[0][:-1]
    translation = trax.data.detokenize(tokenized_translation, vocab_dir=vocab_dir,
                                       vocab_file=vocab_file)
    model.state = empty_state  # reset state
    return translation


# Create a Transformer model.
# Pre-trained model config in gs://trax-ml/models/translation/ende_wmt32k.gin
model = trax.models.Transformer(input_vocab_size=33300, d_model=512, d_ff=2048, n_heads=8,
                                n_encoder_layers=6, n_decoder_layers=6, max_len=2048,
                                mode='predict')

# Initialize using pre-trained weights.
model.init_from_file('gs://trax-ml/models/translation/ende_wmt32k.pkl.gz',
                     weights_only=True)

print(translate(model, 'It is nice to learn new things today!',
                vocab_dir='gs://trax-ml/vocabs/', vocab_file='ende_32k.subword'))
print(translate(model, 'I would like to try another example.',
                vocab_dir='gs://trax-ml/vocabs/', vocab_file='ende_32k.subword'))
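To see why the reset matters without downloading the pre-trained model, here is a toy, self-contained sketch (not Trax itself; the `ToyStatefulDecoder` class is hypothetical, invented only to mimic how a model in `mode='predict'` caches decoding state across calls):

```python
class ToyStatefulDecoder:
    """Caches a decoding position across calls, like a Transformer in 'predict' mode."""

    def __init__(self):
        self.state = 0  # decoding cache; 0 means freshly initialized

    def decode(self, tokens):
        # Each output depends on the cached state, so leftover state from a
        # previous sentence corrupts the next decode.
        out = [t + self.state for t in tokens]
        self.state += len(tokens)  # the cache grows as decoding proceeds
        return out


def decode_with_reset(decoder, tokens):
    """Save the pristine state, decode, then restore it (the fix used above)."""
    empty_state = decoder.state
    result = decoder.decode(tokens)
    decoder.state = empty_state
    return result


stale = ToyStatefulDecoder()
print(stale.decode([1, 2, 3]))   # fresh state: [1, 2, 3]
print(stale.decode([1, 2, 3]))   # stale state: [4, 5, 6] -- corrupted, like the '!' output

fresh = ToyStatefulDecoder()
print(decode_with_reset(fresh, [1, 2, 3]))  # [1, 2, 3]
print(decode_with_reset(fresh, [1, 2, 3]))  # [1, 2, 3] -- state was restored
```

The same save/restore pattern is what the `translate` function above does with `model.state`.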



反對(duì) 回復(fù) 2023-06-06
1 answer · 0 followers · 161 views