Problem with the backward mask in the language model cost function #9

@wugh

Description

Hi,

In laber.py, line 243:

                lmcost_bw_mask = tf.sequence_mask(sentence_lengths, maxlen=tf.shape(target_ids)[1])[:,:-1]

This mask has an off-by-one issue. For example:

    origin_seq:      1 2 3 4 0 0 0
    origin_mask:     1 1 1 1 0 0 0
    lmcost_bw_mask:  1 1 1 1 0 0

The correct lmcost_bw_mask here should be: 1 1 1 0 0 0
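A small NumPy sketch reproducing the example above (NumPy stands in for `tf.sequence_mask` here, so it runs without TensorFlow). Slicing `[:, :-1]` only shortens the mask; it still covers 4 positions. Assuming the backward-LM targets are aligned so the last real token has no target, one way to get the expected mask is to drop the *first* column instead:

```python
import numpy as np

def sequence_mask(lengths, maxlen):
    # NumPy equivalent of tf.sequence_mask: row i is 1 for the
    # first lengths[i] positions and 0 afterwards.
    return (np.arange(maxlen)[None, :] < np.asarray(lengths)[:, None]).astype(int)

sentence_lengths = [4]      # "1 2 3 4 0 0 0" has 4 real tokens
maxlen = 7

full_mask = sequence_mask(sentence_lengths, maxlen)
print(full_mask[0])         # [1 1 1 1 0 0 0]

# Current code: mask[:, :-1] drops the LAST column, but the mask
# still covers 4 positions.
current = full_mask[:, :-1]
print(current[0])           # [1 1 1 1 0 0]

# Dropping the FIRST column covers sentence_length - 1 positions,
# matching the expected backward mask.
proposed = full_mask[:, 1:]
print(proposed[0])          # [1 1 1 0 0 0]
```

Equivalently, building the mask from `sentence_lengths - 1` with a maxlen one shorter would give the same result.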
