Two figures for understanding the RNN structure:

[Figure: the RNN computing the loss function]

Remember that our target at every time step is to predict the next character in the sequence, so our labels should look just like our inputs, but offset by one character. Let's look at corresponding inputs and outputs to make sure everything lines up as expected.

print(textify(train_data[10, :, 3]))
print(textify(train_label[10, :, 3]))

te, but the twisted crystalline bars lay unfinished upon the
ben
e, but the twisted crystalline bars lay unfinished upon the
benc
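The one-character offset above can be reproduced with plain Python slicing. This is a minimal sketch (the `text` string here is just an illustrative snippet, not the tutorial's actual dataset): the label sequence is the input sequence shifted one position ahead, so at time step t the model sees character t and must predict character t+1.

```python
text = "te, but the twisted crystalline bars"

inputs = text[:-1]  # every character except the last
labels = text[1:]   # every character except the first

# Each label is exactly the character that follows the
# corresponding input character in the original text.
assert all(labels[i] == text[i + 1] for i in range(len(inputs)))

print(inputs[:10])  # te, but th
print(labels[:10])  # e, but the
```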

Averaging the loss over the sequence

def average_ce_loss(outputs, labels):
    # outputs and labels are lists with one entry per time step.
    assert len(outputs) == len(labels)
    total_loss = 0.
    for (output, label) in zip(outputs, labels):
        total_loss = total_loss + cross_entropy(output, label)
    return total_loss / len(outputs)
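To see the averaging in action without the rest of the training pipeline, here is a self-contained sketch. The `cross_entropy` below is an assumed NumPy stand-in for the tutorial's earlier definition (mean negative log-likelihood of the true class over the batch); the tiny two-step, three-character example is invented for illustration.

```python
import numpy as np

def cross_entropy(yhat, y):
    # Assumed stand-in: yhat is a (batch, vocab) array of predicted
    # probabilities, y holds the true class index for each batch row.
    return -np.mean(np.log(yhat[np.arange(len(y)), y]))

def average_ce_loss(outputs, labels):
    # One (output, label) pair per time step; average the per-step losses.
    assert len(outputs) == len(labels)
    total_loss = 0.
    for (output, label) in zip(outputs, labels):
        total_loss = total_loss + cross_entropy(output, label)
    return total_loss / len(outputs)

# Two time steps, batch of one, vocabulary of three characters.
outputs = [np.array([[0.7, 0.2, 0.1]]),
           np.array([[0.1, 0.8, 0.1]])]
labels = [np.array([0]), np.array([1])]
print(average_ce_loss(outputs, labels))  # (-log 0.7 - log 0.8) / 2
```

Averaging over the sequence length (rather than summing) keeps the loss comparable across sequences of different lengths.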
