http://stackoverflow.com/questions/37312421/tensorflow-whats-the-difference-between-sparse-softmax-cross-entropy-with-logi


Having two different functions is a convenience, as they produce the same result.

The difference is simple:

  • For sparse_softmax_cross_entropy_with_logits, labels must have the shape [batch_size] and the dtype int32 or int64. Each label is an int in the range [0, num_classes-1].
  • For softmax_cross_entropy_with_logits, labels must have the shape [batch_size, num_classes] and dtype float32 or float64.

The labels used in softmax_cross_entropy_with_logits are the one-hot version of the labels used in sparse_softmax_cross_entropy_with_logits.
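The equivalence can be sketched outside of TensorFlow. Below is a minimal NumPy illustration (not the TF implementation itself): both variants compute the same cross-entropy, one from integer labels, the other from their one-hot encoding.

```python
import numpy as np

def softmax_cross_entropy(logits, onehot_labels):
    # Dense variant: labels are one-hot, shape [batch_size, num_classes].
    shifted = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -(onehot_labels * log_probs).sum(axis=1)

def sparse_softmax_cross_entropy(logits, labels):
    # Sparse variant: labels are class indices, shape [batch_size].
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels]

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3]])
sparse_labels = np.array([0, 1])          # shape [batch_size], int
onehot_labels = np.eye(3)[sparse_labels]  # shape [batch_size, num_classes], float

# Both per-example losses agree.
print(sparse_softmax_cross_entropy(logits, sparse_labels))
print(softmax_cross_entropy(logits, onehot_labels))
```

The sparse variant is usually preferred when each example has exactly one correct class, since it avoids materializing the one-hot matrix; the dense variant is needed when labels are soft probability distributions.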

Another tiny difference is that with sparse_softmax_cross_entropy_with_logits, you can give -1 as a label, which yields a loss of 0 for that example.
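That -1 behavior can also be sketched in NumPy (again an illustration of the described semantics, not TF's code): examples labeled -1 are masked out and contribute a loss of 0.

```python
import numpy as np

def sparse_ce_with_ignore(logits, labels):
    # Like the sparse variant, but a label of -1 means "ignore this example".
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    safe = np.clip(labels, 0, None)  # avoid indexing with -1
    per_example = -log_probs[np.arange(len(labels)), safe]
    return np.where(labels < 0, 0.0, per_example)

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3]])
labels = np.array([0, -1])  # second example is ignored

losses = sparse_ce_with_ignore(logits, labels)
print(losses)  # second entry is exactly 0.0
```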
