
Computes softmax cross entropy between logits and labels. (deprecated arguments)

```python
tf.compat.v1.nn.softmax_cross_entropy_with_logits_v2(
    labels, logits, axis=None, name=None, dim=None
)
```

**Warning:** Some arguments are deprecated: `(dim)`. They will be removed in a future version. Instructions for updating: `dim` is deprecated, use `axis` instead.

Measures the probability error in discrete classification tasks in which the classes are mutually exclusive (each entry is in exactly one class). For example, each CIFAR-10 image is labeled with one and only one label: an image can be a dog or a truck, but not both.
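As a plain-NumPy sketch of what this op computes (the names `softmax_cross_entropy`, `labels`, and `logits` below are illustrative, not part of the TensorFlow API), the loss for each example is the cross entropy between the label distribution and the softmax of the logits:

```python
import numpy as np

def softmax_cross_entropy(labels, logits, axis=-1):
    # Numerically stable log-softmax: shift by the max before exponentiating.
    shifted = logits - np.max(logits, axis=axis, keepdims=True)
    log_softmax = shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))
    # Cross entropy: -sum(labels * log_softmax) along the class dimension.
    return -np.sum(labels * log_softmax, axis=axis)

# One-hot labels for a batch of 2 examples with 3 classes.
labels = np.array([[1.0, 0.0, 0.0],
                   [0.0, 0.0, 1.0]])
logits = np.array([[2.0, 0.5, 0.1],
                   [0.3, 0.2, 3.0]])
loss = softmax_cross_entropy(labels, logits)
print(loss.shape)  # (2,) — one scalar loss per example
```

Note that the class dimension is reduced away: the result has one loss value per example, matching the shape contract described under Returns below.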


**Note:** While the classes are mutually exclusive, their probabilities need not be. All that is required is that each row of labels is a valid probability distribution. If they are not, the computation of the gradient will be incorrect.

If using exclusive labels (wherein one and only one class is true at a time), see sparse_softmax_cross_entropy_with_logits.

**Warning:** This op expects unscaled logits, since it performs a softmax on logits internally for efficiency. Do not call this op with the output of softmax, as it will produce incorrect results.
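A NumPy sketch of why this matters (the helper `xent` is illustrative, not a TensorFlow function): applying the op to probabilities effectively softmaxes the input twice, which changes the loss value.

```python
import numpy as np

def xent(labels, logits, axis=-1):
    # Internally the op applies log-softmax, then cross entropy.
    shifted = logits - np.max(logits, axis=axis, keepdims=True)
    log_sm = shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))
    return -np.sum(labels * log_sm, axis=axis)

labels = np.array([[1.0, 0.0, 0.0]])
logits = np.array([[4.0, 1.0, 0.0]])

correct = xent(labels, logits)  # op applied to raw logits, as intended

# Mistake: feed the op the softmax output instead of the raw logits.
probs = np.exp(logits) / np.sum(np.exp(logits), axis=-1, keepdims=True)
wrong = xent(labels, probs)     # double softmax: a different, incorrect loss

print(correct[0], wrong[0])  # values differ
```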

A common use case is to have logits and labels of shape `[batch_size, num_classes]`, but higher dimensions are supported, with the `axis` argument specifying the class dimension.

logits and labels must have the same dtype (either float16, float32, or float64).

Backpropagation will occur into both logits and labels. To disallow backpropagation into labels, pass label tensors through tf.stop_gradient before feeding it to this function.
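When the labels are held constant (as with tf.stop_gradient), the gradient of this loss with respect to the logits is the well-known expression softmax(logits) - labels. The NumPy sketch below (not TensorFlow code) checks that analytic gradient against a finite-difference estimate:

```python
import numpy as np

def xent(labels, logits):
    # Scalar softmax cross entropy for a single example.
    shifted = logits - logits.max()
    log_sm = shifted - np.log(np.exp(shifted).sum())
    return -(labels * log_sm).sum()

labels = np.array([0.0, 1.0, 0.0])   # treated as a constant, as with tf.stop_gradient
logits = np.array([0.5, 1.5, -0.2])

# Analytic gradient w.r.t. logits: softmax(logits) - labels.
probs = np.exp(logits - logits.max())
probs /= probs.sum()
analytic = probs - labels

# Central finite-difference check of each component.
eps = 1e-6
numeric = np.zeros_like(logits)
for i in range(len(logits)):
    e = np.zeros_like(logits)
    e[i] = eps
    numeric[i] = (xent(labels, logits + e) - xent(labels, logits - e)) / (2 * eps)

print(np.max(np.abs(analytic - numeric)))  # near machine precision
```

Without stop_gradient there is an additional gradient term flowing into labels, which is rarely what a training loop wants.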

**Note that to avoid confusion, it is required to pass only named arguments to this function.**

## Args

labels | Each vector along the class dimension should hold a valid probability distribution, e.g. for the case in which labels are of shape `[batch_size, num_classes]`, each row `labels[i]` must be a valid probability distribution. |

logits | Unscaled log probabilities. |

axis | The class dimension. Defaults to -1, which is the last dimension. |

name | A name for the operation (optional). |

dim | Deprecated alias for axis. |

## Returns

A Tensor that contains the softmax cross entropy loss. Its type is the same as logits and its shape is the same as labels except that it does not have the last dimension of labels.
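The shape contract also holds for higher-rank inputs: the result is labels' shape with the class dimension removed. A NumPy sketch (illustrative helper, not the TensorFlow op):

```python
import numpy as np

def xent(labels, logits, axis=-1):
    shifted = logits - np.max(logits, axis=axis, keepdims=True)
    log_sm = shifted - np.log(np.sum(np.exp(shifted), axis=axis, keepdims=True))
    return -np.sum(labels * log_sm, axis=axis)

rng = np.random.default_rng(0)
logits = rng.normal(size=(2, 4, 3))             # e.g. [batch, time, num_classes]
labels = np.eye(3)[rng.integers(0, 3, (2, 4))]  # one-hot along the last axis

loss = xent(labels, logits)
print(loss.shape)  # (2, 4): labels' shape minus its last (class) dimension
```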

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates. Some content is licensed under the numpy license.