
About the cross-entropy loss. #13

@aa1234241

Hi, I read the source code for training and came across the cross-entropy loss implementation. Here's the code snippet:

    elif self.costFunction == "cross_entropy":
        epsilon = 1e-12  # prevent zero argument in logarithm or division
        error = -(y * ncp.log(z + epsilon) + (1 - y) * ncp.log(1 - z + epsilon))

I noticed that this applies the binary cross-entropy loss elementwise, even though the task is multi-class classification. I'm curious whether there's a particular reason or insight behind using it instead of the usual categorical cross-entropy loss. Thank you!
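For concreteness, here is a minimal sketch (plain NumPy, not the repository's code; the values of y and z are made up for illustration) of how the two losses differ on a one-hot target. The elementwise binary form also penalizes the (1 - y) * log(1 - z) terms of the non-target classes, while categorical cross-entropy keeps only the target-class term:

    import numpy as np

    epsilon = 1e-12  # same numerical guard as in the snippet above
    y = np.array([0.0, 1.0, 0.0])  # one-hot label: true class is index 1
    z = np.array([0.1, 0.7, 0.2])  # hypothetical softmax output

    # Elementwise binary cross-entropy, summed over classes (as in the snippet):
    bce = -(y * np.log(z + epsilon) + (1 - y) * np.log(1 - z + epsilon)).sum()

    # Categorical cross-entropy: only the true class contributes.
    cce = -(y * np.log(z + epsilon)).sum()

    print(bce)  # ~0.685: -log(0.7) - log(0.9) - log(0.8)
    print(cce)  # ~0.357: -log(0.7) only

One practical difference, assuming a softmax output layer: with categorical cross-entropy the gradient with respect to the pre-softmax activations reduces to the simple form z - y, whereas the elementwise binary form does not simplify that cleanly.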
