loss

package
v1.0.0
Published: Feb 15, 2024 License: MIT Imports: 4 Imported by: 0

Documentation

Overview

Package loss contains loss functions: different ways of computing the degree of error between the ANN's predictions and the actual labels of the data.

Index

Constants

This section is empty.

Variables

This section is empty.

Functions

This section is empty.

Types

type CCELossWithSoftmax

type CCELossWithSoftmax[T Float] struct {
	CategoricalCrossEntropyLoss[T]
}

CCELossWithSoftmax is a loss function used to determine the amount of error by which the weights should be corrected, i.e. the cost.

Since CCE is often paired with a Softmax activation in the last layer, this loss function combines the effect of the two so that the derivative can be computed efficiently.

IMPORTANT: should only be used with the SoftmaxWithCCE activation function!
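
The efficiency presumably comes from the well-known softmax + cross-entropy identity: for a softmax output yHat and a one-hot label y, the derivative of the combined expression with respect to the pre-softmax logits reduces to yHat - y, so no division by yHat is needed. A minimal, self-contained sketch of that identity in plain float64 (not this package's generic implementation):

package main

import (
	"fmt"
	"math"
)

// softmax converts raw logits into a probability distribution.
func softmax(z []float64) []float64 {
	sum := 0.0
	out := make([]float64, len(z))
	for i, v := range z {
		out[i] = math.Exp(v)
		sum += out[i]
	}
	for i := range out {
		out[i] /= sum
	}
	return out
}

func main() {
	z := []float64{2.0, 1.0, 0.1} // logits from the last layer
	y := []float64{1, 0, 0}       // one-hot label
	yHat := softmax(z)

	// Combined softmax+CCE gradient w.r.t. the logits: yHat - y.
	grad := make([]float64, len(z))
	for i := range z {
		grad[i] = yHat[i] - y[i]
	}
	fmt.Println(grad)
}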

func (CCELossWithSoftmax[T]) ApplyDerivativeMatrix

func (l CCELossWithSoftmax[T]) ApplyDerivativeMatrix(y Matrix[T], yHat Matrix[T]) Matrix[T]

type CategoricalCrossEntropyLoss

type CategoricalCrossEntropyLoss[T Float] struct {
	Epsilon float64
}

CategoricalCrossEntropyLoss is a loss function for comparing two probability distributions.

CCE(pred, label) = -label*Log(pred)
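
A self-contained sketch of the formula above in plain float64 (Epsilon is presumably a small constant guarding against Log(0); the clamping below is an assumption, not this package's exact behaviour):

package main

import (
	"fmt"
	"math"
)

// cce computes -label * log(pred), clamping pred away from zero
// with a small epsilon so the logarithm stays finite.
func cce(label, pred, epsilon float64) float64 {
	if pred < epsilon {
		pred = epsilon
	}
	return -label * math.Log(pred)
}

func main() {
	// One-hot label and a predicted distribution; the total loss
	// is the sum over the classes.
	labels := []float64{0, 1, 0}
	preds := []float64{0.2, 0.7, 0.1}

	total := 0.0
	for i := range labels {
		total += cce(labels[i], preds[i], 1e-9)
	}
	fmt.Printf("CCE = %.4f\n", total) // -ln(0.7) ≈ 0.3567
}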

func (CategoricalCrossEntropyLoss[T]) Apply

func (l CategoricalCrossEntropyLoss[T]) Apply(y, yHat T) T

func (CategoricalCrossEntropyLoss[T]) ApplyDerivative

func (l CategoricalCrossEntropyLoss[T]) ApplyDerivative(y, yHat T) T

func (CategoricalCrossEntropyLoss[T]) ApplyDerivativeMatrix

func (l CategoricalCrossEntropyLoss[T]) ApplyDerivativeMatrix(y Matrix[T], yHat Matrix[T]) Matrix[T]

func (CategoricalCrossEntropyLoss[T]) ApplyMatrix

func (l CategoricalCrossEntropyLoss[T]) ApplyMatrix(y Matrix[T], yHat Matrix[T]) Matrix[T]

type LogLoss

type LogLoss[T Float] struct{}

LogLoss is a logarithmic loss function, used for probability prediction.

LogLoss(pred, label) = -label*Log(pred) - (1-label)*Log(1 - pred)
dLogLoss/dpred = (pred - label) / (pred - pred*pred)
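
A standalone check of the two formulas above in plain float64 (not this package's generic implementation):

package main

import (
	"fmt"
	"math"
)

// logLoss is -label*log(pred) - (1-label)*log(1-pred).
func logLoss(label, pred float64) float64 {
	return -label*math.Log(pred) - (1-label)*math.Log(1-pred)
}

// logLossDerivative is (pred - label) / (pred - pred*pred).
func logLossDerivative(label, pred float64) float64 {
	return (pred - label) / (pred - pred*pred)
}

func main() {
	label, pred := 1.0, 0.8
	fmt.Printf("loss = %.4f\n", logLoss(label, pred))                  // -ln(0.8) ≈ 0.2231
	fmt.Printf("dloss/dpred = %.4f\n", logLossDerivative(label, pred)) // (0.8-1)/(0.8-0.64) = -1.25
}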

func (LogLoss[T]) Apply

func (l LogLoss[T]) Apply(y, yHat T) T

func (LogLoss[T]) ApplyDerivative

func (l LogLoss[T]) ApplyDerivative(y, yHat T) T

func (LogLoss[T]) ApplyDerivativeMatrix

func (l LogLoss[T]) ApplyDerivativeMatrix(y Matrix[T], yHat Matrix[T]) Matrix[T]

func (LogLoss[T]) ApplyMatrix

func (l LogLoss[T]) ApplyMatrix(y Matrix[T], yHat Matrix[T]) Matrix[T]

type LossFunction

type LossFunction[T Float] interface {
	Apply(y T, yHat T) T
	ApplyMatrix(y Matrix[T], yHat Matrix[T]) Matrix[T]

	ApplyDerivative(y T, yHat T) T
	ApplyDerivativeMatrix(y Matrix[T], yHat Matrix[T]) Matrix[T]
}

LossFunction is a general interface for all loss functions applied to the final output of an ANN.

Apply applies the loss function to scalars; this is, in effect, the corresponding cost function.

ApplyMatrix applies the loss function to the prediction and label matrices. Separate columns are treated as separate outputs within the batch, so for a batch of N samples a 1xN matrix is produced.

ApplyDerivative applies the derivative of the loss with respect to the ANN prediction.

ApplyDerivativeMatrix applies the derivative w.r.t. the ANN prediction. Separate columns are treated as separate outputs of the ANN within the batch.
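
A self-contained sketch of a type satisfying this interface. The Float constraint, Matrix type, and the toy absLoss below are stand-ins so the sketch compiles on its own; none of them are this package's actual definitions:

package main

import "fmt"

// Stand-ins; NOT the package's actual Float constraint or Matrix type.
type Float interface{ ~float32 | ~float64 }

type Matrix[T Float] [][]T // rows x columns; columns are batch entries

type LossFunction[T Float] interface {
	Apply(y T, yHat T) T
	ApplyMatrix(y Matrix[T], yHat Matrix[T]) Matrix[T]

	ApplyDerivative(y T, yHat T) T
	ApplyDerivativeMatrix(y Matrix[T], yHat Matrix[T]) Matrix[T]
}

// absLoss is a toy loss: |pred - label|.
type absLoss[T Float] struct{}

func (absLoss[T]) Apply(y, yHat T) T {
	if yHat > y {
		return yHat - y
	}
	return y - yHat
}

func (absLoss[T]) ApplyDerivative(y, yHat T) T {
	if yHat > y {
		return 1
	}
	return -1
}

// ApplyMatrix sums the per-element losses of each column into a 1xN row,
// one entry per batch sample, mirroring the behaviour described above.
func (l absLoss[T]) ApplyMatrix(y, yHat Matrix[T]) Matrix[T] {
	out := Matrix[T]{make([]T, len(y[0]))}
	for c := range y[0] {
		for r := range y {
			out[0][c] += l.Apply(y[r][c], yHat[r][c])
		}
	}
	return out
}

// ApplyDerivativeMatrix applies the scalar derivative element-wise.
func (l absLoss[T]) ApplyDerivativeMatrix(y, yHat Matrix[T]) Matrix[T] {
	out := make(Matrix[T], len(y))
	for r := range y {
		out[r] = make([]T, len(y[r]))
		for c := range y[r] {
			out[r][c] = l.ApplyDerivative(y[r][c], yHat[r][c])
		}
	}
	return out
}

func main() {
	var loss LossFunction[float64] = absLoss[float64]{}
	y := Matrix[float64]{{1, 0}, {0, 1}} // 2 outputs, batch of 2 (columns)
	yHat := Matrix[float64]{{0.9, 0.2}, {0.1, 0.8}}
	fmt.Println(loss.ApplyMatrix(y, yHat)) // ≈ [[0.2 0.4]] — 1x2, one loss per sample
}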

func DynamicLoss

func DynamicLoss[T Float](lossName string) (LossFunction[T], error)

DynamicLoss returns a loss function by its full name. This is identical to importing and initializing the loss function directly.
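
A self-contained sketch of the lookup-by-name pattern that DynamicLoss presumably implements. The registry, the reduced interface, and the name string below are illustrative only; the exact name strings the real function accepts are not listed on this page:

package main

import "fmt"

// scalarLoss is a stand-in for the package's LossFunction interface,
// reduced to the scalar method for brevity.
type scalarLoss interface{ Apply(y, yHat float64) float64 }

type squareLoss struct{}

func (squareLoss) Apply(y, yHat float64) float64 { return 0.5 * (yHat - y) * (yHat - y) }

// dynamicLoss returns a loss by name, mirroring the idea behind DynamicLoss.
func dynamicLoss(name string) (scalarLoss, error) {
	switch name {
	case "SquareLoss": // hypothetical name string
		return squareLoss{}, nil
	default:
		return nil, fmt.Errorf("unknown loss %q", name)
	}
}

func main() {
	l, err := dynamicLoss("SquareLoss")
	if err != nil {
		panic(err)
	}
	fmt.Println(l.Apply(1.0, 0.8)) // 0.5 * (-0.2)^2 ≈ 0.02
}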

type SquareLoss

type SquareLoss[T Float] struct{}

SquareLoss is an MSE loss function.

SquareLoss(pred, label) = 0.5 * (pred-label)**2
dSquareLoss/dpred = pred - label

For the convenience of taking the derivative, SquareLoss is multiplied by 1/2.
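
A quick standalone check in plain float64: the 1/2 factor cancels the 2 from the power rule, leaving the clean derivative pred - label.

package main

import "fmt"

func squareLoss(label, pred float64) float64     { return 0.5 * (pred - label) * (pred - label) }
func squareLossGrad(label, pred float64) float64 { return pred - label }

func main() {
	label, pred := 1.0, 0.75
	fmt.Println(squareLoss(label, pred))     // 0.5 * 0.0625 = 0.03125
	fmt.Println(squareLossGrad(label, pred)) // -0.25
}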

func (SquareLoss[T]) Apply

func (s SquareLoss[T]) Apply(y, yHat T) T

func (SquareLoss[T]) ApplyDerivative

func (s SquareLoss[T]) ApplyDerivative(y, yHat T) T

func (SquareLoss[T]) ApplyDerivativeMatrix

func (s SquareLoss[T]) ApplyDerivativeMatrix(y Matrix[T], yHat Matrix[T]) Matrix[T]

func (SquareLoss[T]) ApplyMatrix

func (s SquareLoss[T]) ApplyMatrix(y Matrix[T], yHat Matrix[T]) Matrix[T]
