analogvnn.nn.activation.Gaussian

Module Contents

Classes

Gaussian – Implements the Gaussian activation function.

GeLU – Implements the Gaussian error linear unit (GeLU) activation function.

class analogvnn.nn.activation.Gaussian.Gaussian

Bases: analogvnn.nn.activation.Activation.Activation

Implements the Gaussian activation function.

static forward(x: torch.Tensor) → torch.Tensor

Forward pass of the Gaussian activation function.

Parameters:

x (Tensor) – the input tensor.

Returns:

the output tensor.

Return type:

Tensor
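
A minimal usage sketch. Because forward() is static, it can be called without instantiating the class. The element-wise formula f(x) = exp(-x**2) in the comment is an assumption, not confirmed by this page:

   import torch

   from analogvnn.nn.activation.Gaussian import Gaussian

   x = torch.linspace(-2.0, 2.0, steps=5)
   y = Gaussian.forward(x)  # assumed: element-wise f(x) = exp(-x**2)
   print(y.shape)           # same shape as the input: torch.Size([5])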

backward(grad_output: Optional[torch.Tensor]) → Optional[torch.Tensor]

Backward pass of the Gaussian activation function.

Parameters:

grad_output (Optional[Tensor]) – the gradient of the output tensor.

Returns:

the gradient of the input tensor.

Return type:

Optional[Tensor]
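
For reference, if forward computes exp(-x**2), the chain rule gives d/dx exp(-x**2) = -2x * exp(-x**2), so the backward pass scales grad_output by that factor. A hypothetical re-derivation (gaussian_backward_sketch is a helper invented here, not part of the library), checked against autograd:

   import torch

   def gaussian_backward_sketch(x: torch.Tensor, grad_output: torch.Tensor) -> torch.Tensor:
       # Assumed derivative: d/dx exp(-x^2) = -2 * x * exp(-x^2)
       return grad_output * (-2.0 * x * torch.exp(-x.pow(2)))

   x = torch.randn(4, requires_grad=True)
   torch.exp(-x.pow(2)).backward(torch.ones_like(x))
   expected = gaussian_backward_sketch(x.detach(), torch.ones_like(x))
   assert torch.allclose(x.grad, expected)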

class analogvnn.nn.activation.Gaussian.GeLU

Bases: analogvnn.nn.activation.Activation.Activation

Implements the Gaussian error linear unit (GeLU) activation function.

static forward(x: torch.Tensor) → torch.Tensor

Forward pass of the Gaussian error linear unit (GeLU) activation function.

Parameters:

x (Tensor) – the input tensor.

Returns:

the output tensor.

Return type:

Tensor
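
A usage sketch comparing the output against PyTorch's exact (erf-based) GELU, x * Phi(x) = 0.5 * x * (1 + erf(x / sqrt(2))); whether GeLU.forward uses this exact formulation is an assumption here:

   import torch
   import torch.nn.functional as F

   from analogvnn.nn.activation.Gaussian import GeLU

   x = torch.linspace(-2.0, 2.0, steps=5)
   y = GeLU.forward(x)

   # PyTorch's exact GELU as a reference point
   print(torch.allclose(y, F.gelu(x)))  # True only if GeLU uses the erf-based formulation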

backward(grad_output: Optional[torch.Tensor]) → Optional[torch.Tensor]

Backward pass of the Gaussian error linear unit (GeLU) activation function.

Parameters:

grad_output (Optional[Tensor]) – the gradient of the output tensor.

Returns:

the gradient of the input tensor.

Return type:

Optional[Tensor]
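
If forward implements the exact GELU, its derivative is Phi(x) + x * phi(x), where phi and Phi are the standard normal pdf and cdf. A hypothetical sketch of that derivative (gelu_backward_sketch is invented here, not part of the library), verified against autograd:

   import math

   import torch
   import torch.nn.functional as F

   def gelu_backward_sketch(x: torch.Tensor, grad_output: torch.Tensor) -> torch.Tensor:
       phi = torch.exp(-0.5 * x.pow(2)) / math.sqrt(2.0 * math.pi)  # standard normal pdf
       Phi = 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))            # standard normal cdf
       return grad_output * (Phi + x * phi)

   x = torch.randn(4, requires_grad=True)
   F.gelu(x).backward(torch.ones_like(x))
   expected = gelu_backward_sketch(x.detach(), torch.ones_like(x))
   assert torch.allclose(x.grad, expected, atol=1e-6)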