analogvnn.nn.activation.ReLU#

Module Contents#

Classes#

PReLU – Implements the parametric rectified linear unit (PReLU) activation function.

ReLU – Implements the rectified linear unit (ReLU) activation function.

LeakyReLU – Implements the leaky rectified linear unit (LeakyReLU) activation function.

class analogvnn.nn.activation.ReLU.PReLU(alpha: float)[source]#

Bases: analogvnn.nn.activation.Activation.Activation

Implements the parametric rectified linear unit (PReLU) activation function.

Variables:
  • alpha (float) – the slope of the negative part of the activation function.

  • _zero (Tensor) – a constant zero tensor.

__constants__ = ['alpha', '_zero'][source]#
alpha: torch.nn.Parameter[source]#
_zero: torch.nn.Parameter[source]#
forward(x: torch.Tensor) → torch.Tensor[source]#

Forward pass of the parametric rectified linear unit (PReLU) activation function.

Parameters:
  x (Tensor) – the input tensor.

Returns:
  the output tensor.

Return type:
  Tensor
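
The forward pass follows the standard PReLU definition. A minimal standalone sketch of the equivalent computation (the helper name and the explicit zero tensor are illustrative, not the module's internals):

    import torch

    def prelu_forward(x: torch.Tensor, alpha: float) -> torch.Tensor:
        # PReLU keeps positive values unchanged and scales negative values by alpha.
        zero = torch.zeros_like(x)
        return torch.maximum(zero, x) + alpha * torch.minimum(zero, x)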

backward(grad_output: Optional[torch.Tensor]) → Optional[torch.Tensor][source]#

Backward pass of the parametric rectified linear unit (PReLU) activation function.

Parameters:
  grad_output (Optional[Tensor]) – the gradient of the output tensor.

Returns:
  the gradient of the input tensor.

Return type:
  Optional[Tensor]
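
The PReLU derivative is 1 for positive inputs and alpha for negative inputs, so the incoming gradient is scaled accordingly. A minimal sketch of that rule (illustrative only; the module's backward receives just grad_output, so the explicit x argument here is a simplification):

    import torch

    def prelu_backward(grad_output: torch.Tensor, x: torch.Tensor, alpha: float) -> torch.Tensor:
        # d/dx PReLU(x) is 1 where x > 0 and alpha elsewhere.
        local_grad = torch.where(x > 0, torch.ones_like(x), torch.full_like(x, alpha))
        return grad_output * local_grad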

static initialise(tensor: torch.Tensor) → torch.Tensor[source]#

Initialises the tensor with Kaiming uniform initialisation, using the gain associated with the PReLU activation function.

Parameters:
  tensor (Tensor) – the tensor to be initialized.

Returns:
  the initialized tensor.

Return type:
  Tensor

static initialise_(tensor: torch.Tensor) → torch.Tensor[source]#

In-place initialisation of the tensor with Kaiming uniform initialisation, using the gain associated with the PReLU activation function.

Parameters:
  tensor (Tensor) – the tensor to be initialized.

Returns:
  the initialized tensor.

Return type:
  Tensor
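
Both helpers apply Kaiming uniform initialisation with an activation-appropriate gain; initialise_ does so in place. A rough equivalent built on torch.nn.init (the negative-slope value passed as a is an assumption for illustration, not taken from the module):

    import math

    import torch
    from torch import nn

    def kaiming_uniform_for_prelu(tensor: torch.Tensor) -> torch.Tensor:
        # Kaiming uniform with a leaky-ReLU-style gain; a=math.sqrt(5) is an
        # assumed slope for illustration, not the value used by the module.
        return nn.init.kaiming_uniform_(tensor, a=math.sqrt(5), nonlinearity='leaky_relu')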

class analogvnn.nn.activation.ReLU.ReLU[source]#

Bases: PReLU

Implements the rectified linear unit (ReLU) activation function.

Variables:
  alpha (float) – the slope of the negative part, fixed at 0.

static initialise(tensor: torch.Tensor) → torch.Tensor[source]#

Initialises the tensor with Kaiming uniform initialisation, using the gain associated with the ReLU activation function.

Parameters:
  tensor (Tensor) – the tensor to be initialized.

Returns:
  the initialized tensor.

Return type:
  Tensor

static initialise_(tensor: torch.Tensor) → torch.Tensor[source]#

In-place initialisation of the tensor with Kaiming uniform initialisation, using the gain associated with the ReLU activation function.

Parameters:
  tensor (Tensor) – the tensor to be initialized.

Returns:
  the initialized tensor.

Return type:
  Tensor

class analogvnn.nn.activation.ReLU.LeakyReLU[source]#

Bases: PReLU

Implements the leaky rectified linear unit (LeakyReLU) activation function.

Variables:
  alpha (float) – the slope of the negative part, fixed at 0.01.
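
The three classes differ only in how the negative-part slope is set. A hedged usage sketch (assuming the Activation base class is callable like a standard torch.nn.Module; the alpha value 0.25 is illustrative):

    import torch

    from analogvnn.nn.activation.ReLU import LeakyReLU, PReLU, ReLU

    x = torch.tensor([-2.0, -0.5, 0.0, 1.5])

    relu = ReLU()               # negative inputs clamped to 0
    leaky = LeakyReLU()         # negative inputs scaled by 0.01
    prelu = PReLU(alpha=0.25)   # user-chosen slope; 0.25 is illustrative

    print(relu(x))
    print(leaky(x))
    print(prelu(x))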