Transfer functions

Module containing various transfer functions used in the context of neural networks.
breze.arch.component.transfer.tanh(inpt)

    Tanh activation function.

    \[f(x) = \tanh(x)\]

    Parameters:
        inpt : Theano variable
            Input to be transformed.

    Returns:
        output : Theano variable
            Transformed output. Same shape as inpt.
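To illustrate the formula, a minimal NumPy sketch of the same elementwise operation (not the Theano-based breze implementation):

```python
import numpy as np

def tanh(inpt):
    # Elementwise hyperbolic tangent; output lies in (-1, 1)
    # and has the same shape as the input.
    return np.tanh(inpt)

x = np.array([-2.0, 0.0, 2.0])
y = tanh(x)  # values squashed into (-1, 1), zero maps to zero
```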
breze.arch.component.transfer.tanhplus(inpt)

    Tanh with added linear activation function.

    \[f(x) = \tanh(x) + x\]

    Parameters:
        inpt : Theano variable
            Input to be transformed.

    Returns:
        output : Theano variable
            Transformed output. Same shape as inpt.
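A NumPy sketch of the formula (for illustration only; the breze function operates on Theano variables):

```python
import numpy as np

def tanhplus(inpt):
    # Saturating tanh term plus an identity term: the output is
    # unbounded and the derivative never falls below zero.
    return np.tanh(inpt) + inpt
```

The added linear component keeps gradients from vanishing for large inputs, where plain tanh saturates.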
breze.arch.component.transfer.sigmoid(inpt)

    Sigmoid activation function.

    \[f(x) = {1 \over 1 + \exp(-x)}\]

    Parameters:
        inpt : Theano variable
            Input to be transformed.

    Returns:
        output : Theano variable
            Transformed output. Same shape as inpt.
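A NumPy sketch of the logistic sigmoid (illustrative, not the breze/Theano code):

```python
import numpy as np

def sigmoid(inpt):
    # Logistic sigmoid: maps any real value into (0, 1),
    # with f(0) = 0.5.
    return 1.0 / (1.0 + np.exp(-inpt))
```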
breze.arch.component.transfer.rectifier(inpt)

    Rectifier activation function.

    \[f(x) = \max(0, x)\]

    Parameters:
        inpt : Theano variable
            Input to be transformed.

    Returns:
        output : Theano variable
            Transformed output. Same shape as inpt.
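A NumPy sketch of the rectifier (ReLU), again only to illustrate the formula:

```python
import numpy as np

def rectifier(inpt):
    # Elementwise max(0, x): negative entries are clipped to zero,
    # positive entries pass through unchanged.
    return np.maximum(0, inpt)
```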
breze.arch.component.transfer.softplus(inpt)

    Soft plus activation function. Smooth approximation to rectifier.

    \[f(x) = \log(1 + \exp(x))\]

    Parameters:
        inpt : Theano variable
            Input to be transformed.

    Returns:
        output : Theano variable
            Transformed output. Same shape as inpt.
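A NumPy sketch of soft plus; `np.log1p` is used here for better accuracy near zero (this is an illustration of the formula, not the breze implementation):

```python
import numpy as np

def softplus(inpt):
    # log(1 + exp(x)): smooth, everywhere-differentiable
    # approximation to max(0, x).
    return np.log1p(np.exp(inpt))
```

For large positive x, softplus(x) approaches x; for large negative x, it approaches 0, mirroring the rectifier.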
breze.arch.component.transfer.softsign(inpt)

    Softsign activation function.

    \[f(x) = {x \over 1 + |x|}\]

    Parameters:
        inpt : Theano variable
            Input to be transformed.

    Returns:
        output : Theano variable
            Transformed output. Same shape as inpt.
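A NumPy sketch of softsign (illustrative only):

```python
import numpy as np

def softsign(inpt):
    # x / (1 + |x|): like tanh it squashes into (-1, 1),
    # but it approaches its asymptotes polynomially rather
    # than exponentially.
    return inpt / (1.0 + np.abs(inpt))
```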
breze.arch.component.transfer.softmax(inpt)

    Softmax activation function.

    \[f(x_i) = {\exp(x_i) \over \sum_j \exp(x_j)}\]

    Here, the index runs over the columns of inpt.

    Numerically stable version that subtracts the maximum of each row from all of its entries. Wrapper for theano.tensor.nnet.softmax.

    Parameters:
        inpt : Theano variable
            Array of shape (n, d). Input to be transformed.

    Returns:
        output : Theano variable
            Transformed output. Same shape as inpt.
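A NumPy sketch of the row-wise softmax, including the max-subtraction trick described above (an illustration of the technique, not the Theano wrapper itself):

```python
import numpy as np

def softmax(inpt):
    # Subtract each row's maximum before exponentiating: this leaves
    # the result unchanged mathematically but prevents overflow in exp.
    shifted = inpt - inpt.max(axis=1, keepdims=True)
    e = np.exp(shifted)
    # Normalise so each row sums to one.
    return e / e.sum(axis=1, keepdims=True)

x = np.array([[1.0, 2.0, 3.0],
              [1000.0, 1000.0, 1000.0]])
p = softmax(x)  # each row is a probability distribution; no overflow
```

Without the subtraction, `np.exp(1000.0)` would overflow to infinity; with it, the second row cleanly evaluates to a uniform distribution.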