Leaky version of a Rectified Linear Unit activation layer.
Inherits From: Layer, Operation
```python
tf.keras.layers.LeakyReLU(
    negative_slope=0.3, **kwargs
)
```
This layer allows a small gradient when the unit is not active.
Formula:

```
f(x) = negative_slope * x if x < 0
f(x) = x if x >= 0
```

Example:
```python
import numpy as np
from tensorflow.keras.layers import LeakyReLU

leaky_relu_layer = LeakyReLU(negative_slope=0.5)
inputs = np.array([-10, -5, 0.0, 5, 10])
result = leaky_relu_layer(inputs)
# result = [-5. , -2.5, 0. , 5. , 10.]
```

| Args | |
|---|---|
| negative_slope | Float >= 0.0. Negative slope coefficient. Defaults to 0.3. |
| **kwargs | Base layer keyword arguments, such as name and dtype. |
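In practice the layer is used as a standalone activation between other layers. A minimal usage sketch, assuming a Keras 3 / TF 2.16+ environment where the argument is named negative_slope (the layer sizes and input shape below are arbitrary choices for illustration):

```python
import numpy as np
import tensorflow as tf

# Small model with LeakyReLU as a standalone activation layer.
# Negative pre-activations are scaled by 0.1 instead of being zeroed.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8),
    tf.keras.layers.LeakyReLU(negative_slope=0.1),
    tf.keras.layers.Dense(1),
])

out = model(np.random.randn(2, 4).astype("float32"))
print(out.shape)  # (2, 1)
```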
Methods
from_config
```python
@classmethod
from_config(
    config
)
```
Creates a layer from its config.
This method is the reverse of get_config, capable of instantiating the same layer from the config dictionary. It does not handle layer connectivity (handled by Network), nor weights (handled by set_weights).
| Args | |
|---|---|
| config | A Python dictionary, typically the output of get_config. |

| Returns | |
|---|---|
| A layer instance. | |
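A minimal round-trip sketch for this method, assuming tf.keras is available:

```python
import tensorflow as tf

# Serialize a configured layer to a plain Python dict ...
layer = tf.keras.layers.LeakyReLU(negative_slope=0.2)
config = layer.get_config()

# ... and rebuild an equivalent (unconnected, weight-free) layer from it.
restored = tf.keras.layers.LeakyReLU.from_config(config)
assert restored.get_config()["negative_slope"] == 0.2
```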
symbolic_call
```python
symbolic_call(
    *args, **kwargs
)
```