
Activation Functions

Sigmoid, ReLU, Tanh, GELU, Swish, ELU, Leaky ReLU, Softmax — curve plots, derivative overlays, saturation / dead-neuron analysis at any x.
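The saturation and dead-neuron behavior mentioned above can be checked numerically. A minimal sketch (not the tool's actual code, stdlib only): the sigmoid gradient collapses for large |x|, while ReLU's gradient is exactly zero for negative inputs.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # sigma'(x) = sigma(x) * (1 - sigma(x))
    s = sigmoid(x)
    return s * (1.0 - s)

def relu_grad(x):
    return 1.0 if x > 0 else 0.0

# Saturation: the sigmoid gradient peaks at 0.25 and vanishes
# for large |x|, which stalls learning in deep networks.
print(sigmoid_grad(0.0))    # 0.25 (maximum)
print(sigmoid_grad(5.0))    # ~0.0066
print(sigmoid_grad(-10.0))  # ~4.5e-05

# Dead neuron: a unit whose pre-activation stays negative
# receives zero gradient through ReLU and never recovers.
print(relu_grad(-2.0))      # 0.0
```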

Plot: f(x) and f′(x) for x ∈ [−5, 5] — Sigmoid, ReLU, and Tanh curves with dashed derivative overlays.
Values at x = 1.0

| Function | f(x)    | f′(x)   |
|----------|---------|---------|
| Sigmoid  | 0.73106 | 0.19661 |
| ReLU     | 1       | 1       |
| Tanh     | 0.76159 | 0.41997 |
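The values at x = 1.0 can be reproduced directly from the definitions. A minimal sketch using only the standard library:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

x = 1.0
s = sigmoid(x)
t = math.tanh(x)

# Sigmoid: f(1) and f'(1) = sigma * (1 - sigma)
print(round(s, 5), round(s * (1.0 - s), 5))        # 0.73106 0.19661
# ReLU: f(1) = max(0, 1), f'(1) = 1 since x > 0
print(max(0.0, x), 1.0 if x > 0 else 0.0)          # 1.0 1.0
# Tanh: f(1) and f'(1) = 1 - tanh^2
print(round(t, 5), round(1.0 - t * t, 5))          # 0.76159 0.41997
```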
Formulas

| Function   | f(x)                          | f′(x)                    |
|------------|-------------------------------|--------------------------|
| Sigmoid    | σ(x) = 1/(1+e⁻ˣ)              | σ(x)(1−σ(x))             |
| ReLU       | max(0, x)                     | 1 if x>0, else 0         |
| Leaky ReLU | x if x>0, else 0.01x          | 1 if x>0, else 0.01      |
| Tanh       | (eˣ−e⁻ˣ)/(eˣ+e⁻ˣ)             | 1−tanh²(x)               |
| GELU       | ≈ x·σ(1.702x)                 |                          |
| ELU        | x if x>0, else eˣ−1           | 1 if x>0, else eˣ        |
| Swish      | x·σ(x)                        | σ(x)+x·σ(x)(1−σ(x))      |
| Softmax    | eˣⁱ/Σⱼeˣʲ (array input)       |                          |