Linear vs ReLU activation

wkoityezc

It seems ReLU is often used as an activation function in neural networks, but I could not quite understand why. What advantages does it have compared to a regular linear activation?

Answers

JuneMOOre

This is a fairly broad question, so I would recommend reading more about activation functions and ReLU in particular. But here is a short answer: a linear activation function gives you a linear relationship between input and output, and you generally want your model to be able to learn non-linear relationships. You can achieve that with several different activation functions, ReLU being one of the most common.
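To make the difference concrete, here is a minimal sketch of the two activations, assuming numpy is available (this is my own illustration, not from any particular library's API):

```python
import numpy as np

def linear(x):
    # Linear (identity) activation: output is just the input, so the
    # whole layer stays a linear map of its inputs.
    return x

def relu(x):
    # ReLU: positive values pass through, negative values are clipped
    # to zero. This kink at zero is what introduces non-linearity.
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(linear(x))  # same values back, negatives included
print(relu(x))    # negatives replaced by 0, positives unchanged
```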

Comments
Tinkstudent
Can you recommend a good book?
Ifful

Making all layers in your model linear essentially collapses it to a single linear transformation, so the extra layers add no expressive power. You really need non-linear layers in your network.
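You can check this yourself with a small sketch, assuming numpy: two linear layers with no activation in between compose into one linear layer with merged weights and bias, so stacking them gains nothing.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4,))                              # example input
W1, b1 = rng.normal(size=(3, 4)), rng.normal(size=(3,))
W2, b2 = rng.normal(size=(2, 3)), rng.normal(size=(2,))

# Two stacked linear layers with no non-linearity in between...
two_layers = W2 @ (W1 @ x + b1) + b2

# ...equal a single linear layer with merged weights and bias.
W, b = W2 @ W1, W2 @ b1 + b2
one_layer = W @ x + b

print(np.allclose(two_layers, one_layer))  # True
```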
