👉 ReLU, short for Rectified Linear Unit, is a non-linear activation function that was popularized in deep learning research in the early 2010s and has since become the default activation in many deep neural networks. It is often one of the first activation functions students meet when learning about artificial neural networks, and it remains a workhorse for training deep learning models. The function itself is simple: it returns the input unchanged when the input is positive and returns 0 otherwise, i.e. ReLU(x) = max(0, x).
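
As a quick illustration, here is a minimal sketch of ReLU in Python with NumPy; the function name `relu` and the sample array are just for demonstration, not from any particular library:

```python
import numpy as np

def relu(x):
    # ReLU(x) = max(0, x): positive inputs pass through, negatives are clamped to 0.
    return np.maximum(0, x)

# Example: negative inputs map to 0, positive inputs are unchanged.
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]
```

Because the positive side is the identity, the gradient there is simply 1, which is one reason ReLU trains deep networks more easily than saturating activations such as sigmoid or tanh.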