ReLu Function:
Python has played an important role in building deep learning models, such as convolutional neural networks for image data, as well as traditional machine learning models. Building these models has become much easier thanks to Python's built-in modules and functions.
In the field of deep learning, the most commonly used activation function is ReLu, the Rectified Linear Unit (also called the rectified linear activation function).
The ReLu function is simple to implement in Python, and as an activation it improves the computational efficiency of deep learning models: it involves nothing more than a comparison with zero, which is far cheaper to compute than activations such as sigmoid or tanh.
According to the ReLu activation function, if the input is negative, return 0; otherwise, return the input unchanged. Equivalently, f(x) = max(0, x).
![ReLu Function](https://www.journaldev.com/wp-content/uploads/2020/11/relu-function.jpeg.webp)
ReLu Function with Examples in Python
Approach:
- Create a function, say ReLu(), which takes the given number as an argument and returns the maximum of 0 and the number.
- Return the maximum of 0 and the number passed as an argument.
- Give the first number as static input and store it in a variable.
- Pass the given number as an argument to the above-created ReLu() function and print the result.
- Give the second number as static input and store it in another variable.
- Pass the given number as an argument to the above-created ReLu() function and print the result.
- Exit the program.
Below is the implementation:
```python
# Create a function say ReLu which takes the given number as an argument
# and returns the maximum value of 0 and the number
def ReLu(gvn_num1):
    # Return the maximum value of 0 and the number passed as an argument
    return max(0.0, gvn_num1)


# Give the first number as static input and store it in a variable
gvn_num1 = 2.0
# Pass the given number as an argument to the above created
# ReLu() function and print the result
print(ReLu(gvn_num1))

# Give the second number as static input and store it in another variable
gvn_num2 = -3.0
# Pass the given number as an argument to the above created
# ReLu() function and print the result
print(ReLu(gvn_num2))
```
Output:
2.0
0.0
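In practice, ReLu is applied element-wise to whole arrays of activations rather than to single numbers. Below is a minimal vectorized sketch using NumPy; the function name relu_array is our own illustrative choice, not part of any library.

```python
import numpy as np

# Vectorized ReLu: np.maximum compares each element against 0.0
# and keeps whichever is larger, so every negative becomes 0.0
def relu_array(x):
    return np.maximum(0.0, x)

values = np.array([2.0, -3.0, 0.5, -0.1])
print(relu_array(values))  # elementwise: 2.0, 0.0, 0.5, 0.0
```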
Gradient value of the ReLu function
When training a model, calculating the derivative of the ReLu function shows that for values less than zero, i.e. negative values, the gradient is 0. This implies that the weights and biases of such units are not updated accordingly during backpropagation, which can cause issues with the model's training.
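To make this concrete, here is a minimal sketch of the ReLu derivative. The helper name relu_grad is our own; at x = 0 the derivative is technically undefined, and we follow the common convention of returning 0 there.

```python
# Derivative of ReLu: 1 for positive inputs, 0 otherwise
# (we adopt the common convention grad = 0 at x = 0)
def relu_grad(x):
    return 1.0 if x > 0 else 0.0

print(relu_grad(2.0))   # 1.0
print(relu_grad(-3.0))  # 0.0 -- no gradient flows, so the weights stop updating
```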
To solve this constraint of the ReLu function, we will look at the Leaky ReLu function.
Leaky ReLu function
The Leaky ReLu function is an extension of the standard ReLu function. To overcome the issue of a zero gradient for negative values, Leaky ReLu adds a very small linear component of x for negative inputs.
Leaky ReLu can be expressed mathematically as:
f(x) = 0.01x,  if x < 0
f(x) = x,      if x >= 0

Here 0.01 is the conventional slope for negative inputs; the example below uses a slightly smaller slope of 0.001, but the idea is the same.
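The payoff shows up in the derivative: unlike ReLu, the gradient for negative inputs is the small slope rather than 0, so weight updates never vanish entirely. A minimal sketch follows, using our own illustrative helper leaky_relu_grad with the 0.01 slope from the formula above.

```python
# Derivative of Leaky ReLu: 1 for positive inputs,
# the small slope (here 0.01) for negative inputs
def leaky_relu_grad(x, slope=0.01):
    return 1.0 if x > 0 else slope

print(leaky_relu_grad(2.0))   # 1.0
print(leaky_relu_grad(-3.0))  # 0.01 -- small but nonzero, so learning continues
```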
Example
Approach:
- Create a function, say Relu_fun(), which takes the given number as an argument and returns a number.
- Check if the given number is greater than 0 using an if conditional statement.
- If it is true, then return the given number.
- Else, return the given number multiplied by 0.001.
- Give the number as static input and store it in a variable.
- Pass the given number as an argument to the above-created Relu_fun() function and print the result.
- Exit the program.
Below is the implementation:
```python
# Create a function say Relu_fun which takes the given number as an argument.
def Relu_fun(gvn_num):
    # Check if the given number is greater than 0 using the if conditional statement
    if gvn_num > 0:
        # If it is true, then return the given number
        return gvn_num
    else:
        # Else return the given number multiplied by 0.001
        return 0.001 * gvn_num


# Give the number as static input and store it in a variable
gvn_num = -3
# Pass the given number as an argument to the above created Relu_fun() function
# and print the result
print(Relu_fun(gvn_num))
```
Output:
-0.003
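As with ReLu, the Leaky ReLu function is usually applied element-wise to arrays. Here is a minimal NumPy sketch; leaky_relu_array is our own name, and np.where picks x wherever the condition holds and slope * x everywhere else.

```python
import numpy as np

# Vectorized Leaky ReLu: keep x where x > 0, otherwise scale it
# by a small slope (0.001 here, matching the example above)
def leaky_relu_array(x, slope=0.001):
    return np.where(x > 0, x, slope * x)

values = np.array([2.0, -3.0, 0.5, -0.1])
print(leaky_relu_array(values))  # elementwise: 2.0, -0.003, 0.5, -0.0001
```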