
Commit a79dcf7

Merge pull request Open-Deep-ML#274 from Haleshot/haleshot/hard-sigmoid
New Problem: Hard Sigmoid
2 parents 4bdf8b1 + 00864ab commit a79dcf7

2 files changed

Lines changed: 69 additions & 0 deletions

File tree

Problems/96_Hard_Sigmoid/learn.md

Lines changed: 33 additions & 0 deletions
@@ -0,0 +1,33 @@
## Understanding the Hard Sigmoid Activation Function

The Hard Sigmoid is a piecewise linear approximation of the sigmoid activation function. It maintains similar characteristics to the standard sigmoid while being cheaper to compute, which makes it particularly useful in neural networks where computational efficiency is crucial.
### Mathematical Definition

The Hard Sigmoid function is mathematically defined as:

$$
\text{HardSigmoid}(x) = \begin{cases}
0 & \text{if } x \leq -2.5 \\
0.2x + 0.5 & \text{if } -2.5 < x < 2.5 \\
1 & \text{if } x \geq 2.5
\end{cases}
$$

Where $x$ is the input to the function.
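
The piecewise definition is equivalent to the closed form $\min(\max(0.2x + 0.5,\ 0),\ 1)$, i.e. a line clipped to $[0, 1]$. A minimal vectorized sketch of that form (illustrative only, not part of this diff; assumes NumPy is available):

```python
import numpy as np

def hard_sigmoid_vec(x: np.ndarray) -> np.ndarray:
    # Clipping 0.2x + 0.5 to [0, 1] reproduces the piecewise cases:
    # the lower clamp fires for x <= -2.5, the upper for x >= 2.5.
    return np.clip(0.2 * x + 0.5, 0.0, 1.0)

# Example: hard_sigmoid_vec(np.array([-3.0, 0.0, 1.0, 3.0]))
# -> array([0. , 0.5, 0.7, 1. ])
```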

### Characteristics

- **Output Range:** The output is always bounded in the range $[0, 1]$
- **Shape:** The function consists of three parts:
  - A constant value of 0 for inputs ≤ -2.5
  - A linear segment with slope 0.2 for inputs between -2.5 and 2.5
  - A constant value of 1 for inputs ≥ 2.5
- **Gradient:** The gradient is 0.2 in the linear region and 0 in the saturated regions (a sketch of the derivative follows this list)
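
Because the function is piecewise linear, its derivative is piecewise constant and trivially cheap to evaluate during backpropagation. A minimal sketch (illustrative, not part of this diff; `hard_sigmoid_grad` is a hypothetical helper name):

```python
def hard_sigmoid_grad(x: float) -> float:
    # 0.2 in the linear region, 0 in the saturated regions.
    # The kinks at x = +/-2.5 are not differentiable; returning 0
    # there is a conventional subgradient choice.
    return 0.2 if -2.5 < x < 2.5 else 0.0
```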

### Advantages in Neural Networks

This function is particularly useful in neural networks as it provides:

- Computational efficiency compared to the standard sigmoid
- A bounded output range similar to the sigmoid
- Simple gradient computation
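
As a rough illustration of how closely the approximation tracks the true sigmoid (a standalone comparison, not part of this diff; uses only the standard library):

```python
import math

def sigmoid(x: float) -> float:
    # Standard logistic sigmoid, shown for comparison.
    return 1.0 / (1.0 + math.exp(-x))

for x in (-3.0, -1.0, 0.0, 1.0, 3.0):
    hard = min(max(0.2 * x + 0.5, 0.0), 1.0)  # Hard Sigmoid
    print(f"x={x:+.1f}  sigmoid={sigmoid(x):.3f}  hard={hard:.3f}")
```

The two agree exactly at $x = 0$ and diverge most near the saturation points at $x = \pm 2.5$, where the linear segment is clipped.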
Lines changed: 36 additions & 0 deletions
@@ -0,0 +1,36 @@
def hard_sigmoid(x: float) -> float:
    """
    Implements the Hard Sigmoid activation function.

    Args:
        x (float): Input value

    Returns:
        float: The Hard Sigmoid of the input
    """
    if x <= -2.5:
        return 0.0
    elif x >= 2.5:
        return 1.0
    else:
        return 0.2 * x + 0.5


def test_hard_sigmoid():
    # Test case 1: x <= -2.5
    assert hard_sigmoid(-3.0) == 0.0, "Test case 1 failed"

    # Test case 2: x >= 2.5
    assert hard_sigmoid(3.0) == 1.0, "Test case 2 failed"

    # Test case 3: -2.5 < x < 2.5
    assert abs(hard_sigmoid(0.0) - 0.5) < 1e-6, "Test case 3 failed"
    assert abs(hard_sigmoid(1.0) - 0.7) < 1e-6, "Test case 4 failed"
    assert abs(hard_sigmoid(-1.0) - 0.3) < 1e-6, "Test case 5 failed"

    # Test boundary cases
    assert abs(hard_sigmoid(2.5) - 1.0) < 1e-6, "Test case 6 failed"
    assert abs(hard_sigmoid(-2.5) - 0.0) < 1e-6, "Test case 7 failed"


if __name__ == "__main__":
    test_hard_sigmoid()
    print("All Hard Sigmoid tests passed.")
