Logistic Regression
Logistic Regression predicts the probability that an input belongs to a class.
📌 Output is always between 0 and 1
1 → Yes / True / Positive
0 → No / False / Negative

| Problem | Output |
|---|---|
| Student Pass / Fail | 1 or 0 |
| Email Spam / Not Spam | 1 or 0 |
| Disease Present / Not Present | 1 or 0 |
| Loan Approved / Rejected | 1 or 0 |
Linear regression can output unbounded values like:
-10, 1.5, 120
These cannot be read as probabilities.
❌ Not suitable for classification.
Logistic Regression solves this by using the Sigmoid Function.
Formula:
σ(z) = 1 / (1 + e^(-z))
📈 It squashes any real value into the range (0, 1).
| z value | Output |
|---|---|
| -∞ | 0 |
| 0 | 0.5 |
| +∞ | 1 |
1️�⃣ Take input features
2️⃣ Multiply by weights
3️⃣ Add bias
4️⃣ Apply sigmoid function
5️⃣ Convert probability to class
📌 Decision rule:
If probability ≥ 0.5 → Class 1
Else → Class 0
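The five steps plus the decision rule can be sketched by hand. This is a minimal illustration, not the trained model: the weight and bias values below are made up, not learned from data.

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def predict_class(features, weights, bias, threshold=0.5):
    # Steps 2-3: weighted sum of inputs plus bias
    z = sum(w * x for w, x in zip(weights, features)) + bias
    # Step 4: sigmoid maps z to a probability in (0, 1)
    p = sigmoid(z)
    # Step 5: decision rule — probability >= threshold → class 1
    return 1 if p >= threshold else 0

# Hypothetical weight/bias for a single feature (study hours)
print(predict_class([3.0], weights=[1.2], bias=-3.0))
# z = 1.2*3.0 - 3.0 = 0.6, sigmoid(0.6) ≈ 0.65 → class 1
```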
| Study Hours | Result |
|---|---|
| 1 | Fail (0) |
| 2 | Fail (0) |
| 3 | Pass (1) |
| 4 | Pass (1) |
| 5 | Pass (1) |
📌 The model learns:
“More study hours → higher chance of passing”
```python
from sklearn.linear_model import LogisticRegression

# Study hours (X) vs. pass/fail result (y)
X = [[1], [2], [3], [4], [5]]
y = [0, 0, 1, 1, 1]

model = LogisticRegression()
model.fit(X, y)

# Predict pass/fail for 2.5 hours
print(model.predict([[2.5]]))

# Predicted probabilities for [fail, pass]
print(model.predict_proba([[2.5]]))
```
| Feature | Linear Regression | Logistic Regression |
|---|---|---|
| Output | Continuous | Binary (0/1) |
| Used for | Prediction | Classification |
| Function | Straight line | Sigmoid curve |
| Range | (-∞ to +∞) | (0 to 1) |
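The contrast in the table can be seen on the same toy data. This is a minimal sklearn sketch; the extrapolation point of 10 study hours is chosen purely for illustration:

```python
from sklearn.linear_model import LinearRegression, LogisticRegression

X = [[1], [2], [3], [4], [5]]
y = [0, 0, 1, 1, 1]

lin = LinearRegression().fit(X, y)
log = LogisticRegression().fit(X, y)

# Linear regression extrapolates past 1 — not a valid probability
print(lin.predict([[10]]))
# Logistic regression stays inside (0, 1)
print(log.predict_proba([[10]])[0, 1])
```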
✅ Simple and fast
✅ Easy to interpret
✅ Works well for binary classification
✅ Less computation
❌ Only for binary classification in its basic form
❌ Assumes a linear relationship between features and log-odds
❌ Not good for complex, non-linear data