Logistic regression uses the sigmoid (logistic) function to map a linear combination of input features to a probability, which makes it a natural choice for binary classification, particularly when the two classes are roughly linearly separable.
July 10, 2023 • 1 min read
Advantages
- Logistic Regression is one of the simplest machine learning algorithms to implement, yet it trains efficiently in many cases. For the same reason, fitting a model with this algorithm rarely requires much computational power.
- Logistic Regression works well when the dataset is linearly separable: a dataset is linearly separable if a straight line (more generally, a hyperplane) can separate the two classes of data from each other. If your Y variable takes only two values and the data is linearly separable, logistic regression is an efficient way to classify observations into the two classes (see the sketch after this list).
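As a minimal sketch of the idea above, the snippet below fits scikit-learn's `LogisticRegression` on a small synthetic, linearly separable dataset (the data and feature names are illustrative assumptions, not from the original post) and shows that the predicted probability is just the sigmoid applied to a linear function of the input.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Sigmoid maps any real-valued score to a probability in (0, 1)
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny synthetic, linearly separable dataset: one feature, two classes
X = np.array([[0.5], [1.0], [1.5], [3.5], [4.0], [4.5]])
y = np.array([0, 0, 0, 1, 1, 1])

model = LogisticRegression()
model.fit(X, y)

# The predicted probability of class 1 is the sigmoid of a linear score
z = model.intercept_ + model.coef_[0] * X.ravel()
print(np.allclose(sigmoid(z), model.predict_proba(X)[:, 1]))  # True
print(model.predict([[2.0]]))  # predicted class for a new observation
```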
Assumptions
- Binary logistic regression requires the dependent variable to be categorical with exactly two levels.
- Logistic regression requires the observations to be independent of each other. In other words, the observations should not come from repeated measurements or matched data.
- Logistic regression requires little or no multicollinearity among the independent variables; that is, the predictors should not be highly correlated with each other (a quick check is sketched below).
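One common way to screen for multicollinearity is to look at pairwise correlations among the predictors (variance inflation factors are another option). The sketch below uses a made-up feature matrix in which one variable is nearly collinear with another; the specific variables are assumptions for illustration only.

```python
import numpy as np

# Hypothetical predictors: 100 observations, 3 independent variables,
# where x3 is deliberately constructed to be nearly collinear with x1.
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = rng.normal(size=100)
x3 = 0.9 * x1 + 0.1 * rng.normal(size=100)
X = np.column_stack([x1, x2, x3])

# Pairwise correlations among predictors; values near +/-1 flag
# potential multicollinearity that violates the assumption.
corr = np.corrcoef(X, rowvar=False)
print(np.round(corr, 2))
```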