Machine Learning and Titanic
I first learned about Kaggle’s Titanic — Machine Learning from Disaster a few months ago. At that time I only knew about Logistic Regression, which can be used as a classifier when there are just two categories (binary classification). Since then, I’ve learned a bit about other classifiers, so I decided to practice them on the same problem.
The details of my attempt are in the notebook available at https://www.kaggle.com/soheilsolhjoo/titanic-comparing-different-classifiers.
I tried the following classifiers (a minimal comparison sketch follows the list):
- LR: Logistic Regression
- KN: K-Nearest Neighbors
- DT: Decision Tree
- RF: Random Forest
- AB: AdaBoost
- SV: Support Vector Machine
- MP: Multi-layer Perceptron
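As a rough illustration of what such a comparison looks like in scikit-learn, here is a minimal sketch. It assumes the Kaggle train.csv file is available locally, uses only a handful of features with mostly default hyperparameters, and skips feature scaling for brevity, so it is not the notebook's exact pipeline.

```python
# Minimal sketch of comparing several classifiers on the Titanic training data.
# Assumes "train.csv" (Kaggle's Titanic training file) is in the working directory;
# the feature choices and hyperparameters are illustrative, not the notebook's settings.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier, AdaBoostClassifier
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# Very basic preprocessing: encode Sex, fill missing Age, keep a few numeric features
df = pd.read_csv("train.csv")
df["Sex"] = df["Sex"].map({"male": 0, "female": 1})
df["Age"] = df["Age"].fillna(df["Age"].median())
X = df[["Pclass", "Sex", "Age", "SibSp", "Parch", "Fare"]]
y = df["Survived"]

classifiers = {
    "LR": LogisticRegression(max_iter=1000),
    "KN": KNeighborsClassifier(),
    "DT": DecisionTreeClassifier(random_state=0),
    "RF": RandomForestClassifier(random_state=0),
    "AB": AdaBoostClassifier(random_state=0),
    "SV": SVC(),
    "MP": MLPClassifier(max_iter=1000, random_state=0),
}

# 5-fold cross-validated accuracy on the training data for each classifier
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"{name}: {scores.mean():.3f} (+/- {scores.std():.3f})")
```

Note that a sketch like this reports cross-validated accuracy on the training data, whereas the scores quoted below are Kaggle public leaderboard results on the held-out test set.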
Among the tested methods, LR and SV predicted passenger survival with the highest scores, 0.76 and 0.77 respectively, at least with the features and settings I used.