DecisionTree Module

A Decision Tree model.

The documentation is here: http://scikit-learn.org/dev/modules/generated/sklearn.tree.DecisionTreeClassifier.html#sklearn.tree.DecisionTreeClassifier
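
For orientation, here is a minimal sketch (not the module's own code) of fitting a scikit-learn DecisionTreeClassifier; the feature matrix X and labels y below are random placeholders standing in for the preprocessed Titanic training data, which is prepared elsewhere in the project.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Placeholders standing in for the preprocessed 'train.csv' data:
rng = np.random.RandomState(0)
X = rng.rand(891, 7)             # numeric features for the 891 passengers
y = rng.randint(0, 2, size=891)  # the 'Survived' column (0 or 1)

# Fit a decision tree with a bounded depth and check the training accuracy.
clf = DecisionTreeClassifier(max_depth=17)
clf.fit(X, y)
print("Training accuracy: {:.2%}".format(clf.score(X, y)))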


Script output

$ python DecisionTree.py
Opening the file 'train.csv' and 'test.csv'...
Find the best value for the meta parameter max_depth, with 10 run for each...
Searching in the range : xrange(1, 30)...
Using the first part (67.00%, 596 passengers) of the training dataset as training, 
and the second part (33.00%, 295 passengers) as testing !
For max_depth=1, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 77.35%...
For max_depth=2, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 77.35%...
For max_depth=3, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 81.04%...
For max_depth=4, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 81.71%...
For max_depth=5, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 82.89%...
For max_depth=6, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 84.23%...
For max_depth=7, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 84.90%...
For max_depth=8, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 84.40%...
For max_depth=9, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 85.57%...
For max_depth=10, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 85.91%...
For max_depth=11, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 86.24%...
For max_depth=12, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 86.07%...
For max_depth=13, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 87.08%...
For max_depth=14, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 87.42%...
For max_depth=15, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 87.92%...
For max_depth=16, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 87.92%...
For max_depth=17, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 88.26%...
For max_depth=18, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 88.26%...
For max_depth=19, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 88.26%...
For max_depth=20, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 88.26%...
For max_depth=21, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 88.26%...
For max_depth=22, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 88.26%...
For max_depth=23, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 88.26%...
For max_depth=24, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 88.26%...
For max_depth=25, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 88.26%...
For max_depth=26, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 88.26%...
For max_depth=27, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 88.26%...
For max_depth=28, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 88.26%...
For max_depth=29, learning from the first part of the dataset...
... this value of max_depth seems to have a (mean) quality = 88.26%...
With trying each of the following max_depth (xrange(1, 30)), each 10 times, the best one is 17. (for a quality = 88.26%)
Find the best value for the meta parameter min_samples_split, with 10 run for each...
Searching in the range : xrange(1, 10)...
Using the first part (67.00%, 596 passengers) of the training dataset as training, 
and the second part (33.00%, 295 passengers) as testing !
For min_samples_split=1, learning from the first part of the dataset...
... this value of min_samples_split seems to have a (mean) quality = 88.26%...
For min_samples_split=2, learning from the first part of the dataset...
... this value of min_samples_split seems to have a (mean) quality = 88.26%...
For min_samples_split=3, learning from the first part of the dataset...
... this value of min_samples_split seems to have a (mean) quality = 87.25%...
For min_samples_split=4, learning from the first part of the dataset...
... this value of min_samples_split seems to have a (mean) quality = 87.08%...
For min_samples_split=5, learning from the first part of the dataset...
... this value of min_samples_split seems to have a (mean) quality = 86.74%...
For min_samples_split=6, learning from the first part of the dataset...
... this value of min_samples_split seems to have a (mean) quality = 86.58%...
For min_samples_split=7, learning from the first part of the dataset...
... this value of min_samples_split seems to have a (mean) quality = 86.07%...
For min_samples_split=8, learning from the first part of the dataset...
... this value of min_samples_split seems to have a (mean) quality = 85.91%...
For min_samples_split=9, learning from the first part of the dataset...
... this value of min_samples_split seems to have a (mean) quality = 85.23%...
With trying each of the following min_samples_split (xrange(1, 10)), each 10 times, the best one is 1. (for a quality = 88.26%)
Find the best value for the meta parameter min_samples_leaf, with 10 run for each...
Searching in the range : xrange(1, 10)...
Using the first part (67.00%, 596 passengers) of the training dataset as training, 
and the second part (33.00%, 295 passengers) as testing !
For min_samples_leaf=1, learning from the first part of the dataset...
... this value of min_samples_leaf seems to have a (mean) quality = 88.26%...
For min_samples_leaf=2, learning from the first part of the dataset...
... this value of min_samples_leaf seems to have a (mean) quality = 83.89%...
For min_samples_leaf=3, learning from the first part of the dataset...
... this value of min_samples_leaf seems to have a (mean) quality = 83.89%...
For min_samples_leaf=4, learning from the first part of the dataset...
... this value of min_samples_leaf seems to have a (mean) quality = 84.23%...
For min_samples_leaf=5, learning from the first part of the dataset...
... this value of min_samples_leaf seems to have a (mean) quality = 82.21%...
For min_samples_leaf=6, learning from the first part of the dataset...
... this value of min_samples_leaf seems to have a (mean) quality = 82.38%...
For min_samples_leaf=7, learning from the first part of the dataset...
... this value of min_samples_leaf seems to have a (mean) quality = 80.70%...
For min_samples_leaf=8, learning from the first part of the dataset...
... this value of min_samples_leaf seems to have a (mean) quality = 81.88%...
For min_samples_leaf=9, learning from the first part of the dataset...
... this value of min_samples_leaf seems to have a (mean) quality = 81.38%...
With trying each of the following min_samples_leaf (xrange(1, 10)), each 10 times, the best one is 1. (for a quality = 88.26%)
Creating the classifier, with optimal parameters.
Learning...
 Proportion of perfect fitting for the training dataset = 97.87%
Predicting for the testing dataset
Prediction: wrote in the file csv/DecisionTree_best.csv.
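
The loop behind this log can be sketched as follows (the actual DecisionTree.py may differ in its details; in particular, the log suggests it reuses the same 67%/33% split for the 10 runs, whereas this sketch redraws the split each time): for each candidate value of a meta parameter, train the classifier 10 times, average the accuracy on the held-out part, and keep the value with the best mean quality.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

def best_meta_parameter(X, y, name, candidates, number_try=10, proportion_train=0.67):
    # Return the candidate value of the parameter `name` (e.g. "max_depth")
    # with the best mean accuracy on the held-out part of the training data.
    best_value, best_quality = None, -np.inf
    for value in candidates:
        qualities = []
        for _ in range(number_try):
            X_train, X_test, y_train, y_test = train_test_split(
                X, y, train_size=proportion_train)
            clf = DecisionTreeClassifier(**{name: value})
            clf.fit(X_train, y_train)
            qualities.append(clf.score(X_test, y_test))
        quality = np.mean(qualities)
        print("For {}={}, mean quality = {:.2%}".format(name, value, quality))
        if quality > best_quality:
            best_value, best_quality = value, quality
    return best_value

# For instance: best_max_depth = best_meta_parameter(X, y, "max_depth", range(1, 30))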

Results

Submitting this result to Kaggle gives a score of 76.07%.


DecisionTree.list_max_depth = xrange(1, 30)

Search space for the max_depth parameter.

DecisionTree.best_max_depth = 18

The optimal value found for the max_depth parameter.

DecisionTree.best_min_samples_split = 1

The optimal value found for the min_samples_split parameter.

DecisionTree.Number_try = 10

Number of runs used for meta-learning.

DecisionTree.proportion_train = 0.67

Proportion of passengers used for meta-learning.

DecisionTree.best_min_samples_leaf = 1

The optimal value found for the min_samples_leaf parameter.

DecisionTree.score = 98.092031425364752

The score for this classifier.
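
Putting the optimal values above together, the final step of the script (building the classifier with those parameters, fitting it on the whole training set and writing the predictions to csv/DecisionTree_best.csv) can be sketched as follows; the write_predictions helper and the CSV column names are illustrative assumptions, not the module's actual API.

import csv
from sklearn.tree import DecisionTreeClassifier

best_max_depth = 18
best_min_samples_split = 1
best_min_samples_leaf = 1

def write_predictions(X_train, y_train, X_test, passenger_ids,
                      filename="csv/DecisionTree_best.csv"):
    # Build the classifier with the optimal meta parameters found above.
    # Recent scikit-learn versions require min_samples_split >= 2, hence max(2, ...).
    clf = DecisionTreeClassifier(max_depth=best_max_depth,
                                 min_samples_split=max(2, best_min_samples_split),
                                 min_samples_leaf=best_min_samples_leaf)
    clf.fit(X_train, y_train)
    print("Proportion of perfect fitting for the training dataset = {:.2%}"
          .format(clf.score(X_train, y_train)))
    # Write one (PassengerId, Survived) row per test passenger, Kaggle-style.
    predictions = clf.predict(X_test)
    with open(filename, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["PassengerId", "Survived"])
        writer.writerows(zip(passenger_ids, predictions))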
