Commit b177345 (1 parent: d169c43)

Fix docs due to improved maxTrials semantics

File tree (2 files changed: +3 −3 lines)

  • doc/modules/ROOT/pages/machine-learning


doc/modules/ROOT/pages/machine-learning/linkprediction-pipelines/training.adoc

Lines changed: 1 addition & 1 deletion
@@ -262,7 +262,7 @@ RETURN
 [opts="header", cols="6, 2, 2, 2, 6"]
 |===
 | winningModel | avgTrainScore | outerTrainScore | testScore | validationScores
-| {maxDepth=2147483647, minLeafSize=1, criterion=GINI, minSplitSize=2, numberOfDecisionTrees=10, methodName=RandomForest, numberOfSamplesRatio=1.0} | 0.779365079365079 | 0.788888888888889 | 0.766666666666667 | [0.3333333333333333, 0.6388888888888888, 0.3333333333333333, 0.3333333333333333, 0.3333333333333333]
+| {maxDepth=2147483647, minLeafSize=1, criterion=GINI, minSplitSize=2, numberOfDecisionTrees=10, methodName=RandomForest, numberOfSamplesRatio=1.0} | 0.779365079365079 | 0.788888888888889 | 0.766666666666667 | [0.3333333333333333, 0.6388888888888888, 0.3333333333333333, 0.3333333333333333]
 |===

 We can see the RandomForest model configuration with `numberOfDecisionTrees = 10` (and defaults filled for remaining parameters) was selected, and has a score of `0.77` on the test set.
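The change above drops one entry from `validationScores`, reflecting the improved `maxTrials` semantics: the tuner reports one validation score per candidate trial, so capping the number of trials caps the length of that list. The following is a hypothetical Python sketch of that relationship, not the GDS implementation; the candidate values and scoring function are made up for illustration.

```python
import random

def random_search(candidates, score_fn, max_trials):
    """Try up to max_trials candidate configs; return the best config and
    the list of validation scores (one score per trial).

    Hypothetical sketch, not GDS code: illustrates why validationScores
    has exactly as many entries as trials actually run.
    """
    random.seed(0)  # deterministic sampling for the example
    trials = random.sample(candidates, k=min(max_trials, len(candidates)))
    scores = [score_fn(c) for c in trials]
    best = trials[scores.index(max(scores))]
    return best, scores

# Toy usage: 6 candidate penalty values, but max_trials=4,
# so only 4 validation scores are produced.
candidates = [0.0, 0.0625, 0.125, 0.25, 0.5, 1.0]
best, scores = random_search(candidates, lambda p: 1.0 - p, max_trials=4)
assert len(scores) == 4
```

Under these assumptions, lowering `maxTrials` from 5 to 4 is exactly what shortens the reported score list from five entries to four, which is what both documentation tables in this commit were updated to show.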

doc/modules/ROOT/pages/machine-learning/node-property-prediction/nodeclassification-pipelines/training.adoc

Lines changed: 2 additions & 2 deletions
@@ -245,7 +245,7 @@ YIELD requiredMemory
 [opts="header"]
 |===
 | requiredMemory
-| +"[1264 KiB ... 1338 KiB]"+
+| +"[1186 KiB ... 1260 KiB]"+
 |===
 --

@@ -283,7 +283,7 @@ RETURN
 [opts="header", cols="8, 2, 2, 2, 8"]
 |===
 | winningModel | avgTrainScore | outerTrainScore | testScore | validationScores
-| {maxEpochs=100, minEpochs=1, penalty=0.0, patience=1, methodName=LogisticRegression, batchSize=100, tolerance=0.001, learningRate=0.001} | 0.999999989939394 | 0.9999999912121211 | 0.999999985 | [0.4909090835454547, 0.07272727163636365, 0.4909090835454547, 0.4909090835454547, 0.4909090835454547]
+| {maxEpochs=100, minEpochs=1, penalty=0.0, patience=1, methodName=LogisticRegression, batchSize=100, tolerance=0.001, learningRate=0.001} | 0.999999989939394 | 0.9999999912121211 | 0.999999985 | [0.4909090835454547, 0.07272727163636365, 0.4909090835454547, 0.4909090835454547]
 |===

 Here we can observe that the model candidate with penalty `0.0625` performed the best in the training phase, with an `F1_WEIGHTED` score nearing 1 over the train graph as well as on the test graph.
