nntoolbox.metrics.classification module

class nntoolbox.metrics.classification.Accuracy[source]

Bases: nntoolbox.metrics.metrics.Metric
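
A minimal sketch of the multiclass accuracy this metric reports, written with plain NumPy; the array names and shapes below are assumptions for illustration, not the nntoolbox Metric interface:

    import numpy as np

    def accuracy(outputs: np.ndarray, labels: np.ndarray) -> float:
        # outputs: (n_samples, n_classes) scores or logits; labels: (n_samples,) class indices.
        predictions = outputs.argmax(axis=-1)
        # Fraction of samples whose highest-scoring class matches the true label.
        return float((predictions == labels).mean())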

class nntoolbox.metrics.classification.BinaryAccuracy[source]

Bases: nntoolbox.metrics.metrics.Metric
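
For the binary case, a common formulation thresholds a predicted probability at 0.5; the sketch below assumes that convention (the threshold nntoolbox actually uses is not documented here):

    import numpy as np

    def binary_accuracy(outputs: np.ndarray, labels: np.ndarray, threshold: float = 0.5) -> float:
        # outputs: (n_samples,) predicted probabilities of the positive class;
        # labels: (n_samples,) ground-truth 0/1 labels.
        predictions = (outputs >= threshold).astype(labels.dtype)
        return float((predictions == labels).mean())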

class nntoolbox.metrics.classification.MAPAtK(k: int = 5)[source]

Bases: nntoolbox.metrics.metrics.Metric

map_at_k(best, labels) → float[source]
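
A minimal sketch of mean average precision at k for single-label classification, assuming best holds each sample's top-k predicted class indices ranked best-first and labels holds the true class indices; these argument semantics are an assumption, not taken from the nntoolbox source:

    def map_at_k(best, labels, k: int = 5) -> float:
        # best: sequence of per-sample top-k predicted class indices, best-first.
        # labels: sequence of per-sample true class indices.
        total = 0.0
        for preds, label in zip(best, labels):
            for rank, pred in enumerate(preds[:k], start=1):
                if pred == label:
                    # With a single relevant item, average precision is 1 / rank of the hit.
                    total += 1.0 / rank
                    break
        return total / len(labels)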

class nntoolbox.metrics.classification.Perplexity[source]

Bases: nntoolbox.metrics.metrics.Metric

Perplexity metric to evaluate a language model:

perplexity(language_model, sentence) = exp(-log language_model(sentence))
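
Following the formula above literally, a sketch that assumes the language model supplies per-token log-probabilities whose sum is log language_model(sentence); note that many implementations also normalize the exponent by sentence length:

    import numpy as np

    def perplexity(token_log_probs: np.ndarray) -> float:
        # token_log_probs: log-probabilities the model assigns to each token of the
        # sentence; their sum is log language_model(sentence).
        return float(np.exp(-np.sum(token_log_probs)))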

class nntoolbox.metrics.classification.ROCAUCScore[source]

Bases: nntoolbox.metrics.metrics.Metric
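
The underlying score can be illustrated with scikit-learn's roc_auc_score; this shows the quantity being measured, not necessarily how nntoolbox computes it:

    import numpy as np
    from sklearn.metrics import roc_auc_score

    labels = np.array([0, 0, 1, 1])            # ground-truth binary labels
    scores = np.array([0.1, 0.4, 0.35, 0.8])   # predicted probabilities for the positive class
    auc = roc_auc_score(labels, scores)        # area under the ROC curve, here 0.75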