imbalanced-learn 0.6.1
User Guide
1. Introduction
1.1. APIs of imbalanced-learn samplers
1.2. Problem statement regarding imbalanced data sets
2. Over-sampling
2.1. A practical guide
2.1.1. Naive random over-sampling
2.1.2. From random over-sampling to SMOTE and ADASYN
2.1.3. Ill-posed examples
2.1.4. SMOTE variants
2.2. Mathematical formulation
2.2.1. Sample generation
2.2.2. Multi-class management
3. Under-sampling
3.1. Prototype generation
3.2. Prototype selection
3.2.1. Controlled under-sampling techniques
3.2.1.1. Mathematical formulation
3.2.2. Cleaning under-sampling techniques
3.2.2.1. Tomek’s links
3.2.2.2. Edited data set using nearest neighbours
3.2.2.3. Condensed nearest neighbors and derived algorithms
3.2.2.4. Instance hardness threshold
4. Combination of over- and under-sampling
5. Ensemble of samplers
5.1. Classifier including inner balancing samplers
5.1.1. Bagging classifier
5.1.2. Forest of randomized trees
5.1.3. Boosting
6. Miscellaneous samplers
6.1. Custom samplers
6.2. Custom generators
6.2.1. TensorFlow generator
6.2.2. Keras generator
7. Metrics
7.1. Sensitivity and specificity metrics
7.2. Additional metrics specific to imbalanced datasets
8. Dataset loading utilities
8.1. Imbalanced datasets for benchmark
8.2. Imbalanced generator
9. Utilities for Developers
9.1. Validation Tools
9.2. Deprecation
9.3. Testing utilities
10. References
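
As a quick orientation before diving into the chapters above, the sketch below exercises the common sampler API described in section 1.1: every sampler is fitted and applied in a single call to fit_resample(X, y), which returns a resampled dataset. This is a minimal sketch, not part of the guide itself; it assumes scikit-learn is installed alongside imbalanced-learn, and it uses RandomOverSampler (section 2.1.1) as a stand-in for any of the samplers listed above.

from collections import Counter

from sklearn.datasets import make_classification
from imblearn.over_sampling import RandomOverSampler

# Build a toy 3-class imbalanced problem (class weights are assumptions
# chosen for illustration, not values taken from the guide).
X, y = make_classification(
    n_samples=1000, n_classes=3, n_informative=4,
    weights=[0.05, 0.15, 0.8], random_state=0,
)
print(Counter(y))  # imbalanced class counts

# Every sampler exposes fit_resample(X, y) -> (X_resampled, y_resampled).
ros = RandomOverSampler(random_state=0)
X_res, y_res = ros.fit_resample(X, y)
print(Counter(y_res))  # classes balanced by duplicating minority samples

The same two-line pattern (construct a sampler, call fit_resample) applies to the under-sampling, combined, and SMOTE-family samplers covered in chapters 2 through 4.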