
adding calibration methods from abstention package

Alejandro Moreo Fernandez 2023-01-17 13:53:48 +01:00
parent 8b0b9f522a
commit 6e910075ab
2 changed files with 5 additions and 0 deletions


@@ -34,7 +34,10 @@
- newer versions of numpy raise a warning when accessing deprecated type aliases (e.g., np.float). I have replaced all such
instances with the plain Python type (e.g., float).
- new dependency "abstention" (to be added to the project requirements and setup)
Things to fix:
- calibration with recalibration methods has to be fixed for exact_train_prev in EMQ (it conflicts with clone, deepcopy, etc.)
- clean up functions like binary, aggregative, probabilistic, etc.; these checks should be resolved via isinstance(), but
this is not working and I don't know how to make isinstance work. It looks like there is some problem with the module
path of the imported class wrt the path of the same class when it arrives from another module (a sketch of the issue
follows below)...
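
The isinstance() problem in the last item is the classic double-import pitfall; the following minimal sketch (toy file and class names, nothing taken from the repo) reproduces it: the same source file loaded under two module paths yields two distinct class objects, so an isinstance() check against the other copy fails even though the class names match.

import importlib.util
import pathlib
import sys


def load_as(path, module_name):
    # load the file at `path` and register it in sys.modules under `module_name`
    spec = importlib.util.spec_from_file_location(module_name, path)
    module = importlib.util.module_from_spec(spec)
    sys.modules[module_name] = module
    spec.loader.exec_module(module)
    return module


# write a tiny throwaway module so the example is fully reproducible
src = pathlib.Path('calib_demo.py')
src.write_text('class Recalibrated:\n    pass\n')

# the same file imported under two different module paths (think of a short path
# like classification.calibration vs the packaged quapy.classification.calibration)
short_path = load_as(str(src), 'calibration')
full_path = load_as(str(src), 'mypkg.calibration')

obj = short_path.Recalibrated()
print(isinstance(obj, full_path.Recalibrated))             # False: two distinct class objects
print(short_path.Recalibrated is full_path.Recalibrated)   # False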


@@ -10,6 +10,7 @@ from sklearn.model_selection import StratifiedKFold, cross_val_predict
from tqdm import tqdm
import quapy as qp
import quapy.functional as F
from quapy.classification.calibration import RecalibratedClassifier
from quapy.classification.svmperf import SVMperf
from quapy.data import LabelledCollection
from quapy.method.base import BaseQuantifier, BinaryQuantifier
@@ -137,6 +138,7 @@ class AggregativeProbabilisticQuantifier(AggregativeQuantifier):
        else:
            key_prefix = 'base_estimator__'
        parameters = {key_prefix + k: v for k, v in parameters.items()}
        self.learner.set_params(**parameters)
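
For context on the key_prefix trick above: scikit-learn's set_params() routes keys of the form <component>__<param> to nested estimators, so hyperparameters meant for a wrapped learner must be prefixed before being forwarded. A short standalone sketch of that convention (using a Pipeline purely for illustration; not code from this repo):

from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# hyperparameters meant for the inner classifier
parameters = {'C': 10, 'class_weight': 'balanced'}

pipe = Pipeline([('scaler', StandardScaler()), ('clf', LogisticRegression())])

# prefix each key so set_params() routes it to the nested 'clf' step,
# mirroring the key_prefix forwarding done in set_params() above
prefixed = {'clf__' + k: v for k, v in parameters.items()}
pipe.set_params(**prefixed)

print(pipe.get_params()['clf__C'])             # 10
print(pipe.get_params()['clf__class_weight'])  # 'balanced'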