pyanomaly.datatools package¶
Subpackages¶
- pyanomaly.datatools.abstract package
- Submodules
- pyanomaly.datatools.abstract.abstract_datasets_builder module
- pyanomaly.datatools.abstract.abstract_datasets_factory module
- pyanomaly.datatools.abstract.abstract_evaluate_method module
- pyanomaly.datatools.abstract.image_dataset module
- pyanomaly.datatools.abstract.readers module
- pyanomaly.datatools.abstract.video_dataset module
- Module contents
- pyanomaly.datatools.dataclass package
- pyanomaly.datatools.evaluate package
Submodules¶
pyanomaly.datatools.datatools_api module¶
@author: Yuhao Cheng @contact: yuhao.cheng[at]outlook.com
-
class pyanomaly.datatools.datatools_api.DataAPI(cfg, is_training)¶
Bases: pyanomaly.datatools.abstract.abstract_datasets_builder.AbstractBuilder
The API class used to get the torch.utils.data.DataLoader dictionary.
-
NAME = 'DatasetAPI'¶
-
build()¶
The building method.
Args:
    None
Returns:
    dataloader_dict (OrderedDict): The dictionary containing the dataloaders used in the training or val/test process. For example:
    {
        'train': {
            'general_data_dict': {
                'all': torch.utils.data.DataLoader
            },
            'w_dataset_dict': {
                '01': torch.utils.data.DataLoader,
                ... ...
            },
            ... ...
        },
        'val': {
            'general_dataset_dict': {
                '01': torch.utils.data.DataLoader,
                '02': torch.utils.data.DataLoader,
                '03': torch.utils.data.DataLoader,
                ... ...
            },
            ... ...
        }
    }
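The nested structure returned by build() can be mocked as follows. This is an illustrative sketch, not library code: the string placeholders stand in for real torch.utils.data.DataLoader objects, and the consumption loop at the end is an assumed usage pattern.

```python
from collections import OrderedDict

# Illustrative shape of the OrderedDict that DataAPI.build() returns
# (strings stand in for real torch.utils.data.DataLoader objects).
dataloader_dict = OrderedDict([
    ('train', OrderedDict([
        ('general_data_dict', OrderedDict([('all', '<DataLoader>')])),
        ('w_dataset_dict', OrderedDict([('01', '<DataLoader>')])),
    ])),
    ('val', OrderedDict([
        ('general_dataset_dict', OrderedDict([
            ('01', '<DataLoader>'),
            ('02', '<DataLoader>'),
        ])),
    ])),
])

# Assumed consumption pattern: iterate the per-video loaders of the val split.
for video_id, loader in dataloader_dict['val']['general_dataset_dict'].items():
    print(video_id)
```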
-
-
class pyanomaly.datatools.datatools_api.EvaluateAPI(cfg, is_training)¶
Bases: object
The class to get the evaluation object.
pyanomaly.datatools.datatools_registry module¶
@author: Yuhao Cheng @contact: yuhao.cheng[at]outlook.com
pyanomaly.datatools.tools module¶
@author: Yuhao Cheng @contact: yuhao.cheng[at]outlook.com
-
class pyanomaly.datatools.tools.RecordResult(fpr=None, tpr=None, thresholds=None, auc=-inf, dataset=None, loss_file=None, sigma=0)¶
Bases: object
-
get_threshold()¶
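Since RecordResult stores ROC-curve arrays (fpr, tpr, thresholds), one plausible reading of get_threshold() is picking the operating point from the curve. The sketch below is a hypothetical re-implementation that maximizes Youden's J statistic (tpr - fpr); the actual library method may use a different criterion.

```python
import math

class RecordResultSketch:
    """Hypothetical stand-in for RecordResult. get_threshold() here picks the
    ROC threshold maximizing Youden's J statistic (tpr - fpr); the real
    implementation may differ."""

    def __init__(self, fpr=None, tpr=None, thresholds=None, auc=-math.inf,
                 dataset=None, loss_file=None, sigma=0):
        self.fpr = fpr
        self.tpr = tpr
        self.thresholds = thresholds
        self.auc = auc
        self.dataset = dataset
        self.loss_file = loss_file
        self.sigma = sigma

    def get_threshold(self):
        # J = tpr - fpr at each candidate threshold; take the argmax.
        j = [t - f for t, f in zip(self.tpr, self.fpr)]
        return self.thresholds[j.index(max(j))]

record = RecordResultSketch(fpr=[0.0, 0.1, 0.5, 1.0],
                            tpr=[0.0, 0.6, 0.8, 1.0],
                            thresholds=[0.9, 0.7, 0.4, 0.1])
best = record.get_threshold()  # 0.7: J peaks at (fpr=0.1, tpr=0.6)
```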
-
-
pyanomaly.datatools.tools.collect_fn(batch)¶
image_b, image_a, image, image_f, label, detection_result = batch
-
pyanomaly.datatools.tools.collect_fn_local(batch)¶
image_b, image_a, image, image_f, crop_objects = batch
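Both functions are collate callbacks that unpack a batch of per-sample tuples into parallel per-field sequences. A minimal sketch of that unpacking step, assuming each sample is the 6-tuple named in the collect_fn docstring (the real functions presumably also stack the tensor fields):

```python
def collect_fn_sketch(batch):
    """Hypothetical core of collect_fn: transpose a list of per-sample
    6-tuples into six parallel field sequences. Variable-length detection
    results are kept as a plain list instead of being stacked."""
    image_b, image_a, image, image_f, label, detection_result = zip(*batch)
    return image_b, image_a, image, image_f, label, list(detection_result)

# Toy batch of two samples; integers stand in for image tensors.
batch = [(1, 2, 3, 4, 0, 'det_a'), (5, 6, 7, 8, 1, 'det_b')]
image_b, image_a, image, image_f, label, dets = collect_fn_sketch(batch)
```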
-
pyanomaly.datatools.tools.decide_back_front(dataset_path, verbose='testing_vad', duration=3)¶
Decide the back and front frames and save them in a JSON file.
Args:
    dataset_path: e.g. './data/shanghaitech/training/frames'
    duration: step, e.g. current-3, current+3
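The docstring suggests that for each frame the function records the frames duration steps behind and ahead (current-3, current+3 by default) and serializes the result to JSON. A minimal sketch of that index computation, under the assumption that indices are clamped to the valid frame range; the function name and output layout here are hypothetical:

```python
import json

def decide_back_front_sketch(num_frames, duration=3):
    """Hypothetical helper: for each frame index i, record the 'back' frame
    (i - duration) and 'front' frame (i + duration), clamped to the valid
    range, and serialize the table to JSON."""
    table = {}
    for i in range(num_frames):
        table[str(i)] = {
            'back': max(i - duration, 0),
            'front': min(i + duration, num_frames - 1),
        }
    return json.dumps(table)

table = json.loads(decide_back_front_sketch(10, duration=3))
```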
-
pyanomaly.datatools.tools.make_global_db(data_path, split='testing')¶
Make the database based on the detections.
Args:
    data_path: e.g. 'data/shanghaitech/normal'
-
pyanomaly.datatools.tools.make_objects_box_db(data_path, split='training', det_threshold=0.95, time_file='./training_3.json', verbose='none')¶
Make the database based on the detections.
Args:
    data_path: e.g. 'data/shanghaitech/normal'
    det_threshold: e.g. 0.5
-
pyanomaly.datatools.tools.make_objects_db(data_path, split='training', det_threshold=0.95, time_file='./training_3.json', verbose='none')¶
Make the database based on the detections.
Args:
    data_path: e.g. 'data/shanghaitech/normal'
    det_threshold: e.g. 0.5
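All three database builders take a det_threshold and keep only sufficiently confident detections. The filtering step they share can be sketched as below; the function name and the detection-record layout (a dict with a 'score' key) are assumptions for illustration, not the library's actual format:

```python
def filter_detections_sketch(detections, det_threshold=0.95):
    """Hypothetical core of the object-db builders: keep only detections
    whose confidence score reaches det_threshold."""
    return [d for d in detections if d['score'] >= det_threshold]

detections = [
    {'box': (0, 0, 10, 10), 'score': 0.99},
    {'box': (5, 5, 20, 20), 'score': 0.40},
    {'box': (3, 3, 8, 8), 'score': 0.95},
]
kept = filter_detections_sketch(detections, det_threshold=0.95)
```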