pyanomaly.loss.functions package¶
Submodules¶
pyanomaly.loss.functions.basic_loss module¶
@author: Yuhao Cheng @contact: yuhao.cheng[at]outlook.com
class pyanomaly.loss.functions.basic_loss.AMCDiscriminateLoss(loss_cfg)¶
Bases: torch.nn.modules.module.Module

forward(outputs, labels)¶
Defines the computation performed at every call.
Should be overridden by all subclasses.

Note
Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
class pyanomaly.loss.functions.basic_loss.AMCGenerateLoss(loss_cfg)¶
Bases: torch.nn.modules.module.Module

forward(fake_outputs, fake)¶
Defines the computation performed at every call.
Should be overridden by all subclasses.

Note
Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
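The two AMC losses sit on opposite sides of an adversarial pair: AMCGenerateLoss scores the discriminator's outputs on generated frames for the generator update, and AMCDiscriminateLoss scores real and fake outputs for the discriminator update. A minimal usage sketch follows; the discriminator, the frame tensors, the ones/zeros label convention, and the contents of loss_cfg are assumptions for illustration, not taken from the source.

    import torch
    from pyanomaly.loss.functions.basic_loss import (
        AMCDiscriminateLoss, AMCGenerateLoss)

    g_loss_fn = AMCGenerateLoss(loss_cfg)     # loss_cfg: config node (assumed)
    d_loss_fn = AMCDiscriminateLoss(loss_cfg)

    d_real = discriminator(real_frames)       # hypothetical discriminator outputs
    d_fake = discriminator(fake_frames)

    # Generator step: push D(fake) toward the "real" label.
    g_loss = g_loss_fn(d_fake, torch.ones_like(d_fake))
    # Discriminator step: score real as real and detached fakes as fake.
    d_loss = (d_loss_fn(d_real, torch.ones_like(d_real))
              + d_loss_fn(d_fake.detach(), torch.zeros_like(d_fake)))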
class pyanomaly.loss.functions.basic_loss.Adversarial_Loss(loss_cfg)¶
Bases: torch.nn.modules.module.Module

forward(fake_outputs)¶
Defines the computation performed at every call.
Should be overridden by all subclasses.

Note
Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
class pyanomaly.loss.functions.basic_loss.CrossEntropyLoss(loss_cfg)¶
Bases: torch.nn.modules.loss.CrossEntropyLoss
loss_cfg = [['weight', None], ['size_average', None], ['ignore_index', -100], ['reduce', None], ['reduction', 'mean']]
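The class attribute loss_cfg lists the wrapped torch.nn.CrossEntropyLoss constructor arguments as [name, default] pairs. The dict conversion below is a sketch of how such a pair list maps onto keyword arguments, not the wrapper's actual code; MSELoss further down exposes the same kind of pair list for torch.nn.MSELoss.

    import torch.nn as nn

    loss_cfg = [['weight', None], ['size_average', None],
                ['ignore_index', -100], ['reduce', None],
                ['reduction', 'mean']]
    criterion = nn.CrossEntropyLoss(**dict(loss_cfg))  # [name, value] pairs -> kwargs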
class pyanomaly.loss.functions.basic_loss.Discriminate_Loss(loss_cfg)¶
Bases: torch.nn.modules.module.Module

forward(real_outputs, fake_outputs)¶
Defines the computation performed at every call.
Should be overridden by all subclasses.

Note
Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
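The signatures mark Adversarial_Loss as the generator-side term (it sees only the discriminator's outputs on fakes) and Discriminate_Loss as the discriminator-side term (it sees outputs on both real and fake frames). A hedged usage sketch, with the discriminator and frame tensors as hypothetical stand-ins:

    from pyanomaly.loss.functions.basic_loss import (
        Adversarial_Loss, Discriminate_Loss)

    adv_loss_fn = Adversarial_Loss(loss_cfg)   # loss_cfg: config node (assumed)
    dis_loss_fn = Discriminate_Loss(loss_cfg)

    g_adv = adv_loss_fn(discriminator(pred_frames))           # generator update
    d_adv = dis_loss_fn(discriminator(gt_frames),
                        discriminator(pred_frames.detach()))  # discriminator update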
class pyanomaly.loss.functions.basic_loss.GANLoss(loss_cfg)¶
Bases: torch.nn.modules.module.Module
Define different GAN objectives.
The GANLoss class abstracts away the need to create the target label tensor that has the same size as the input.
https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix

get_target_tensor(prediction, target_is_real)¶
Create label tensors with the same size as the input.
Parameters:
prediction (tensor) – typically the prediction from a discriminator
target_is_real (bool) – whether the ground-truth label is for real images or for fake images
Returns:
A label tensor filled with the ground-truth label, with the same size as the input
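A short usage sketch of get_target_tensor, following the pix2pix design the class credits. The discriminator and the contents of loss_cfg are hypothetical; the shape guarantee comes from the documented behavior above.

    gan_loss = GANLoss(loss_cfg)
    prediction = discriminator(fake_frames)        # hypothetical discriminator output
    real_target = gan_loss.get_target_tensor(prediction, target_is_real=True)
    fake_target = gan_loss.get_target_tensor(prediction, target_is_real=False)
    assert real_target.shape == prediction.shape   # label tensor matches the input size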
class pyanomaly.loss.functions.basic_loss.GradientLoss(loss_cfg)¶
Bases: torch.nn.modules.module.Module

forward(gen_frames, gt_frames)¶
Defines the computation performed at every call.
Should be overridden by all subclasses.

Note
Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
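In frame-prediction anomaly detection, a loss with this name and signature usually denotes the gradient-difference loss of Mathieu et al. The sketch below shows that common formulation; it is an assumption about this class, not its source code.

    import torch

    def gradient_loss_sketch(gen_frames, gt_frames, alpha=1):
        # Finite differences along width (dx) and height (dy).
        gen_dx = torch.abs(gen_frames[..., :, 1:] - gen_frames[..., :, :-1])
        gen_dy = torch.abs(gen_frames[..., 1:, :] - gen_frames[..., :-1, :])
        gt_dx = torch.abs(gt_frames[..., :, 1:] - gt_frames[..., :, :-1])
        gt_dy = torch.abs(gt_frames[..., 1:, :] - gt_frames[..., :-1, :])
        # Penalize differences between the gradient magnitudes.
        return (torch.abs(gen_dx - gt_dx) ** alpha).mean() + \
               (torch.abs(gen_dy - gt_dy) ** alpha).mean()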
class pyanomaly.loss.functions.basic_loss.IntensityLoss(loss_cfg)¶
Bases: torch.nn.modules.module.Module

forward(gen_frames, gt_frames)¶
Defines the computation performed at every call.
Should be overridden by all subclasses.

Note
Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
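An intensity loss in this literature is typically a pixel-wise distance between the predicted and ground-truth frames. A sketch under that assumption (the l_num exponent is also assumed):

    import torch

    def intensity_loss_sketch(gen_frames, gt_frames, l_num=2):
        # Mean element-wise l_num distance between prediction and ground truth.
        return torch.mean(torch.abs(gen_frames - gt_frames) ** l_num)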
class pyanomaly.loss.functions.basic_loss.L2Loss(eps=1e-08)¶
Bases: torch.nn.modules.module.Module

forward(gen, gt)¶
Defines the computation performed at every call.
Should be overridden by all subclasses.

Note
Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
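The eps constructor argument suggests a numerically stabilized root-mean-square error. A plausible reading, offered as an assumption rather than the actual implementation:

    import torch

    def l2_loss_sketch(gen, gt, eps=1e-08):
        # eps keeps the square root well-behaved (finite gradient) when gen == gt.
        return torch.sqrt(torch.mean((gen - gt) ** 2) + eps)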
class pyanomaly.loss.functions.basic_loss.MSELoss(loss_cfg)¶
Bases: torch.nn.modules.loss.MSELoss
loss_cfg = [['size_average', None], ['reduce', None], ['reduction', 'mean']]
class pyanomaly.loss.functions.basic_loss.MemLoss¶
Bases: torch.nn.modules.module.Module

forward(att_weights)¶
Defines the computation performed at every call.
Should be overridden by all subclasses.

Note
Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
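A forward that consumes attention weights points to a memory-module sparsity penalty in the style of MemAE (Gong et al., ICCV 2019). The entropy-style sketch below is an assumption about this class, not its source code:

    import torch

    def mem_loss_sketch(att_weights, eps=1e-12):
        # Entropy-style sparsity on memory attention weights: minimizing it
        # encourages each query to address only a few memory slots.
        return torch.mean(-att_weights * torch.log(att_weights + eps))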
class pyanomaly.loss.functions.basic_loss.WeightedPredLoss¶
Bases: torch.nn.modules.module.Module

forward(x, target)¶
Defines the computation performed at every call.
Should be overridden by all subclasses.

Note
Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.
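The name suggests a per-time-step weighting of the prediction error, for example weighting later predicted frames more heavily. Everything in the sketch below (the 5-D tensor layout and the linear weighting scheme) is an assumption for illustration:

    import torch

    def weighted_pred_loss_sketch(x, target):
        # x, target: (N, C, T, H, W); later steps get linearly larger weights.
        t = x.shape[2]
        weights = torch.arange(1, t + 1, dtype=x.dtype, device=x.device) / t
        per_step = ((x - target) ** 2).mean(dim=(0, 1, 3, 4))  # error per time step
        return (weights * per_step).sum() / weights.sum()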
Module contents¶
@author: Yuhao Cheng @contact: yuhao.cheng[at]outlook.com