nip.utils.torch.Conv2dSimulateBatchDims#
- class nip.utils.torch.Conv2dSimulateBatchDims(in_channels: int, out_channels: int, kernel_size: int | Tuple[int, int], stride: int | Tuple[int, int] = 1, padding: str | int | Tuple[int, int] = 0, dilation: int | Tuple[int, int] = 1, groups: int = 1, bias: bool = True, padding_mode: str = 'zeros', device=None, dtype=None)[source]#
2D convolutional layer with arbitrary batch dimensions.
See torch.nn.Conv2d for documentation. Assumes an input of shape (… channels height width).
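A minimal usage sketch, assuming the layer accepts the same constructor arguments as torch.nn.Conv2d and any number of leading batch dimensions before (channels, height, width); the exact shapes here are illustrative:

```python
import torch
from nip.utils.torch import Conv2dSimulateBatchDims

# Constructed like an ordinary Conv2d.
conv = Conv2dSimulateBatchDims(in_channels=3, out_channels=8, kernel_size=3, padding=1)

# Input with two leading batch dimensions before (channels, height, width).
x = torch.randn(4, 5, 3, 32, 32)
y = conv(x)
print(y.shape)  # expected: torch.Size([4, 5, 8, 32, 32])
```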
Methods Summary
forward(x)
Apply the module to the input tensor, simulating multiple batch dimensions.
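A sketch of the flatten/unflatten approach such a forward pass typically uses (the class's actual implementation may differ): the leading batch dimensions are merged into one, the ordinary 4D convolution is applied, and the batch dimensions are restored on the output.

```python
import torch
import torch.nn as nn

def conv2d_over_batch_dims(conv: nn.Conv2d, x: torch.Tensor) -> torch.Tensor:
    """Apply a Conv2d to a tensor with arbitrary leading batch dimensions."""
    batch_shape = x.shape[:-3]                          # everything before (C, H, W)
    flat = x.reshape(-1, *x.shape[-3:])                 # merge batch dims into one
    out = conv(flat)                                    # ordinary 4D convolution
    return out.reshape(*batch_shape, *out.shape[-3:])   # restore batch dims
```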
Attributes
T_destination
call_super_init
dump_patches
feature_dims
bias
in_channels
out_channels
kernel_size
stride
padding
dilation
transposed
output_padding
groups
padding_mode
weight
training
Methods