nip.utils.data.VariableDataCycler#

class nip.utils.data.VariableDataCycler(dataloader: Iterable, device: device | str | int | None = None, non_blocking: bool = False, default_batch_size: int | None = None)[source]#

A loader that cycles through data, but allows the batch size to vary.

If a default batch size is provided, it is possible to iterate infinitely over the data as follows:

>>> data_cycler = VariableDataCycler(dataloader, default_batch_size=32)
>>> for batch in data_cycler:
...     pass  # Do something with the batch
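
Because the batch size can vary, differently sized batches can also be requested directly. A minimal sketch (get_batch() returns a TensorDict, as documented below):

>>> small_batch = data_cycler.get_batch(batch_size=8)
>>> large_batch = data_cycler.get_batch(batch_size=128)
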
Parameters:
  • dataloader (Iterable) – The base dataloader to use. This dataloader will be cycled through.

  • device (TorchDevice, optional) – The device to move the data to. If None, the data will not be moved.

  • non_blocking (bool, default=False) – Whether to move the data to the device with non_blocking=True.

  • default_batch_size (int, optional) – The default batch size to use when getting a batch and iterating over the instance. If None, the batch size must be manually specified when getting a batch, and it is not possible to iterate over the instance.
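
A rough sketch combining the options above (the data is a hypothetical stand-in, and the assumption that the wrapped iterable yields TensorDict batches belongs to this sketch, not the documentation):

>>> import torch
>>> from tensordict import TensorDict
>>> # Hypothetical stand-in data: a list of TensorDict batches.
>>> batches = [
...     TensorDict({"x": torch.randn(16, 4)}, batch_size=[16]) for _ in range(10)
... ]
>>> data_cycler = VariableDataCycler(
...     batches,
...     device="cuda" if torch.cuda.is_available() else "cpu",
...     non_blocking=True,
...     default_batch_size=32,
... )
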

Methods Summary

__init__(dataloader[, device, non_blocking, ...])

__iter__()

__repr__()
    Return repr(self).

get_batch([batch_size])
    Get a batch of data from the dataloader with the given batch size.

Methods

__init__(dataloader: Iterable, device: device | str | int | None = None, non_blocking: bool = False, default_batch_size: int | None = None)[source]#
__iter__()[source]#
__repr__() → str[source]#

Return repr(self).

get_batch(batch_size: int | None = None) → TensorDict[source]#

Get a batch of data from the dataloader with the given batch size.

If the dataloader is exhausted, it will be reset.

Parameters:

batch_size (int, optional) – The size of the batch to return. If None, the default batch size is used (provided one was set when constructing the instance).

Returns:

batch (TensorDict) – A batch of data with the given batch size.
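
For example, continuing from a cycler constructed with default_batch_size=32 (a usage sketch, not library output):

>>> batch = data_cycler.get_batch(batch_size=8)  # explicit batch size
>>> batch = data_cycler.get_batch()  # falls back to the default batch size (32)
>>> # The underlying dataloader is reset automatically once exhausted, so
>>> # repeated calls keep yielding batches.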