dsipts.models.TIDE module

class dsipts.models.TIDE.TIDE(out_channels, past_steps, future_steps, past_channels, future_channels, embs, hidden_size, d_model, n_add_enc, n_add_dec, dropout_rate, activation='', persistence_weight=0.0, loss_type='l1', quantiles=[], optim=None, optim_config=None, scheduler_config=None, **kwargs)[source]

Bases: Base

Long-term Forecasting with TiDE: Time-series Dense Encoder https://arxiv.org/abs/2304.08424

This NN uses ResidualBlocks as subnets; each one is composed of a skip connection plus a linear layer with activation and dropout. Every encoder and decoder head consists of one Residual Block, as do the temporal decoder and the feature projection for covariates.

Parameters:
  • out_channels (int) – number of variables to be predicted

  • past_steps (int) – Lookback window length

  • future_steps (int) – Horizon window length

  • past_channels (int) – number of past variables

  • future_channels (int) – number of future auxiliary variables

  • embs (List[int]) – cardinalities of the categorical variables (one entry per embedded feature)

  • hidden_size (int) – first embedding size of the model (‘r’ in the paper)

  • d_model (int) – second embedding size (r̃ in the paper); should be smaller than hidden_size

  • n_add_enc (int) – number of additional heads for the encoder part of the NN; one head is always used by default.

  • n_add_dec (int) – number of additional heads for the decoder part of the NN; one head is always used by default.

  • dropout_rate (float) – dropout rate used inside each Residual Block

  • activation (str, optional) – activation function to be used in the Residual Block. E.g., ‘nn.GELU’. Defaults to ‘’.

  • persistence_weight (float, optional) – Defaults to 0.0.

  • loss_type (str, optional) – Defaults to ‘l1’.

  • quantiles (List[float], optional) – Defaults to [].

  • optim (Union[str,None], optional) – Defaults to None.

  • optim_config (Union[dict,None], optional) – Defaults to None.

  • scheduler_config (Union[dict,None], optional) – Defaults to None.

handle_multivariate = True
handle_future_covariates = True
handle_categorical_variables = True
handle_quantile_loss = True
description = 'Can handle multivariate output \nCan handle future covariates\nCan handle categorical covariates\nCan handle Quantile loss function'
__init__(out_channels, past_steps, future_steps, past_channels, future_channels, embs, hidden_size, d_model, n_add_enc, n_add_dec, dropout_rate, activation='', persistence_weight=0.0, loss_type='l1', quantiles=[], optim=None, optim_config=None, scheduler_config=None, **kwargs)[source]

Long-term Forecasting with TiDE: Time-series Dense Encoder https://arxiv.org/abs/2304.08424

This NN uses ResidualBlocks as subnets; each one is composed of a skip connection plus a linear layer with activation and dropout. Every encoder and decoder head consists of one Residual Block, as do the temporal decoder and the feature projection for covariates.

Parameters:
  • out_channels (int) – number of variables to be predicted

  • past_steps (int) – Lookback window length

  • future_steps (int) – Horizon window length

  • past_channels (int) – number of past variables

  • future_channels (int) – number of future auxiliary variables

  • embs (List[int]) – cardinalities of the categorical variables (one entry per embedded feature)

  • hidden_size (int) – first embedding size of the model (‘r’ in the paper)

  • d_model (int) – second embedding size (r̃ in the paper); should be smaller than hidden_size

  • n_add_enc (int) – number of additional heads for the encoder part of the NN; one head is always used by default.

  • n_add_dec (int) – number of additional heads for the decoder part of the NN; one head is always used by default.

  • dropout_rate (float) – dropout rate used inside each Residual Block

  • activation (str, optional) – activation function to be used in the Residual Block. E.g., ‘nn.GELU’. Defaults to ‘’.

  • persistence_weight (float, optional) – Defaults to 0.0.

  • loss_type (str, optional) – Defaults to ‘l1’.

  • quantiles (List[float], optional) – Defaults to [].

  • optim (Union[str,None], optional) – Defaults to None.

  • optim_config (Union[dict,None], optional) – Defaults to None.

  • scheduler_config (Union[dict,None], optional) – Defaults to None.

forward(batch)[source]

Forward pass of the network. (The original docstring, which described the training of a diffusion network, appears to have been copied from another model.)

Parameters:

batch (dict) – dictionary of input variables produced by the data loader

Returns:

predictions over the forecast horizon

Return type:

torch.Tensor
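A minimal sketch of the batch layout that forward expects. The key names follow common dsipts conventions but are an assumption here, not taken from this page; shapes use the constructor arguments (past_steps, future_steps, past_channels, future_channels):

```python
import numpy as np

# Hypothetical batch layout for forward(batch). Key names are an
# assumption (dsipts-style), not confirmed by this reference page.
B, L, H = 8, 64, 16                      # batch size, past_steps, future_steps
past_channels, future_channels, n_cat = 5, 2, 3

batch = {
    "x_num_past":   np.zeros((B, L, past_channels)),     # past numeric variables
    "x_cat_past":   np.zeros((B, L, n_cat), dtype=int),  # past categorical variables
    "x_num_future": np.zeros((B, H, future_channels)),   # known future covariates
    "x_cat_future": np.zeros((B, H, n_cat), dtype=int),  # known future categoricals
}

for key, value in batch.items():
    print(key, value.shape)
```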

cat_categorical_vars(batch)[source]

Extract the categorical context for the past and future windows

Parameters:

batch (dict) – Keys checked -> [‘x_cat_past’, ‘x_cat_future’]

Returns:

cat_emb_past, cat_emb_fut

Return type:

Tuple[torch.Tensor, torch.Tensor]
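A numpy sketch of what this method does conceptually: each categorical variable gets its own embedding table (an nn.Embedding analogue), and the looked-up embeddings are concatenated along the feature axis for both windows. The embedding size and index values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
B, L, H = 4, 10, 5            # batch size, past steps, future steps
embs = [7, 12]                # cardinalities of two categorical variables
emb_dim = 3                   # hypothetical shared embedding size

# One embedding table per categorical variable.
tables = [rng.normal(size=(card, emb_dim)) for card in embs]

# Toy categorical indices over the past and future windows.
x_cat_past = np.stack([rng.integers(0, c, size=(B, L)) for c in embs], axis=-1)
x_cat_future = np.stack([rng.integers(0, c, size=(B, H)) for c in embs], axis=-1)

def embed_all(idx):
    # Look up each variable's table, then concatenate along the feature axis.
    return np.concatenate(
        [tables[i][idx[..., i]] for i in range(len(tables))], axis=-1
    )

cat_emb_past = embed_all(x_cat_past)    # (B, L, len(embs) * emb_dim)
cat_emb_fut = embed_all(x_cat_future)   # (B, H, len(embs) * emb_dim)
print(cat_emb_past.shape, cat_emb_fut.shape)
```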

remove_var(tensor, indexes_to_exclude, dimension)[source]

Remove variables from a tensor at the chosen positions along a chosen dimension

Parameters:
  • tensor (torch.Tensor) – starting tensor

  • indexes_to_exclude (list) – indices along the chosen dimension that we want to exclude

  • dimension (int) – dimension of the tensor on which we want to work (a single int, not a list of dims!)

Returns:

new tensor without the chosen variables

Return type:

torch.Tensor
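A minimal numpy sketch of the same operation; np.delete plays the role of dropping the given indices along one axis (the real method works on torch tensors):

```python
import numpy as np

def remove_var(tensor, indexes_to_exclude, dimension):
    # Drop the given indices along one dimension; np.delete keeps
    # every slice whose index is NOT in indexes_to_exclude.
    return np.delete(tensor, indexes_to_exclude, axis=dimension)

x = np.arange(24).reshape(2, 3, 4)       # (batch, steps, variables)
y = remove_var(x, [1, 3], dimension=2)   # drop variables 1 and 3
print(y.shape)                           # (2, 3, 2)
```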

class dsipts.models.TIDE.ResidualBlock(in_size, out_size, dropout_rate, activation_fun='')[source]

Bases: Module

Residual Block as the basic layer of the architecture.

MLP with one hidden layer, activation, and a skip connection. The working dimension is essentially d_model, but it is clearer to keep the input and output dimensions explicit.

in_size and out_size handle the different dimensions at different stages of the NN.

Parameters:
  • in_size (int)

  • out_size (int)

  • dropout_rate (float)

  • activation_fun (str, optional) – activation function to use in the Residual Block. Defaults to ‘’, in which case nn.ReLU is used.

__init__(in_size, out_size, dropout_rate, activation_fun='')[source]

Residual Block as the basic layer of the architecture.

MLP with one hidden layer, activation, and a skip connection. The working dimension is essentially d_model, but it is clearer to keep the input and output dimensions explicit.

in_size and out_size handle the different dimensions at different stages of the NN.

Parameters:
  • in_size (int)

  • out_size (int)

  • dropout_rate (float)

  • activation_fun (str, optional) – activation function to use in the Residual Block. Defaults to ‘’, in which case nn.ReLU is used.
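A minimal numpy sketch of the residual block computation, assuming the structure described above (one hidden layer, ReLU, and a linear skip projection so in_size can differ from out_size); dropout and the optional final layer norm (apply_final_norm) are omitted, since at inference time dropout is the identity:

```python
import numpy as np

rng = np.random.default_rng(0)

def residual_block(x, W1, b1, W2, b2, W_skip, b_skip):
    # MLP branch: hidden layer + ReLU + output projection
    # (dropout omitted: it is the identity at inference time).
    hidden = np.maximum(x @ W1 + b1, 0.0)
    out = hidden @ W2 + b2
    # Skip branch: a linear map so in_size can differ from out_size.
    return out + (x @ W_skip + b_skip)

in_size, hidden_size, out_size = 6, 16, 4
W1, b1 = rng.normal(size=(in_size, hidden_size)), np.zeros(hidden_size)
W2, b2 = rng.normal(size=(hidden_size, out_size)), np.zeros(out_size)
W_skip, b_skip = rng.normal(size=(in_size, out_size)), np.zeros(out_size)

x = rng.normal(size=(8, in_size))
y = residual_block(x, W1, b1, W2, b2, W_skip, b_skip)
print(y.shape)   # (8, 4)
```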

forward(x, apply_final_norm=True)[source]

Define the computation performed at every call.

Should be overridden by all subclasses.

Note

Although the recipe for forward pass needs to be defined within this function, one should call the Module instance afterwards instead of this since the former takes care of running the registered hooks while the latter silently ignores them.