Describe the bug
In my project, I am using TCN for sequence-to-sequence analysis of time series data with variable lengths. I have defined a subclass of the Sequence class that pads each batch of data to its maximum sequence length (similar to what is suggested here). As for the model, I use a masking layer to compute and pass a mask to the TCN (as suggested in issue #234). Supposedly, layers that support masking will automatically propagate the mask to the next layer. In the simplest form of my model, I have a masking layer, followed by a TCN, and a Dense layer with 1 unit.
Here are the two issues I've run into:
1. When I try to access the propagated mask from the output of the TCN layer, I get an error that says the object has no attribute _keras_mask.
2. Apparently, it matters whether a sequence is padded at the beginning or at the end (i.e., whether the padding argument of pad_sequences is set to 'pre' or 'post'). If it is padded at the beginning, the output of the TCN at the padded time steps is zero, but you cannot expect zero output there if the sequence is padded at the end.
Dependencies
I am using:
keras 2.4.3
keras-tcn 3.1.1
tensorflow-gpu 2.3.1
Paste a snippet
Please see the following simple code:
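(The original snippet was not preserved in this copy of the thread; the following is a minimal sketch reconstructed from the description above. The filter count, data shapes, and random data are my assumptions, not the reporter's exact code.)

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Input, Masking, Dense
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tcn import TCN

# Two variable-length sequences, padded to the batch maximum length.
# Switch padding to 'pre' to see the zero outputs described above.
seqs = [np.random.rand(5, 1), np.random.rand(8, 1)]
x = pad_sequences(seqs, padding='post', dtype='float32', value=0.0)

inp = Input(shape=(None, 1))
masked = Masking(mask_value=0.0)(inp)
tcn_out = TCN(nb_filters=16, return_sequences=True)(masked)
out = Dense(1)(tcn_out)
model = tf.keras.Model(inp, out)

# Issue 1: the TCN output carries no propagated mask.
try:
    print(tcn_out._keras_mask)
except AttributeError as e:
    print(e)  # -> object has no attribute '_keras_mask'

# Issue 2: compare the predictions for padding='pre' vs padding='post'.
print(model.predict(x))
```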
@fsbashiri thanks for reporting! I propose an explanation. I'm not 100% sure, so feel free to challenge me.
My hunch is that the TCN works a bit like an RNN, even though it has no states like an LSTM would have.
The last outputs depend on the end of the sequence, but also on the beginning, because the receptive field of the causal convolutions reaches back to earlier time steps.
If you use post, the end is padded with zeros. The last outputs will be non-zero because they also depend on the beginning of the sequence, which contains real (non-zero) values.
Conversely, the first outputs depend only on the beginning of the sequence. If you use pre, the beginning is padded with zeros and the first outputs will be zero.
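A quick way to convince yourself (my own sketch, not from the thread): a single causal Conv1D whose receptive field spans the whole padded sequence behaves the same way here as the TCN's stacked dilated convolutions.

```python
import numpy as np
import tensorflow as tf

# One causal convolution with an all-ones kernel: output at step t is the
# sum of the inputs at steps t-7..t, i.e. it only looks backwards in time.
conv = tf.keras.layers.Conv1D(1, kernel_size=8, padding='causal',
                              kernel_initializer='ones', use_bias=False)

data = np.ones((1, 4, 1), dtype='float32')   # four "real" time steps
pad = np.zeros((1, 3, 1), dtype='float32')   # three padded time steps

pre = np.concatenate([pad, data], axis=1)    # padding='pre'
post = np.concatenate([data, pad], axis=1)   # padding='post'

print(conv(pre).numpy().squeeze())   # [0. 0. 0. 1. 2. 3. 4.] -> first outputs are zero
print(conv(post).numpy().squeeze())  # [1. 2. 3. 4. 4. 4. 4.] -> last outputs stay non-zero
```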
For your second point about _keras_mask, I guess we should not try to access it directly. But it's strange that it does not exist at all. Does it exist for other Keras layers that support masking? Maybe we need to set it somewhere in the layer, because it's not inherited from the Layer object. I don't know.
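For comparison, here is a minimal sketch of what built-in masking-aware layers do (my example, assuming TF 2.x functional-API behavior): Keras attaches the propagated mask to the output tensor, so _keras_mask is accessible downstream.

```python
import tensorflow as tf
from tensorflow.keras.layers import Input, Masking, LSTM

inp = Input(shape=(None, 1))
masked = Masking(mask_value=0.0)(inp)
out = LSTM(4, return_sequences=True)(masked)

# Built-in layers that support masking set _keras_mask on their output,
# so this prints a boolean mask tensor instead of raising AttributeError.
print(out._keras_mask)
```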