Got here from this article.
First of all, thank you so much for your work; it gives great insight to time series analysis beginners like me.
However, in the definition of TransformerEncoder, I noticed that in the forward() function the residual connection is defined like this:
ff_layer = self.ff_normalize(x[0] + ff_layer)
According to the original paper (Attention Is All You Need), the residual connection should be defined this way:
ff_layer = self.ff_normalize(attn_layer + ff_layer)
Is there any chance you could provide more explanation of the intuition behind this discrepancy?
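For context, here is a minimal NumPy sketch of the post-LN wiring the paper describes, where each sublayer output is LayerNorm(input + Sublayer(input)). The helper names (`encoder_block`, the toy `attn`/`ff` stand-ins) are hypothetical, not from your code; the point is only that the second residual adds `attn_layer`, not the original `x`:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize over the feature dimension (last axis).
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def encoder_block(x, attn, ff):
    # Post-LN residual wiring from "Attention Is All You Need":
    # sublayer output = LayerNorm(input + Sublayer(input)).
    attn_layer = layer_norm(x + attn(x))                 # first residual: adds x
    ff_layer = layer_norm(attn_layer + ff(attn_layer))   # second residual: adds attn_layer, NOT x
    return ff_layer

# Toy stand-ins just to exercise the wiring (not real attention/FFN).
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8)) * 0.1
attn = lambda x: x @ W          # stand-in for self-attention
ff = lambda x: np.tanh(x @ W)   # stand-in for the feed-forward net

x = rng.standard_normal((4, 8))
out = encoder_block(x, attn, ff)
print(out.shape)
```

Using `x` in the second residual instead (as in the quoted line) skips the attention sublayer's contribution in that shortcut path, which is what makes the discrepancy interesting.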