S3nnyK changed the title from "Fault about TransformerForSequenceClassification" to "Chap 3: Fault about TransformerForSequenceClassification" on Aug 9, 2024
Information
The question or comment is about chapter: 3
Question or comment
In TransformerForSequenceClassification, the line x = self.encoder(x)[:, 0, :] selects the hidden state of the first token, which only makes sense if the [CLS] token is included in the inputs. However, at the beginning of this chapter the inputs are defined as tokenizer(text, return_tensors="pt", add_special_tokens=False), i.e. without special tokens. Hence the token at position 0 is "time", not [CLS].
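A minimal sketch of the mismatch, assuming the chapter's setup (the bert-base-uncased checkpoint and the example sentence "time flies like an arrow"):

```python
from transformers import AutoTokenizer

# Assumed to match the chapter's setup; adjust if your checkpoint or text differs.
model_ckpt = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_ckpt)
text = "time flies like an arrow"

# As defined at the start of the chapter: no special tokens are added,
# so position 0 holds the first word of the sentence, not [CLS].
inputs = tokenizer(text, return_tensors="pt", add_special_tokens=False)
print(tokenizer.convert_ids_to_tokens(inputs.input_ids[0])[0])            # -> "time"

# With the tokenizer's default add_special_tokens=True, position 0 is [CLS],
# which is what x = self.encoder(x)[:, 0, :] implicitly relies on.
inputs_with_cls = tokenizer(text, return_tensors="pt")
print(tokenizer.convert_ids_to_tokens(inputs_with_cls.input_ids[0])[0])   # -> "[CLS]"
```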