
What is the meaning of token length in Normalization? #1

Closed

EricLina opened this issue May 29, 2024 · 2 comments

Comments

@EricLina

Hi there!
Excellent job on your work!
Could you please explain the concept of 'longer token' in this context?
[attached screenshot of the referenced passage]

@tian-qing001
Collaborator

Hi @EricLina, thanks for your interest in our work.
In this context, a token $x_i$ is a vector in $\mathbb{R}^{1\times d}$. We consider a token $x_i$ with a larger norm $\Vert x_i \Vert$ to be a longer token. For example, $2x_i$ is longer than $x_i$ whenever $x_i\neq 0$.
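
As an illustration of this definition (not code from the repository, just a minimal sketch assuming tokens are the rows of a `(seq_len, d)` tensor), the "length" of a token is its L2 norm, and scaling a token scales its length proportionally:

```python
import torch

# A sequence of n tokens, each a vector in R^{1 x d}.
n, d = 4, 8
x = torch.randn(n, d)

# The "length" of each token is its L2 norm ||x_i||.
lengths = x.norm(dim=-1)  # shape: (n,)

# Doubling a token doubles its length: 2*x_i is longer than x_i (for x_i != 0).
assert torch.allclose((2 * x).norm(dim=-1), 2 * lengths)

print(lengths)
```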

@EricLina
Author

Thanks for your reply.

@LeapLabTHU pinned this issue Jun 19, 2024