
localStorage expiresIn out of sync with JWT exp #1259

Closed
1 of 5 tasks
unindented opened this issue Jan 30, 2020 · 2 comments · Fixed by #1395
Labels
bug A problem that needs to be fixed for the feature to function as intended. known-issue Issue is already known and is either being investigated or is already fixed.

Comments

@unindented

Library

  • msal@1.2.1
  • @azure/msal-browser@2.x.x
  • @azure/msal-angular@0.x.x
  • @azure/msal-angular@1.x.x
  • @azure/msal-angularjs@1.x.x

Framework

No framework.

Description

We’re seeing instances where MSAL's expiresIn value stored in localStorage differs significantly from the exp claim in the JWT (sometimes by minutes, sometimes by hours or days).

If a user gets into a state where exp is in the past and expiresIn is in the future, calling acquireTokenSilent will consistently return an expired token (because it checks the expiresIn value, not exp), so there's no way for the user to get out of that state without manually clearing localStorage.

The root cause is that the exp value is calculated on the server, whereas expiresIn is calculated after the fact on the client, using the client's system clock.

We've seen the biggest difference between the two values when users put our tab in the background or put their computers to sleep. Chrome seems to throttle/pause code execution and/or network requests, and if that happens between the time you get a token back from the server and the time the expiresIn value is calculated, you can get huge drift. We had a user whose JWT exp value was a Friday, and their localStorage expiresIn value was a Monday.
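A minimal sketch of the mechanism (assumed shapes, not MSAL's actual code): the server issues a relative expires_in, the client converts it to an absolute timestamp with its own clock, and any delay between the response arriving and that conversion running becomes drift against the JWT's exp claim.

```javascript
// Read the exp claim straight from a JWT payload (set by the server's clock).
// Illustrative only -- real code should handle base64url and validate the token.
function jwtExp(token) {
  const payload = token.split(".")[1];
  return JSON.parse(Buffer.from(payload, "base64").toString("utf8")).exp;
}

// What the client stores: "now" + expires_in, using the local clock. If
// nowSeconds is sampled long after the response arrived (throttled tab,
// machine asleep, skewed clock), the stored expiry drifts past the real one.
function clientExpiry(expiresInSeconds, nowSeconds) {
  return nowSeconds + expiresInSeconds;
}

// Token issued at t=1000 with a 3600s lifetime, so exp = 4600. The client
// only runs the conversion at t=1900, fifteen minutes later:
const payload = Buffer.from(JSON.stringify({ exp: 4600 })).toString("base64");
const token = `header.${payload}.signature`;

console.log(clientExpiry(3600, 1900) - jwtExp(token)); // 900 seconds of drift
```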

This is where MSAL.js relies on the system clock:

Security

No.

Regression

No. This problem is even present in ADAL.js. Here's where ADAL.js relies on the system clock: https://github.com/AzureAD/azure-activedirectory-library-for-js/blob/de375da0f438018f01c6902f6dcfaedb24f91320/lib/adal.js#L1661

Configuration

{
  auth: {
    clientId: "e25ff63c-2374-438a-9fbe-91cd5ee67bae",
    authority: "https://login.microsoftonline.com/organizations",
    redirectUri: "http://localhost:8080/"
  },
  cache: {
    cacheLocation: "localStorage",
    storeAuthStateInCookie: true
  }
}

Reproduction steps

  1. Tweak the quickstart example to acquire tokens on a setInterval
  2. Clear localStorage periodically, so that you force the server to send you a new token
  3. Mess around with network download latencies, and see the drift
  4. Mess around with the system clock, and see the drift
  5. Put your computer to sleep at just the right time (kinda hard), and see the huge drift
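The steps above can be sketched as a small harness (no real MSAL or network calls; everything here is illustrative): a fake clock lets you inject latency, sleep, or skew and watch the client-computed expiry diverge from the token's exp.

```javascript
// Fake clock: advance() simulates latency, a sleeping machine, or a user
// adjusting the system clock, as in steps 3-5 above.
function makeClock(startSeconds) {
  let now = startSeconds;
  return {
    now: () => now,
    advance: (s) => { now += s; },
  };
}

const clock = makeClock(0);
const expClaim = 3600; // server-side exp for a 1h token minted at t=0

// Pretend the tab was throttled for 15 minutes before the response was
// processed, then compute the expiry the way the client would:
clock.advance(900);
const storedExpiry = clock.now() + 3600;

console.log(storedExpiry - expClaim); // 900s of drift, growing with the delay
```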

Expected behavior

No drift, or a way to get out of this scenario.

  • No drift: This would require dropping expiresIn, and just relying on JWT's exp value. I hear you're working on encrypting tokens, so this may not be possible.
  • A way to get out of this scenario: If MSAL thinks a token should be valid, but our code is consistently getting 401s from our APIs, we should have a way of skipping MSAL's cache and having it give us a fresh token.
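The second bullet could be sketched as a client-side staleness check (names are illustrative, not MSAL's API): trust the JWT's own exp over the cached client-side expiry, and evict the cache entry when the two disagree.

```javascript
// Decode the exp claim from a JWT payload (illustrative; no validation).
function decodeExp(token) {
  const payload = token.split(".")[1];
  return JSON.parse(Buffer.from(payload, "base64").toString("utf8")).exp;
}

// True when the cached entry should be discarded: the cached client-side
// expiry says "still valid" but the token itself has already expired.
function isStale(cachedExpirySeconds, token, nowSeconds) {
  return cachedExpirySeconds > nowSeconds && decodeExp(token) <= nowSeconds;
}

// exp = 100 (in the past) but cached expiry = 9999 (in the future) at t=500:
const payload = Buffer.from(JSON.stringify({ exp: 100 })).toString("base64");
const staleToken = `header.${payload}.signature`;
console.log(isStale(9999, staleToken, 500)); // true -> clear entry, re-acquire
```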

Browsers

We see it happen relatively frequently with Chrome, but I see no reason why it wouldn't affect other browsers.

@unindented unindented added the bug A problem that needs to be fixed for the feature to function as intended. label Jan 30, 2020
@jasonnutter jasonnutter self-assigned this Jan 30, 2020
@AndreiZe

AndreiZe commented Feb 5, 2020

Excellent find, Daniel! We recently ran into the same issue. It happens pretty rarely, but for our app it results in a pretty bad user experience. Hopefully this gets treated as high priority; looking forward to the patched version.

@jasonnutter
Contributor

POC fix in PR referenced above, feedback welcome.

@jasonnutter jasonnutter added the known-issue Issue is already known and is either being investigated or is already fixed. label Mar 30, 2020
@github-actions github-actions bot locked as resolved and limited conversation to collaborators Sep 2, 2020