
History scraping - a:visited and timing attacks [1477773 for starters] #448

Closed

vertigo220 opened this issue Jun 16, 2018 · 64 comments
@vertigo220

So I had initially done some research on this pref and was under the impression they had fixed the issue, but I've now learned from the user.js that it's still flawed. I tried out the PoC (which I've found to be accurate) and was disappointed (though, sadly, not surprised) to see such a hole left exposed by Mozilla. Personally, I make heavy use of the different coloring between non-visited and visited links in my browsing, so I'd really rather not disable this pref, and considering this method is probably not commonly used, I don't think I will. But I'm trying to figure out a way to defeat the method. When I learned how it works (the game throws "asteroid" links at you whose visibility depends on their CSS, which in turn depends on whether you've visited them), I came up with an idea, and it sort of works.

If you go into options > content > fonts & colors > colors, and set it to always override the page colors, it allows you to see all the asteroids, which will make the PoC think you've visited all the sites (and actually appears to "break" the "game," since it just keeps going and going). And, of course, I would assume that if a site were displaying multiple links with only the ones related to links you've visited being visible, it would cause all of them to show, which would alert you to what's going on. The problem is that setting it to always override colors really messes sites up, mainly because you can't tell it to only override link colors, so it also applies text and background colors. Even a basic page like AMO reviews gets completely trashed.

So my next thought was using the Visited Lite script, which colors visited links your preferred color, to also color non-visited ones. After the developer told me how to do this, I tried it and, unfortunately, it didn't work. I'm not sure if it's because of how it changes the color, or maybe when (perhaps it does it too late, being a script). So I'm now wondering if there's some way, maybe through userChrome.css, to do this, which might work at a lower level and prove effective at this. I don't have any experience with userChrome.css, so hopefully someone that does and knows how to do this could give it a try and report back. Or maybe someone has another idea.
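For anyone who wants to experiment with this idea: the file that restyles page content is userContent.css (userChrome.css styles the browser UI, not web pages). A minimal, untested sketch is below; the colors are arbitrary examples, and whether a stylesheet applied this early actually defeats the PoC is exactly the open question.

```css
/* Hypothetical userContent.css sketch - NOT userChrome.css, which styles
   the browser UI rather than page content. Colors are arbitrary examples.
   Note: in current Firefox this file only loads if
   toolkit.legacyUserProfileCustomizations.stylesheets=true. */
a:link    { color: #0066cc !important; }
a:visited { color: #9933cc !important; }
```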

One final note: I also just tried blocking css in uMatrix, but it didn't work, either.

@Thorin-Oakenpants
Contributor

I don't know about the PoC (and this is just one example; there are/were other PoCs, so they may differ) - but at this stage I would assume it checks the color: default blue = not visited, anything else = visited. That's why, when you set two different colors, it's tricked into thinking you visited all of them (and likewise, with no color change for visited links, at default, it thinks you visited none).
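That guess (a sketch, not the PoC's actual code) amounts to something like the following, where the only assumption is Firefox's default unvisited-link blue, #0000EE:

```javascript
// Sketch of the naive check a PoC might use (assumption, not the real code):
// Firefox's default unvisited-link blue is #0000EE, i.e. "rgb(0, 0, 238)".
const DEFAULT_UNVISITED = "rgb(0, 0, 238)";

function looksVisited(computedColor) {
  // anything other than the default unvisited blue is treated as "visited"
  return computedColor !== DEFAULT_UNVISITED;
}

// In a real page you would feed it getComputedStyle(link).color per link.
console.log(looksVisited("rgb(85, 26, 139)")); // default visited purple -> true
console.log(looksVisited("rgb(0, 0, 238)"));   // default blue -> false
```

This also explains the observed behavior: override both colors and every link "looks visited"; leave visited styling disabled and none do.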

It doesn't matter, AFAIK, if you set this via FF or scripts; it's the color attribute that's being read (right?). I know you're looking for a solution so you can have two colors - but I think it's more trouble than it's worth and will visually break way too many sites.

The pref is not a 100% sure fix, but it does work in that PoC. The only good news I can offer (because I can tell you now, nothing more will be done - the outstanding issues are years and years old now), is that FPI (first party isolation) might handle this - i.e:

  • When on SiteA you click on LinkX, the link will now be colored as visited
  • When on SiteB, LinkX will be colored as unvisited

This is still not a perfect fix because, IMO, it still allows tracking on a first-party basis. And in order to defeat that (e.g. TC in a hardened mode), you then lose "visited" link history (I think; not sure where it's kept, probably history - TC clears history, not sure if that's an option or built in).

But depending on your needs, isolating visited links by first party may suit you. A first party (with JS) can track mouse clicks anyway (and whether you pause over links to see their URL) - have a play here: https://clickclickclick.click/ .
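The first-party isolation behavior described above can be modeled as double-keying the visited set. This is a toy illustration of the concept, not Firefox's implementation; all names are made up:

```javascript
// Toy model of first-party isolation applied to visited links: the lookup
// is keyed by (firstParty, url) instead of url alone, so SiteB cannot
// observe a click that happened on SiteA.
const visited = new Set();
const key = (firstParty, url) => `${firstParty}|${url}`;

function markVisited(firstParty, url) { visited.add(key(firstParty, url)); }
function isVisited(firstParty, url)  { return visited.has(key(firstParty, url)); }

markVisited("siteA.example", "https://linkx.example/");
console.log(isVisited("siteA.example", "https://linkx.example/")); // true
console.log(isVisited("siteB.example", "https://linkx.example/")); // false
```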

@Thorin-Oakenpants
Contributor

^^ Also: using two different colors from default might make you more fingerprintable

@vertigo220
Author

I would assume it checks the color - default blue = not visited, anything else = visited. Hence why when you set two different colors, it's tricked into thinking you visited all of them (and likewise with no color change for visited links, at default, it thinks you visited none).

It's impossible to tell, since the script doesn't work on it (again, I assume because there's a delay before the script changes the colors), but I normally have a different color for visited links than the default (using the script), both because the defaults look so similar and because I was hoping it would help avoid some of this crap.

Anyway, when changing the Firefox setting to always use the colors (vs. only with high-contrast themes), the background of the game turns white instead of black and all the asteroids become visible, in whatever color you have set in the Firefox options. So it may be that only the exact default color (dark blue, #0000EE) is hidden, or it could have something to do with the change of background color, or a combination of the two.

But if the default color of unvisited links could be changed, that theory could be tested, and it might produce the same result as changing the color options, without causing such significant and destructive changes. Unfortunately, doing so with the script doesn't work, and I don't know if that's because the theory is flawed or (what seems much more likely) because of how the script changes the color. After all, if I load a page, it takes about half a second for the links to change color, due to the delay before the script loads and takes action. Changing the color options in Firefox makes it happen on load, not after, and I strongly suspect that's what makes the PoC fail. That's why I'm wondering if there's some way to change just the link colors "always" without affecting the background and text colors and whatever else that setting touches. Even changing that bit of code (or the code that determines the default coloring) in Firefox and recompiling would at least allow for testing, and if it proved successful, a pref to change the color could provide a means of resisting this particular method.
Another thought on this: while I'd still like to try changing the color, I actually wonder if it does have to do with the change in background, since changing the color setting also causes the spaceship to disappear (which shouldn't be related to link coloring), and the asteroids all become visible even when the link colors aren't changed from the default. So maybe that's a clue to some other potential "fix."

I know you're looking for a solution so you can have two colors - but I think it's more trouble than it's worth and will visually break way too many sites

Not sure what you mean by breaking sites. They normally have two different colors for visited vs unvisited links, and using different colors than the default doesn't break them, just makes them look a bit different. The closest thing to "breakage" I've experienced with using the script to change the visited link colors is that some links that normally wouldn't change color, because they're not normal "links" per se (e.g. "Pull requests" and "Issues" at the top of this page, though interestingly those aren't affected), are a different color and stand out, but that's minor.

using two different colors from default might make you more fingerprintable

True, but a) I use a different visited color anyway, as explained above, and b) while I'm all for resisting fingerprinting as much as reasonably possible, I also personally believe there are so many factors that, even when we do our best, we'll still be anywhere from fairly to highly unique, and I don't want to make it harder for me to use the web (by having visited and unvisited links look similar) just to possibly reduce my uniqueness a little bit, and then still be identifiable anyways. I mean, if it were the difference between being anonymous and not, that would be one thing, but so far, it's not even close.

@Thorin-Oakenpants
Contributor

I was wondering WTF you were going on about with background colors .. now I see in the options that the four color overrides have a "when to apply" setting - Always, Only with High Contrast..., Never. And when you use Always, it always applies those 4 colors.

IDK - that should only override elements that don't have a color attribute. I'd have to muck around and play with it to understand what's going on

@Thorin-Oakenpants
Contributor

@vertigo220
Author

I've read about timing attacks before, though not quite as in-depth. Unfortunately, until Mozilla actually takes some major steps to improve things (which is highly unlikely, considering many issues have existed for years), and possibly even if they finally do, since nothing is ever 100%, history will remain exploitable. Considering I need it for my own use, completely disabling it isn't an option, nor is disabling the link CSS, so the best I can do is figure out workarounds to defeat the various attacks. Interestingly, the one on that site said I visited a few sites that I haven't, and that aren't in my history. So it seems to not be very accurate. And I didn't do anything while it ran.

@Thorin-Oakenpants
Contributor

Some high precision timing attacks can be unique (clock skew vs server time?), but others (like this one) only need a difference. IANAnExpert, but regarding timing:

  • there were timing mitigations put in place for spectre
  • RFP does timing issues
  • RFP & spectre mitigations (I think) handle these timing issues at a higher (or is that lower) level so they should cover most/every scenario
  • in the past there have been timing leaks that no one thought of
  • I opened a new issue ( reminder: dom.enable_performance_navigation_timing #457 ) on timing stuff, there are a bunch of prefs to do with timing, at least one is not in the user.js
  • I am watching TBB based on ESR60 to see what they do with these prefs

which is highly unlikely considering many issues have existed for years

For years Mozilla had to battle XUL. I read a comment once from a Mozilla code dude who outlined what some changes took due to breakage - it was like 18 months to 2 years to fully implement some things. Now it's much easier, and changes are taking place at a rapid pace. I guess that was more for things that extensions used. And like any organization (I have not worked on large software projects; the most ever was 4 of us, and that was back in 1823), work has to be prioritized, some bugs block others, etc. I think Mozilla is one of the most responsive teams around.

Interestingly, the one on that site said I visited a few sites that I haven't, and that aren't in my history. So it seems to not be very accurate

Are you using RFP? Maybe it's the spectre mitigations. IDK. I linked the article cuz it was interesting, and it's very recent. I have emailed Antoine, the author, and asked him to join in here.

I know you want/need the visited link to be different, so I wasn't proposing it to be a solution - but I do have questions, which will be in the next post

@Thorin-Oakenpants
Contributor

Thorin-Oakenpants commented Jul 4, 2018

Questions for Antoine @antoinevastel

The article says

Moreover, whenever a script tries to access to the color of a link by using Javascript, it always returns the color associated with unvisited links. Thus, it is not possible for an attacker to distinguish between visited and unvisited urls.

Is this due to the changes in Gecko 2 ( https://developer.mozilla.org/en-US/docs/Web/CSS/Privacy_and_the_:visited_selector ), or only when the user has the pref layout.css.visited_links_enabled=false ? My understanding was that the changes are gated behind the pref, and the solution is not perfect.

From @vertigo220

Interestingly, the one on that site said I visited a few sites that I haven't, and that aren't in my history. So it seems to not be very accurate

I haven't tried the test myself. Is this test new, devised since spectre mitigations in browsers were implemented? How accurate is it meant to be? Maybe we could also test results with/without privacy.resistFingerprinting.

@Thorin-Oakenpants
Contributor

@vertigo220 Just a thought .. what would a color change on hover do? i.e. flip the pref so all links are the same color, and then a script that modifies the hover attributes of a:visited links?

@antoinevastel

@Thorin-Oakenpants Concerning your first question, it is indeed due to the link you've quoted, and not only when the user has the pref in particular.
I think the demo on my website doesn't work anymore (I tried on 2 different computers/browsers and it didn't work).

@Thorin-Oakenpants
Contributor

Thanks for dropping by @antoinevastel 👍 .. love your work in #458 btw

@vertigo220
Author

I don't currently use RFP, but have most timing prefs disabled. As for only modifying link color on hover, it would certainly be better than nothing, and would possibly be a solution if I were super paranoid about history scraping, but it would still be a substantial usability hit for me, and I worry about this particular issue more out of principle than actual concern.

@Thorin-Oakenpants
Contributor

Concerning your first question, it is indeed due to the link you've quoted, and not only when the user has the pref in particular.

I do not think that is right. The PoC (linked in the OP) accurately reveals previously visited sites when the pref is not used. When the pref is used, the PoC fails.


From our wiki

2 This test is a PoC (proof of concept). You will need layout.css.visited_links_enabled set as true. You will also need a normal window (not a Private Browsing one). The PoC only covers a handful of sites, and many of those will not "leak" as the code is checking HTTP and the site has moved to HTTPS - i.e the full URL has changed. For best results:


It's an ancient PoC, and I will need to go retest whether any sites still exist that use HTTP.

@Thorin-Oakenpants
Contributor

Thorin-Oakenpants commented Jul 5, 2018

I think we need a better visited color PoC. I did some tests

SITE LIST

no change
- http://www.foxnews.com/
- https://news.ycombinator.com/
- https://twitter.com/
- https://www.bankofamerica.com/
- https://www.wellsfargo.com/

is now HTTPS
- http://en.wikipedia.org/wiki/Opossum
- http://reddit.com/r/conspiracy
- http://slashdot.org/
- http://www.amazon.com/
- http://www.bestbuy.com/
- http://www.bing.com/
- http://www.cnn.com/ - redirects to https://edition.cnn.com/
- http://www.diapers.com/ - redirects to amazon.com (HTTPS)
- http://www.ebay.com/
- http://www.facebook.com/home.php - redirects to https://www.facebook.com/login*
- http://www.mercurynews.com/
- http://www.nsa.gov/
- http://www.petfinder.com/
- http://www.pizzahut.com/ - redirects for ME to https://www.pizzahut.MYCOUNTRYTLD/
- http://www.playboy.com/
- http://www.scroogled.com/ - redirects to microsoft.com (HTTPS)
- http://www.walmart.com/

no longer exists
- http://timecube.com/

TEST 1

  • layout.css.visited_links_enabled=true
  • go to every single site in the list (exclude timecube.com)
  • play game
  • result: says I've been to every single site

TEST 2

  • clear all history play the game again
  • this was a never-ending game (got to 20,000!); I literally had 1 item in history and that was the test site itself
  • redo but visit a couple of non listed sites so history is bigger
  • play the game again and it ends very quickly
  • results: says I visited none of the sites

TEST 3

  • clear everything (ctrl-shift-del)
  • visit ONE site in the list, eg fox news
  • results: told me I had visited 9 of the sites (includes fox news) which is bollocks

TEST x

  • flip pref to false
  • clear everything (ctrl-shift-del)
  • visit a site, eg fox news
  • play game again (note with the pref at false you can't see the asteroids)
  • result: only told me it was game over

Conclusion: the test requires a history of at least two items. When the test lists sites visited, it always includes the non-existent timecube.com, and I think the logic in the code causes other false positives. The original list of sites that don't change is down to four, and only one of those is http - foxnews.com

I know this test worked in the past, but it now seems a bit broken. Surely someone (I lack the skills) could code up a simple page listing ten popular HTTPS sites and use JS to display whether the color indicates each was visited (although Antoine says this is not possible). @earthlng - you love this stuff man!

@vertigo220
Author

I didn't do quite as extensive testing of it, but when I tried the PoC it seemed pretty accurate for me. Like you, the game went on forever at one point, but in my case I think that was when I changed the color settings in Firefox's options, so I could see all asteroids. But running it normally seemed to be able to tell quite closely which sites I had visited. I agree though that having most of the sites as http and having a defunct site in the list certainly diminishes its use.

@Thorin-Oakenpants
Contributor

Thorin-Oakenpants commented Jul 5, 2018

I still think that the pref is required to hide the a:visited attributes (edit: or at least the color). For me I don't really care - since we enforce that pref in the user.js. I'm more concerned about the truth and if it still applies to you vertigo :) The test still seems to indicate that the pref is required, but Antoine seems to think otherwise. Wish we had a decent PoC

And I want the FPI to apply to visited as well, which should at least eliminate third parties and reduce the exposure for those who want a:visited colors

@earthlng
Contributor

earthlng commented Jul 6, 2018

@vertigo220
Author

Thanks for making that, earthlng! It's not completely accurate, though. Said I've visited playboy.com (I haven't) and that I haven't visited ghacks.net and torproject.org (I have). Also, the start button is only half visible (bottom half is off the page) and the page doesn't scroll.

@Thorin-Oakenpants
Contributor

Thorin-Oakenpants commented Jul 6, 2018

https://github.com/earthlng/testpages/blob/master/docs/visited_links.html

slashdot is using http .. should be https?

@Thorin-Oakenpants
Contributor

Thorin-Oakenpants commented Jul 6, 2018

I have no problems with the layout or scrolling etc. But the test is definitely dodgy (just to be clear, that is not earthlng's fault; he just fixed the urls we use)

in a nilla profile with layout.css.visited_links_enabled=true, any three sites I visit at random from the list (I stayed away from the http fox news and slashdot, which I think should be https) get picked up - but so do any three others, every time
yeah-nah

I did this three times, each time with three random sites. Didn't try four to see if it told me 8 sites. Maybe there's a flaw in the logic?

However, I did try 2 random sites, did this 3 times as well, and the game never finishes.

@Thorin-Oakenpants
Contributor

html code

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">

<head>
<title>visited links</title>
</head>

<body>
<h2>list of hyperlinks</h2>

<ul>

<li><a href="https://www.amazon.com/">https://www.amazon.com/</a></li>
<li><a href="https://www.playboy.com/">https://www.playboy.com/</a></li>
<li><a href="https://www.reddit.com/r/conspiracy/">https://www.reddit.com/r/conspiracy/</a></li>
<li><a href="https://twitter.com/">https://twitter.com/</a></li>
<li><a href="https://theintercept.com/">https://theintercept.com/</a></li>
<li><a href="https://www.rt.com/">https://www.rt.com/</a></li>
<li><a href="https://edition.cnn.com/">https://edition.cnn.com/</a></li>
<li><a href="https://www.presstv.com/">https://www.presstv.com/</a></li>
<li><a href="https://www.mercurynews.com/">https://www.mercurynews.com/</a></li>
<li><a href="https://en.wikipedia.org/wiki/Opossum">https://en.wikipedia.org/wiki/Opossum</a></li>
<li><a href="http://www.foxnews.com/">http://www.foxnews.com/</a></li>
<li><a href="https://www.bing.com/">https://www.bing.com/</a></li>
<li><a href="https://www.bankofamerica.com/">https://www.bankofamerica.com/</a></li>
<li><a href="https://www.bestbuy.com/">https://www.bestbuy.com/</a></li>
<li><a href="https://thepiratebay.org/">https://thepiratebay.org/</a></li>
<li><a href="https://www.ghacks.net/">https://www.ghacks.net/</a></li>
<li><a href="http://slashdot.org/">http://slashdot.org/</a></li>
<li><a href="https://www.petfinder.com/">https://www.petfinder.com/</a></li>
<li><a href="https://www.torproject.org/">https://www.torproject.org/</a></li>
<li><a href="https://www.walmart.com/">https://www.walmart.com/</a></li>
<li><a href="https://www.ebay.com/">https://www.ebay.com/</a></li>
<li><a href="https://news.ycombinator.com/">https://news.ycombinator.com/</a></li>
<li><a href="https://www.wellsfargo.com/">https://www.wellsfargo.com/</a></li>

</ul>

<h2>add button here</h2>

button runs JS to evaluate link colors or<br>
whatever it was the old PoC was trying to do

</body>
</html>

Either for a new PoC or the current one, we could add the list of urls so we can see them. The pic is a nilla profile; all the links, even the HTTP slashdot one, get recognised as visited.

[screenshot: visited links test page]

Just need to do the visited JS bit and the popup message
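The missing "visited JS bit" would presumably cycle document.links and read each computed color. Since getComputedStyle only exists in a browser, this sketch takes the colors as input and just shows the bucketing logic; the URLs and colors are illustrative:

```javascript
// Sketch of the "visited JS bit": in a browser you'd walk document.links
// and read getComputedStyle(link).color; here the colors are passed in so
// the bucketing logic can run anywhere.
function bucketByColor(links) {
  const buckets = new Map();
  for (const { href, color } of links) {
    if (!buckets.has(color)) buckets.set(color, []);
    buckets.get(color).push(href);
  }
  return buckets;
}

const buckets = bucketByColor([
  { href: "https://a.example/", color: "rgb(0, 0, 238)" },   // unvisited blue
  { href: "https://b.example/", color: "rgb(85, 26, 139)" }, // visited purple
  { href: "https://c.example/", color: "rgb(0, 0, 238)" },
]);
console.log(buckets.get("rgb(85, 26, 139)")); // the "visited" bucket
```

With the default colors you'd get one or two buckets, matching Thorin's expectation below; whether the browser actually reports the true color to JS is exactly what Antoine disputes.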

@Thorin-Oakenpants
Contributor

I'm not a JS junkie, but I would assume a script could cycle through elements, take each a href that has text as its anchor, and extract the color. You would then get (in this test) one or two colors (depending on whether you had visited all, some, or none of the links).

This is not perfect: what about hyperlinks that are not text, or hyperlinks that are colored by CSS? I probably have it all wrong as to how this is meant to be extracted. Would be nice to have a working PoC, which we could also use to test when FPI is applied to a:visited.

@earthlng
Contributor

earthlng commented Jul 6, 2018

I get pretty accurate results. Visited 3 links and it correctly reported all 3 plus 1 that I hadn't visited.
All the asteroids come in from a similar angle and it's possible that you accidentally click on an invisible one. I could slow it down which should probably reduce the false-positives but it will take longer to finish the game.

@Thorin-Oakenpants
Contributor

Thorin-Oakenpants commented Jul 6, 2018

Oh .. are you only meant to click on visible ones? I just click like a mad bastard cuz I have ADD (PS: used to do a lot of gaming back in the day .. until I took an arrow to the knee)

@Thorin-Oakenpants
Contributor

I'll reduce the ☕ intake, and do some controlled refined clicking shortly and post back

@Thorin-Oakenpants
Contributor

OK, I did some surgeon-like precision laser work - 3 tests

  1. got 4 out of 4 but added a false-positive
  2. got 3 out of 4 (missed fox news) but added a false-positive
  3. got 3 out of 4 (missed fox news)

I think the fox news one might be broken (is it checking https?), and the false positives are invisible ones. Can you tweak the code so invisible ones are bright green and visible ones are bright red or something? I don't get the point of the silly game - just enumerate it :)

@earthlng
Contributor

earthlng commented Jul 7, 2018

I played around with timing based history stealing and the results are surprising.
My script can accurately detect any page in your history if I check for it. RFP doesn't stop it or slow it down, it works across containers and layout.css.visited_links_enabled can't stop it either.
It even works in private windows but can only detect sites already in your history because history is not recorded in private windows.

@Thorin-Oakenpants
Contributor

^^ but that's the timing one, not the visited color, right? That said, if a decent timing attack can do it, the colors don't matter. Do you want to email me / share - or let the Tor/Mozilla guys know? We don't have to make your code public (yet), but damn I want in on this test, stat!

@earthlng
Contributor

earthlng commented Jul 7, 2018

Yeah I should probably report this to mozilla but they have a tendency to ignore all my reports and this might just be another one of those. TBB is not affected because they use PB-mode

@crssi

crssi commented Jul 7, 2018

@earthlng
Even when

/* 0862  */ user_pref("places.history.enabled", false);

?

@Thorin-Oakenpants
Contributor

This is limited to global history, right? The only reason I use history is so I can have a little give and take in per tab history with workflow for ONE site I administer.

Might start looking at an extension to wipe history after a domain tab close, or all history older than x minutes on a timer, or something

@Atavic

Atavic commented Aug 20, 2018

This is limited to global history, right?

Yeah.

@Thorin-Oakenpants
Contributor

@earthlng

leaving RFP=true, and leaving HWA=on and dom.event.highrestimestamp.enabled at default true

can you set the following prefs and retest
dom.enable_performance_navigation_timing=false
dom.enable_performance_observer=false
dom.enable_performance=false
dom.enable_resource_timing=false

@tomrittervg

Disabling high resolution timers won't stop the attack, even if it breaks the PoC. One can build high res timers out of other web features.

I'm working on fuzzyfox to mitigate that, but it will be very experimental when it lands.
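One classic way to rebuild a finer clock from a coarse one, which is the kind of thing tomrittervg is alluding to, is to count iterations of a tight loop between coarse ticks. This is a simplified sketch of the principle only (real attacks use e.g. a SharedArrayBuffer counter in a worker); the clock is injectable so the demo runs anywhere:

```javascript
// Given only a coarse clock, count loop iterations until it next ticks;
// the count is proportional to elapsed time and acts as a finer timer.
function countBetweenTicks(coarseNow) {
  const start = coarseNow();
  let count = 0;
  while (coarseNow() === start) count++; // spin until the coarse clock ticks
  return count; // sub-tick-resolution "timestamp"
}

// Demo with a fake coarse clock that ticks after five reads:
let reads = 0;
const fakeClock = () => (++reads <= 5 ? 0 : 1);
console.log(countBetweenTicks(fakeClock)); // 4
```

This is why clamping performance.now() alone doesn't kill timing attacks, and why approaches like FuzzyFox go after the underlying event ordering instead.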

@earthlng
Contributor

@Thorin-Oakenpants I tested with those settings and it makes no difference ie the PoC still works.

@tomrittervg you guys already made it so that RFP=true returns a max precision of 100ms for a lot of different timing things, but the function used in the PoC is not affected by that. If you could change that, then at least RFP=true users would be (somewhat) protected from this attack ("somewhat" because it would probably still be possible, but way too slow to be practical except for very targeted attacks, i.e. "did user 'russianbot' visit comrades.kremlin.ru?" ;)

@Thorin-Oakenpants
Contributor

@Thorin-Oakenpants
Contributor

Hmmm, so stupid question time: privacy.resistFingerprinting.reduceTimerPrecision.microseconds currently has a value of 1000. Is this the value that determines the 100ms (I do not know what a microsecond is)? And raising it would lower entropy?

note: I have no intention of changing it, just asking
yup: I know (now) that this does not affect the PoC

@earthlng
Contributor

https://bugzilla.mozilla.org/show_bug.cgi?id=1217238#c111

If privacy.resistFingerprinting is enabled, the precision is 100ms or the value of privacy.resistFingerprinting.reduceTimerPrecision.microseconds, whichever is larger.

following the STR in https://bugzilla.mozilla.org/show_bug.cgi?id=1217238#c107 it looks like that's still the case in my FF62
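The quoted rule ("100ms or the value of ...microseconds, whichever is larger") can be sketched as a floor-to-multiple clamp. The exact rounding mode is an assumption here; only the max() rule comes from the bugzilla comment:

```javascript
// Sketch of RFP timer clamping: effective precision is the larger of 100ms
// and the reduceTimerPrecision.microseconds pref (converted to ms).
// Assumption: timestamps are floored to a multiple of the precision.
function clampTime(ms, prefMicroseconds) {
  const precisionMs = Math.max(100, prefMicroseconds / 1000);
  return Math.floor(ms / precisionMs) * precisionMs;
}

console.log(clampTime(123.456, 1000));   // pref = 1ms, so 100ms wins -> 100
console.log(clampTime(123.456, 250000)); // pref = 250ms wins -> 0
```

This also answers Thorin's question above: at the default 1000 microseconds (1ms), the 100ms floor dominates, so raising the pref only matters once it exceeds 100,000 microseconds.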

@Thorin-Oakenpants
Contributor

all over my head: here's another timing attack: https://blog.sheddow.xyz/css-timing-attack/ ... I will just assume Fuzzy Fox and/or RFP/Spectre timing mitigations in FF are enough.

@tomrittervg link FYI just in case there is/was no bugzilla

@Atavic

Atavic commented Oct 15, 2018

A microsecond is 1/1000 of a millisecond: a privacy.resistFingerprinting.reduceTimerPrecision.microseconds value of 1000 means one millisecond.

Here there's a "Reduced time precision" section that explains it.

DOMHighResTimeStamp API can go down to 5 µs: a tracking heaven.

@Atavic

Atavic commented Oct 15, 2018

See: w3c/hr-time#56

@earthlng
Contributor

earthlng commented Oct 16, 2018

here's another timing attack: https://blog.sheddow.xyz/css-timing-attack/

in its current form it doesn't really work when RFP is enabled. There's a very small chance that it can get a character correct even with RFP enabled but to brute-force the full string would take way too long. There's a much higher chance that you'll have left the site again before it can even get the 2nd or 3rd char.

@Thorin-Oakenpants
Contributor

Thorin-Oakenpants commented Oct 27, 2018

FYI: FuzzyFox has landed. Edit: this is early-stage experimental and nowhere near ready for your everyday profile (currently, of course, it's only in Nightly), but anyway, now you've been warned

@tomrittervg

@Thorin-Oakenpants It has, yes; but this is different from Resist Fingerprinting and First Party Isolation. This is so far down the scale of 'experimental' that I'm probably going to need to follow up and restrict it to Nightly only or some other type of safety switch. Enabling this pref will basically make your profile unusable, so please don't encourage people to enable it or enable it for others. We landed this to provide a base for performance testing and experimentation, which we'll be doing over the coming months.

@Thorin-Oakenpants
Contributor

^^ Yup. I mentioned it in case anyone (e.g. @earthlng ) wanted to play with it with his PoCs etc. I certainly wouldn't flip it until it was "stable" - I'll amend my post above to make that clear

@Thorin-Oakenpants
Contributor

Thorin-Oakenpants commented Nov 3, 2018

@earthlng @tomrittervg

Link: https://www.usenix.org/system/files/conference/woot18/woot18-paper-smith.pdf

edit: ouch (pdf) [CSS Paint which is currently chrome only, since fixed] :

Our amplified attack can probe a user's browsing history at 3,000 URLs per second without the victim noticing, i.e., we can scan Alexa Internet's list of Top 100,000 Websites in 30-40 seconds — in the background, with no visible effect on the page, and with no interaction required from the victim

edit2:
ruh-roh

@Thorin-Oakenpants
Contributor

Section 5, page 10 (emphasis mine)

Firefox with visited links disabled. Turning off Firefox's layout.css.visited_links_enabled configuration flag should eliminate visited link styling altogether. Not so: disabling the flag fails to block either our visited-link attacks or Paul Stone's older one; we reported this bug to Mozilla.

What's that bugzilla?

@tomrittervg

Section 5, page 10 (emphasis mine)

Firefox with visited links disabled. Turning off Firefox's layout.css.visited_links_enabled configuration flag should eliminate visited link styling altogether. Not so: disabling the flag fails to block either our visited-link attacks or Paul Stone's older one; we reported this bug to Mozilla.

What's that bugzilla?

https://bugzilla.mozilla.org/show_bug.cgi?id=1477773

@Thorin-Oakenpants
Contributor

@tomrittervg ^^ Since that was resolved fixed in FF63, did it mitigate any of the attacks? earthlng said:

nvm, the PoC still works. The only good news is that it seems disabling HW acceleration slows it down to a point where the attack is no longer really practical

Keep in mind that earthlng is talking about his PoC, not the one(s) in the bugzilla, but they are probably the same(?)

@Thorin-Oakenpants
Contributor

closing this. Mozilla are aware of the latest timing & side-channel attacks, and FPI will most likely be applied to a:visited. And there's always FuzzyFox 🦊

@Thorin-Oakenpants
Contributor

Thorin-Oakenpants commented Apr 24, 2020

I'm getting excited ... https://bugzilla.mozilla.org/show_bug.cgi?id=1632765 - edit: the ticket is very boring; the actual mechanics of it are access-denied, but if you want to, like me months ago, you can trace what they did, and how it works, via mercurial

PS: ticket created 27 minutes ago ...I'm on the pulse guys .. fucking nailing it!!

@crssi

This comment has been minimized.
