Vulnerability in bigfile.py #13
Originally reported by: Sergio Lerner (Bitbucket: Sergio_Demian_Lerner, GitHub: Sergio_Demian_Lerner)
There is a security vulnerability in decrypt_bigfile() / encrypt_bigfile(infile, outfile, pub_key). Depending on how decrypt_bigfile() is called, it may be possible to mount a Bleichenbacher attack.
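For reference, a minimal sketch of how the affected API is driven, assuming the rsa.bigfile module of python-rsa 3.x (file names here are placeholders):

```python
import rsa
from rsa import bigfile

# Demo keypair; real deployments need >= 2048 bits.
(pub_key, priv_key) = rsa.newkeys(512)

# encrypt_bigfile() splits the input into blocks, encrypts each block
# independently with the public key, and writes VARBLOCK-format output.
with open('message.txt', 'rb') as infile, open('message.enc', 'wb') as outfile:
    bigfile.encrypt_bigfile(infile, outfile, pub_key)

# decrypt_bigfile() reads the blocks back and decrypts them one by one;
# a padding failure in an early block aborts before later blocks are read.
with open('message.enc', 'rb') as infile, open('message.out', 'wb') as outfile:
    bigfile.decrypt_bigfile(infile, outfile, priv_key)
```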
First note that the infile (a message) is broken into blocks, and each block is independently encrypted using RSA. That means an attacker can reorder blocks within a message and still create a valid message, and can also construct a new message by mixing blocks from other captured messages. Using this, we'll force decrypt_bigfile() to implement a perfect Bleichenbacher oracle.
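To see the malleability concretely, here is a small sketch using plain rsa.encrypt()/rsa.decrypt(), since each bigfile block is just an independent PKCS#1 v1.5 ciphertext:

```python
import rsa

(pub_key, priv_key) = rsa.newkeys(512)

# Two blocks of one message, encrypted independently (as bigfile does).
c1 = rsa.encrypt(b'pay Alice 10', pub_key)
c2 = rsa.encrypt(b'pay Bob 99', pub_key)

# An attacker who swaps the ciphertext blocks still produces a message
# that decrypts without any error -- there is no integrity protection.
swapped = [c2, c1]
print(b''.join(rsa.decrypt(c, priv_key) for c in swapped))
# b'pay Bob 99pay Alice 10'
```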
Attack:
If the PKCS#1 v1.5 padding of the first block is incorrect, decrypt_bigfile() fails fast. If the padding is correct, the remaining blocks are checked as well, which takes additional, externally measurable time.
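Paraphrased, the vulnerable decryption loop behaves like this (a sketch, not the actual library code):

```python
import rsa

def decrypt_blocks_fail_fast(ciphertext_blocks, priv_key):
    # Sketch of the fail-fast behaviour: a padding error in block 0
    # raises immediately, while a well-padded block 0 means every
    # remaining block is also processed. The time difference between
    # the two cases is observable from outside -- a timing oracle.
    cleartext = b''
    for block in ciphertext_blocks:
        cleartext += rsa.decrypt(block, priv_key)  # raises rsa.pkcs1.DecryptionError
    return cleartext
```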
I found a way to implement this attack and break the Bitmessage protocol (Bitmessage.org), which uses decrypt_bigfile(). See http://bitslog.wordpress.com/
Comments

Original comment by Sybren Stüvel (Bitbucket: sybren, GitHub: sybrenstuvel): If you can help me fix this, please let me know. Otherwise I'll just add a note to the functions about their insecure nature.

Comment: This attack is based on timing. It seems this could be fixed by attempting to decrypt all blocks even if one raises an error.
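A sketch of the fix suggested in the last comment, reusing the hypothetical block-decryption loop from above: process every block regardless of individual failures and raise a single error at the end, so the total work no longer depends on where the first bad padding occurs:

```python
import rsa

def decrypt_blocks_constant_work(ciphertext_blocks, priv_key):
    # Attempt every block even after a failure, deferring the error.
    # The loop's running time no longer reveals the position of the
    # first bad block (rsa.decrypt itself may still leak internally).
    cleartext = b''
    failed = False
    for block in ciphertext_blocks:
        try:
            cleartext += rsa.decrypt(block, priv_key)
        except rsa.pkcs1.DecryptionError:
            failed = True  # remember the failure, but keep decrypting
    if failed:
        raise rsa.pkcs1.DecryptionError('decryption failed')
    return cleartext
```

This removes the coarse block-position signal; fully eliminating the oracle would also require the per-block padding check itself to run in constant time.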