This repository contains an implementation of the Vectorized Batch Private Information Retrieval (PIR) Protocol published at the IEEE Symposium on Security and Privacy, 2023. The protocol introduces a novel approach in which both communication and computation are amortized over a batch of entries, resulting in significantly lower communication overhead for small entry sizes (32 bytes to 256 bytes). Specifically, for a batch of 256 entries and an entry size of 32 bytes, the communication overhead is 11 times lower than that of previous schemes.
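Batch PIR constructions of this kind (following earlier batch-PIR work) typically use cuckoo hashing to map the batch of requested indices into buckets so that each bucket receives at most one query, which can then be answered independently. The C++ sketch below illustrates only that assignment step; the hash function, bucket count, and eviction policy are simplified stand-ins for illustration and are not the repository's actual implementation.

```cpp
#include <algorithm>
#include <cstdint>
#include <optional>
#include <utility>
#include <vector>

// Toy mixing hash: maps (index, seed) to one of num_buckets candidate buckets.
// Illustrative only; real implementations use keyed hash functions.
static size_t bucket_hash(uint64_t index, uint64_t seed, size_t num_buckets) {
    uint64_t x = index ^ (seed * 0x9E3779B97F4A7C15ULL);
    x ^= x >> 33;
    x *= 0xFF51AFD7ED558CCDULL;
    x ^= x >> 33;
    return static_cast<size_t>(x % num_buckets);
}

// Assigns each requested index (assumed < UINT64_MAX) to a bucket using
// cuckoo-style eviction with 3 candidate buckets per index, so every bucket
// holds at most one index. Items that cannot be placed within max_evictions
// steps are dropped here; a real implementation would retry with fresh keys.
std::vector<std::optional<uint64_t>> cuckoo_assign(
        const std::vector<uint64_t>& batch, size_t num_buckets,
        int max_evictions = 500) {
    std::vector<std::optional<uint64_t>> buckets(num_buckets);
    for (uint64_t idx : batch) {
        uint64_t cur = idx;
        uint64_t seed = 0;
        for (int step = 0; step < max_evictions; ++step) {
            size_t b = bucket_hash(cur, seed, num_buckets);
            if (!buckets[b]) {          // empty slot: place and stop
                buckets[b] = cur;
                cur = UINT64_MAX;
                break;
            }
            std::swap(cur, *buckets[b]); // evict the current occupant
            seed = (seed + 1) % 3;       // try another candidate bucket
        }
    }
    return buckets;
}
```

With this assignment in hand, each occupied bucket can be served by one single-query PIR instance, which is what lets the batch amortize both communication and computation.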
The paper detailing the protocol can be found here.
This code relies on the Microsoft SEAL library; install version 4.1 of the library globally.
Before proceeding with the compilation, also ensure that CMake (version 3.0 or above) is installed on your system.
After installing CMake and the Microsoft SEAL Library, navigate to the root directory of the project and execute the following commands:
```
cmake -S . -B build
cmake --build build
```
Once the build process is complete, run the following command to execute the Vectorized Batch PIR:
```
./build/bin/vectorized_batch_pir
```
This will run the Vectorized Batch PIR for the three input scenarios mentioned below:
| Batch Size | Database Size | Entry Size |
|---|---|---|
| 32 | 1048576 | 32 |
| 64 | 1048576 | 32 |
| 256 | 1048576 | 32 |
Upon processing the inputs, the terminal should display the corresponding performance output for each scenario.
The performance of the protocol heavily relies on the selection of fully homomorphic encryption (FHE) parameters. We have provided the best-performing parameters for the given example inputs. However, we encourage developers to select the parameters that yield the best performance for their specific applications. Please refer to this section for parameter selection details.
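As a hedged illustration of what FHE parameter selection looks like through SEAL 4.1's public API, the sketch below configures a BFV context with a batching-friendly plaintext modulus. The specific values (ring dimension 8192, 20-bit plaintext prime) are assumptions chosen for illustration, not the tuned parameters shipped with this repository, and compiling it requires SEAL to be installed.

```cpp
#include <cstddef>
#include <iostream>
#include "seal/seal.h"

int main() {
    // Illustrative BFV parameters; the repository sets its own tuned values.
    seal::EncryptionParameters parms(seal::scheme_type::bfv);

    size_t poly_modulus_degree = 8192;  // ring dimension (assumed value)
    parms.set_poly_modulus_degree(poly_modulus_degree);

    // SEAL's default 128-bit-secure coefficient modulus for this degree.
    parms.set_coeff_modulus(seal::CoeffModulus::BFVDefault(poly_modulus_degree));

    // A plaintext modulus prime supporting SIMD batching (20 bits, assumed).
    parms.set_plain_modulus(seal::PlainModulus::Batching(poly_modulus_degree, 20));

    // The context validates the parameter choice.
    seal::SEALContext context(parms);
    std::cout << "parameters valid: " << std::boolalpha
              << static_cast<bool>(context.parameters_set()) << "\n";
    return 0;
}
```

Larger ring dimensions allow deeper homomorphic computation but increase ciphertext size and cost, which is why the best-performing choice depends on the batch size, database size, and entry size of the target application.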
Acknowledgment: Thanks to Sun I (is16@illinois.edu) for helping test the code.
This implementation is intended for research purposes only. The code has NOT been vetted by security experts. Therefore, no part of this code should be used in any real-world or production setting.