CLI: Command Line Interface
Credential Digger offers some basic functionalities through a command line interface. You need to install the credentialdigger package first; refer to the README.md for installation instructions.
All the commands support both sqlite and postgres databases. To use sqlite you need to pass the path of the db as an argument (--sqlite /path/to/data.db), whereas for postgres you can either export all the credentials as environment variables or pass a .env file as an argument (more on this later).
- Download models (deprecated from v4.4.0)
- Add rules
- Scan a repository
- Scan the snapshot of a repository
- Scan a user
- Scan wiki page
- Scan local files and directories
Download models (deprecated from v4.4.0)
This feature has been deprecated since v4.4.0, when automatic download of models was implemented.
Download and link a machine learning model.
Refer to Machine Learning Models for the complete explanation of how machine learning models work.
python -m credentialdigger download model_name
Add rules
Add the rules contained in a file; they will be used to scan repositories (a sketch of a rules file is shown after the argument list below).
path_to_rules <Required> The path of the file that contains the rules.
--dotenv DOTENV <Optional> The path to the .env file which will be used in all
commands. If not specified, the one in the current
directory will be used (if present).
--sqlite SQLITE <Optional> If specified, scan the repo using the sqlite client
passing as argument the path of the db. Otherwise, use postgres
(must be up and running)
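The rules file is a YAML document. As a reference, a minimal sketch could look like the following; the field names (regex, category, description) mirror the structure of the default rules.yml shipped with the project, but the entries here are only illustrative and not an authoritative ruleset.
rules:
  - regex: sshpass|password|pwd|passwd|pass
    category: password
    description: password keywords
  - regex: api_key|apikey|secret
    category: token
    description: token keywords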
Sqlite:
# Add the rules to the database using sqlite
python -m credentialdigger add_rules /path/to/rules.yml --sqlite /path/to/mydata.db
Postgres:
# Add the rules to the database using postgres
export POSTGRES_USER=...
export ...
python -m credentialdigger add_rules /path/to/rules.yml
or
# Add the rules to the database using postgres and an environment file
python -m credentialdigger add_rules /path/to/rules.yml --dotenv /path/to/.env
TIP: if your env file is in the current directory and it's named .env, you don't need to specify the --dotenv parameter.
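For postgres, the .env file simply lists the connection credentials as KEY=value pairs, mirroring the variables you would otherwise export. A minimal sketch follows; apart from POSTGRES_USER, which appears in the examples above, the variable names and values are placeholders, so check the .env template shipped with the project for the exact set it expects.
# Hypothetical .env sketch (variable names other than POSTGRES_USER are placeholders)
POSTGRES_USER=credentialdigger
POSTGRES_PASSWORD=changeme
DBHOST=localhost
DBPORT=5432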
Scan a repository
The scan command allows you to scan a git repository directly from the command line. It accepts multiple arguments:
repo_url <Required> The URL of the git repository to be
scanned.
--dotenv DOTENV The path to the .env file which will be used in all
commands. If not specified, the one in the current
directory will be used (if present).
--sqlite SQLITE If specified, scan the repo using the sqlite client
passing as argument the path of the db. Otherwise, use
postgres (must be up and running)
--category CATEGORY If specified, scan the repo using all the rules of
this category, otherwise use all the rules in the db
--models MODELS [MODELS ...]
A list of models for the ML false positives detection.
Cannot accept empty lists.
--debug Flag used to decide whether to visualize the
progressbars during the scan (e.g., during the
insertion of the detections in the db)
--git_token GIT_TOKEN
Git personal access token to authenticate to the git
server
--local If True, get the repository from a local directory
instead of the web
--force Force a complete re-scan of the repository, in case it
has already been scanned previously
--generate_snippet_extractor
Generate the extractor model to be used in the
SnippetModel. The extractor is generated using the
ExtractorGenerator. If `False`, use the pre-trained
extractor model
--similarity Build and use the similarity model to compute
embeddings and allow for automatic update of similar
snippets
Sqlite:
python -m credentialdigger scan https://github.com/user/repo --sqlite /path/to/mydata.db --models PathModel SnippetModel
Postgres:
export POSTGRES_USER=... # either export variables or use --dotenv
python -m credentialdigger scan https://github.com/user/repo [--dotenv /path/to/my/.env] --models PathModel SnippetModel
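The flags listed above can be combined in a single invocation. For instance, the following sketch re-scans a repository from a local clone, forcing a complete re-scan and showing the progress bars; the paths are placeholders.
# Re-scan a local clone, forcing a full re-scan and showing progress bars
python -m credentialdigger scan /path/to/local/clone --local --force --debug --sqlite /path/to/mydata.db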
The scan command also returns an exit status equal to the number of discoveries it made during the scan. Here are two examples of how to make use of the exit status, one in a shell script and one in Java.
Bash:
python -m credentialdigger scan https://github.com/user/repo
# $? holds the exit status of the last command (i.e., the number of discoveries)
if [ $? -gt 0 ]; then
    echo "This repo contains leaks"
else
    echo "This repo contains no leaks"
fi
Java:
public class credentialdigger {
    public static void main(String[] args) {
        String command = "python -m credentialdigger scan https://github.com/user/repo";
        try {
            // Run the scan and wait for it to finish
            Process p = Runtime.getRuntime().exec(command);
            p.waitFor();
            // The exit value equals the number of discoveries
            int numberOfDiscoveries = p.exitValue();
            if (numberOfDiscoveries > 0) {
                System.out.println("This repo contains leaks.");
            } else {
                System.out.println("This repo contains no leaks.");
            }
        } catch (Exception e) {
            // IGNORE
        }
    }
}
Scan the snapshot of a repository
Scan the snapshot of a repository, i.e., scan the repository at a given commit id, or at the last commit id of a given branch. The arguments are the same as in scan, plus the following:
--snapshot <Required> The name of the branch, or the commit id
Sqlite:
python -m credentialdigger scan_snapshot https://github.com/user/repo --snapshot my_branch_name --sqlite /path/to/mydata.db --models PathModel SnippetModel
Postgres:
export POSTGRES_USER=... # either export variables or use --dotenv
python -m credentialdigger scan_snapshot https://github.com/user/repo --snapshot my_branch_name [--dotenv /path/to/my/.env] --models PathModel SnippetModel
Scan a user
Scan all the public repositories of a user. The arguments are the same as in scan, plus the following:
--forks <Optional> Scan also repositories forked by this user
--api_endpoint API_ENDPOINT
<Optional> API endpoint of the git server
Sqlite:
python -m credentialdigger scan_user username --sqlite /path/to/mydata.db --models PathModel SnippetModel
Postgres:
export POSTGRES_USER=... # either export variables or use --dotenv
python -m credentialdigger scan_user username [--dotenv /path/to/my/.env] --models PathModel SnippetModel
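If the user lives on a self-hosted git server, the --api_endpoint and --git_token flags listed above can be added. The endpoint below is a hypothetical GitHub Enterprise instance and is shown only as a sketch.
# Scan a user (including forks) on a self-hosted git server
python -m credentialdigger scan_user username --forks --api_endpoint https://github.example.com/api/v3 --git_token YOUR_TOKEN --sqlite /path/to/mydata.db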
Scan wiki page
Scan the wiki page of a project. All the arguments are the same as in scan.
Sqlite:
python -m credentialdigger scan_wiki https://github.com/user/repo --sqlite /path/to/mydata.db
Postgres:
export POSTGRES_USER=... # either export variables or use --dotenv
python -m credentialdigger scan_wiki https://github.com/user/repo [--dotenv /path/to/my/.env]
Scan local files and directories
The scan_path command can be used to scan a local directory or file on the fly from the terminal.
python -m credentialdigger scan_path path/to/scan
python -m credentialdigger scan_path path/to/scan --max_depth 10
A scan can also be limited in depth with the --max_depth argument. The maximum depth is the maximum number of nested subdirectories to scan: if it is set to -1 or not specified, all subdirectories will be scanned.
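As with scan, the exit status can be used to gate local workflows. The snippet below assumes that scan_path follows the same convention of returning the number of discoveries; treat it as a sketch rather than a documented guarantee.
# Fail a local check when leaks are found in the current directory
# (assumes scan_path returns the number of discoveries as its exit status, like scan)
python -m credentialdigger scan_path . --max_depth 2 --sqlite /path/to/mydata.db
if [ $? -gt 0 ]; then
    echo "Leaks found: aborting"
    exit 1
fi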