Simple Gradle plugin that uploads and downloads S3 objects. This is a fork of the mygrocerydeals/gradle-s3-plugin, which no longer appears to be under active development. It has been updated to work with Gradle version 6 and later and converted to pure Java.
Add the following to your build.gradle file:
plugins {
    id 'io.jumpco.open.gradle.s3' version '1.1.1'
}
See the Gradle plugin portal page for other versions.
When performing uploads you need to provide s3.region as follows:
s3 {
    region = 'us-east-1'
}
By default, the S3 plugin searches for credentials in the same order as the AWS default credentials provider chain.
You can specify a profile by setting s3.profile, or supply credentials directly with s3.awsAccessKeyId and s3.awsSecretAccessKey. An explicitly provided access key and secret take precedence over the profile.
s3 {
    profile = 'my-profile'
    awsAccessKeyId = '12345678'
    awsSecretAccessKey = 'my-secret'
}
Setting the environment variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY is one way to provide your S3 credentials. See the AWS docs for details on credentials.
The s3.region property can optionally be set to define the AWS region if one has not been set in the authentication profile. It can also be used to override the default region set in the AWS credentials provider.
s3 {
    region = 'us-east-1'
}
The s3.bucket property sets a default S3 bucket that is common to all tasks. This can be useful if all S3 tasks operate against the same Amazon S3 bucket.
s3 {
    bucket = 'my.default.bucketname'
}
Upload and download jobs can be declared in the s3Uploads and s3Downloads containers:
s3Uploads {
    jobName {
        key = 'target-filename'
        file = 'source-filename'
    }
}
s3Downloads {
    dlJob {
        keyPrefix = 'folder'
        destDir = 'targetDir'
    }
}
Tasks can also be declared directly using the plugin's task types:
task myDownload(type: io.jumpco.open.gradle.s3.S3Download) {
    keyPrefix = 'folder'
    destDir = 'targetDir'
}
task myUpload(type: io.jumpco.open.gradle.s3.S3Upload) {
    key = 'target-filename'
    file = 'source-filename'
}
Note: use the fully qualified names for the task types, io.jumpco.open.gradle.s3.S3Upload and io.jumpco.open.gradle.s3.S3Download.
Each job declaration results in a task whose name is prefixed with the job name: dlJob results in the task dlJobDownloadTask, and jobName results in the task jobNameUploadTask.
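Since the generated tasks are regular Gradle tasks, they can be referenced by these names. A minimal sketch, assuming the jobName upload job declared above; the assemble wiring is illustrative, not something the plugin sets up:
tasks.named('assemble').configure {
    // hypothetical wiring: run the generated upload task after assemble completes
    finalizedBy 'jobNameUploadTask'
}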
S3Upload uploads one or more files to S3. This task has two modes of operation: single file upload and directory upload (including recursive upload of all child subdirectories). Properties that apply to both modes:
bucket - S3 bucket to use (optional, defaults to the project s3 configured bucket)
awsAccessKeyId - AWS Access Key (optional, defaults to the project s3 configured awsAccessKeyId)
awsSecretAccessKey - AWS Access Secret (optional, defaults to the project s3 configured awsSecretAccessKey)
For a single file upload:
key - key of the S3 object to create
file - path of the file to be uploaded
overwrite - (optional, default is false) if true, the S3 object is created or overwritten if it already exists
By default S3Upload does not overwrite the S3 object if it already exists. Set overwrite to true to upload the file even if it exists.
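For example, a minimal single-file upload task (the task name, key, and file path below are illustrative):
task uploadReport(type: io.jumpco.open.gradle.s3.S3Upload) {
    bucket = 'my.default.bucketname' // optional when a default bucket is set in the s3 block
    key = 'reports/report.txt'       // S3 object key to create
    file = "${buildDir}/report.txt"  // local file to upload
    overwrite = true                 // replace the object if it already exists
}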
For a directory upload:
keyPrefix
- root S3 prefix under which to create the uploaded contentssourceDir
- local directory containing the contents to be uploaded
A directory upload will always overwrite existing content if it already exists under the specified S3 prefix.
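For example, a minimal directory upload task (the prefix and directory below are illustrative):
task uploadDocs(type: io.jumpco.open.gradle.s3.S3Upload) {
    keyPrefix = 'docs/v1'          // root S3 prefix the content is created under
    sourceDir = "${buildDir}/docs" // local directory whose contents are uploaded
}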
S3Download downloads one or more S3 objects. This task has two modes of operation: single file download and recursive download. Properties that apply to both modes:
bucket - S3 bucket to use (optional, defaults to the project s3 configured bucket)
awsAccessKeyId - AWS Access Key (optional, defaults to the project s3 configured awsAccessKeyId)
awsSecretAccessKey - AWS Access Secret (optional, defaults to the project s3 configured awsSecretAccessKey)
For a single file download:
key - key of the S3 object to download
file - local path of the file to save the download to
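For example, a minimal single-file download task (the key and local path below are illustrative):
task downloadReport(type: io.jumpco.open.gradle.s3.S3Download) {
    key = 'reports/report.txt'      // S3 object to fetch
    file = "${buildDir}/report.txt" // local path the object is saved to
}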
For a recursive download:
keyPrefix - S3 prefix of the objects to download
destDir - local directory to download the objects to
Note: recursive downloads create a sparse directory tree containing the full keyPrefix under destDir. So with an S3 bucket containing the object keys:
top/foo/bar
top/README
a recursive download:
s3Downloads {
    downloadRecursive {
        keyPrefix = 'top/foo/'
        destDir = 'local-dir'
    }
}
results in this local tree:
local-dir/
└── top
    └── foo
        └── bar
So only files under top/foo are downloaded, but their full S3 paths are appended to destDir. This differs from the behavior of the AWS CLI aws s3 cp --recursive command, which prunes the root of the downloaded objects. Use the flexible Gradle Copy task to prune the tree after downloading it.
For example:
def localTree = 'path/to/some/location'

task downloadRecursive(type: io.jumpco.open.gradle.s3.S3Download) {
    bucket = 's3-bucket-name'
    keyPrefix = "${localTree}"
    destDir = "${buildDir}/download-root"
}

// prune and re-root the downloaded tree, removing the keyPrefix
task copyDownload(type: Copy, dependsOn: downloadRecursive) {
    from "${buildDir}/download-root/${localTree}"
    into "${buildDir}/pruned-tree"
}
Downloads report percentage progress at the Gradle INFO level. Run Gradle with the -i option to see download progress.