Salesforce.com Analytics Cloud DatasetUtils is a reference implementation of the Analytics Cloud External Data API.
Download the latest jar from releases and follow the examples below:
Download and install Java JDK (not JRE) from Oracle
After installation is complete, open a console and check that the Java version is 1.7 or higher by running the following command:
java -version
To start the jar in server mode and use the web UI to upload, open a console and enter the following command:
java -jar datasetutil-<version>.jar --server true
The easiest way to use the tool is interactive mode: open a terminal, enter the following command, and follow the prompts on the console:
java -jar datasetutil-<version>.jar
Or you can pass all the params on the command line and let it run uninterrupted:
java -jar datasetutil-<version>.jar --action <action> --u <user@domain.com> --dataset <dataset> --app <app> --inputFile <inputFile> --endpoint <endPoint>
Input Parameters
--action :"load" OR "defineExtractFlow" OR "defineAugmentFlow" OR "downloadxmd" OR "uploadxmd" OR "detectEncoding" OR "downloadErrorFile"
load: for loading csv
defineAugmentFlow: for augmenting existing datasets
downloadxmd: to download existing xmd files
uploadxmd: for uploading user.xmd.json
defineExtractFlow: for extracting data from Salesforce
detectEncoding: To detect the encoding of the inputFile
downloadErrorFile: To downloading the error file for csv upload jobs
--u : Salesforce.com login
--p : (Optional) Salesforce.com password; if omitted you will be prompted
--token : (Optional) Salesforce.com token
--endpoint : (Optional) the Salesforce SOAP API endpoint (production or sandbox). Default: https://login.salesforce.com/services/Soap/u/31.0; for sandboxes use https://test.salesforce.com/services/Soap/u/31.0
--dataset : (Optional) the dataset alias. Required if action=load
--datasetLabel : (Optional) the dataset label. Defaults to the dataset alias
--app : (Optional) the app/folder name for the dataset
--operation : (Optional) the operation for load (Overwrite/Upsert/Append/Delete) Default is Overwrite
--inputFile : (Optional) the input CSV file. Required if action=load
--rootObject : (Optional) the root SObject for the extract
--rowLimit : (Optional) the number of rows to extract; -1 = all. Default: 1000
--sessionId : (Optional) the Salesforce sessionId. If specified, you must also specify the endpoint (see the example after this list)
--fileEncoding : (Optional) the encoding of the inputFile. Default: UTF-8 (see the example after this list)
--CodingErrorAction : (Optional) what to do when input characters are not valid UTF-8: IGNORE|REPORT|REPLACE. Default: REPORT. If you change this option you risk importing garbage characters
--uploadFormat : (Optional) whether to upload as binary or csv. Default: binary
OR
--server : set this to true if you want to run in server mode and use the web UI. If you pass this param, all other params are ignored
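For example, the encoding and session parameters combine with a load roughly as follows; these are hedged sketches with placeholder values, not examples taken from the project. Run detectEncoding first, then pass the reported charset to the load (ISO-8859-1 here is just a hypothetical result):
java -jar datasetutil-<version>.jar --action detectEncoding --inputFile Opportunity.csv
java -jar datasetutil-<version>.jar --action load --u user@domain.com --dataset puntest --inputFile Opportunity.csv --fileEncoding ISO-8859-1
Or reuse an existing session instead of an interactive login, assuming the session id stands in for --u/--p and the endpoint points at the org's own instance (both values are placeholders):
java -jar datasetutil-<version>.jar --action load --sessionId <sessionId> --endpoint https://yourInstance.salesforce.com/services/Soap/u/31.0 --dataset puntest --inputFile Opportunity.csv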
java -jar datasetutils-32.0.0.jar --server true
java -jar datasetutils-32.0.0.jar --action load --u pgupta@force.com --p @#@#@# --inputFile Opportunity.csv --dataset puntest
java -jar datasetutils-32.0.0.jar --action downloadxmd --u pgupta@force.com --p @#@#@# --dataset puntest
java -jar datasetutils-32.0.0.jar --action uploadxmd --u pgupta@force.com --p @#@#@# --inputFile user.xmd.json --dataset puntest
java -jar datasetutils-32.0.0.jar --action defineAugmentFlow --u pgupta@force.com --p @#@#@#
java -jar datasetutils-32.0.0.jar --action defineExtractFlow --u pgupta@force.com --p @#@#@# --rootObject OpportunityLineItem
java -jar datasetutils-32.0.0.jar --action detectEncoding --inputFile Opportunity.csv
java -jar datasetutils-32.0.0.jar --action downloadErrorFile --u pgupta@force.com --p @#@#@# --dataset puntest
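For reference, the load examples above wrap the Analytics Cloud External Data API. Below is a minimal sketch of the equivalent raw REST calls, assuming a valid session id and instance URL; all values, and the exact payloads, are illustrative rather than taken from this utility's source:
# 1. Create the header record describing the upload (Action stays "None" while parts are uploaded)
curl -X POST "$INSTANCE/services/data/v31.0/sobjects/InsightsExternalData" \
     -H "Authorization: Bearer $SESSION_ID" -H "Content-Type: application/json" \
     -d '{"Format":"Csv","EdgemartAlias":"puntest","Operation":"Overwrite","Action":"None"}'
# 2. Upload the CSV (its first row must contain the column headers) as one or more base64-encoded parts
curl -X POST "$INSTANCE/services/data/v31.0/sobjects/InsightsExternalDataPart" \
     -H "Authorization: Bearer $SESSION_ID" -H "Content-Type: application/json" \
     -d '{"InsightsExternalDataId":"<id from step 1>","PartNumber":1,"DataFile":"<base64-encoded CSV chunk>"}'
# 3. Set Action to "Process" so the platform starts building the dataset
curl -X PATCH "$INSTANCE/services/data/v31.0/sobjects/InsightsExternalData/<id from step 1>" \
     -H "Authorization: Bearer $SESSION_ID" -H "Content-Type: application/json" \
     -d '{"Action":"Process"}'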
To build the jar from source, clone the repository and run the Maven build:
git clone git@github.com:forcedotcom/Analytics-Cloud-Dataset-Utils.git
mvn clean install
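If the build succeeds, Maven places the executable jar under the target/ directory (the exact artifact name below is an assumption; check the build output), and it runs the same way as the released jar:
java -jar target/datasetutil-<version>.jar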