2 changes: 1 addition & 1 deletion messages/bulk.status.md
Original file line number Diff line number Diff line change
@@ -12,7 +12,7 @@ Run this command using the job ID or batch ID returned from the "<%= config.bin

<%= config.bin %> <%= command.id %> --job-id 750xx000000005sAAA

- - View the status of a bulk load job and a specific batches in an org with alias my-scratch:
+ - View the status of a bulk load job and a specific batch in an org with alias my-scratch:

<%= config.bin %> <%= command.id %> --job-id 750xx000000005sAAA --batch-id 751xx000000005nAAA --target-org my-scratch

2 changes: 1 addition & 1 deletion messages/bulk.upsert.md
@@ -16,7 +16,7 @@ By default, the job runs the batches in parallel, which we recommend. You can ru

- Bulk upsert records to the Contact object in your default org:

- <%= config.bin %> --sobject Contact --file files/contacts.csv --external-id Id
+ <%= config.bin %> <%= command.id %> --sobject Contact --file files/contacts.csv --external-id Id

- Bulk upsert records to a custom object in an org with alias my-scratch and wait 5 minutes for the command to complete:

2 changes: 1 addition & 1 deletion messages/data.resume.md
@@ -12,7 +12,7 @@ Run this command using the job ID or batch ID returned from the "<%= config.bin

<%= config.bin %> <%= command.id %> --job-id 750xx000000005sAAA

- - View the status of a bulk load job and a specific batches:
+ - View the status of a bulk load job and a specific batch:

<%= config.bin %> <%= command.id %> --job-id 750xx000000005sAAA --batch-id 751xx000000005nAAA

2 changes: 1 addition & 1 deletion messages/importApi.md
@@ -34,7 +34,7 @@ There are references in a data file %s that can't be resolved:

# error.RefsInFiles

- The file %s includes references (ex: '@AccountRef1'). Those are only supported with --plan, not --files.`
+ The file %s includes references (ex: '@AccountRef1'). Those are only supported with --plan, not --files.

# error.noRecordTypeName

2 changes: 1 addition & 1 deletion messages/record.update.md
@@ -6,7 +6,7 @@ Updates a single record of a Salesforce or Tooling API object.

Specify the record you want to update with either its ID or with a list of field-value pairs that identify the record. If your list of fields identifies more than one record, the update fails; the error displays how many records were found.

- When using field-value pairs for both identifying the record and specifiyng the new field values, use the format <fieldName>=<value>. Enclose all field-value pairs in one set of double quotation marks, delimited by spaces. Enclose values that contain spaces in single quotes.
+ When using field-value pairs for both identifying the record and specifying the new field values, use the format <fieldName>=<value>. Enclose all field-value pairs in one set of double quotation marks, delimited by spaces. Enclose values that contain spaces in single quotes.

This command updates a record in Salesforce objects by default. Use the --use-tooling-api flag to update a Tooling API object.

2 changes: 1 addition & 1 deletion messages/tree.import.md
@@ -22,7 +22,7 @@ Plan definition file to insert multiple data files.

# flags.plan.description

- Unlike when you use the `--files` flag, the files listed in the plan definition file **can** contain more then 200 records. When the CLI executes the import, it automatically batches the records to comply with the 200 record limit set by the API.
+ Unlike when you use the `--files` flag, the files listed in the plan definition file **can** contain more than 200 records. When the CLI executes the import, it automatically batches the records to comply with the 200 record limit set by the API.

The order in which you list the files in the plan definition file matters. Specifically, records with lookups to records in another file should be listed AFTER that file. For example, let's say you're loading Account and Contact records, and the contacts have references to those accounts. Be sure you list the Accounts file before the Contacts file.

2 changes: 1 addition & 1 deletion src/api/data/tree/importFiles.ts
@@ -74,7 +74,7 @@ const logFileInfo =
return fileInfo;
};

- /** check the tree files for references, throw error telling user they are only supported with `--plan */
+ /** check the tree files for references, throw error telling user they are only supported with `--plan` */
export const validateNoRefs = (fileInfo: FileInfo): FileInfo => {
if (hasUnresolvedRefs(fileInfo.records)) {
Messages.importMessagesDirectoryFromMetaUrl(import.meta.url);
2 changes: 1 addition & 1 deletion src/bulkIngest.ts
@@ -365,7 +365,7 @@ export const lineEndingFlag = Flags.option({
})();

/**
- * Use only for commands that maintain sfdx compatibility.1
+ * Use only for commands that maintain sfdx compatibility.
*
* @deprecated
*/
6 changes: 3 additions & 3 deletions src/bulkUtils.ts
@@ -324,11 +324,11 @@ export async function detectDelimiter(filePath: string): Promise<ColumnDelimiter
}

// default to `COMMA` if no delimiter was found in the CSV file (1 column)
- const columDelimiter = delimiterMap.get(detectedDelimiter ?? ',');
+ const columnDelimiter = delimiterMap.get(detectedDelimiter ?? ',');

- if (columDelimiter === undefined) {
+ if (columnDelimiter === undefined) {
throw new SfError(`Failed to detect column delimiter used in ${filePath}.`);
}

- return columDelimiter;
+ return columnDelimiter;
}
2 changes: 1 addition & 1 deletion src/commands/data/import/resume.ts
@@ -59,7 +59,7 @@ export default class DataImportResume extends SfCommand<DataImportResumeResult>

return bulkIngestResume({
cmdId: 'data import resume',
- stageTitle: 'Updating data',
+ stageTitle: 'Importing data',
cache: await BulkImportRequestCache.create(),
jobIdOrMostRecent: flags['job-id'] ?? flags['use-most-recent'],
jsonEnabled: this.jsonEnabled(),
2 changes: 1 addition & 1 deletion src/commands/data/update/record.ts
@@ -82,7 +82,7 @@ export default class Update extends SfCommand<SaveResult> {
const conn = flags['use-tooling-api']
? flags['target-org'].getConnection(flags['api-version']).tooling
: flags['target-org'].getConnection(flags['api-version']);
- // oclif isn't smart of enough to know that if record-id is not set, then where is set
+ // oclif isn't smart enough to know that if record-id is not set, then where is set
const sObjectId = flags['record-id'] ?? ((await query(conn, flags.sobject, flags.where as string)).Id as string);
try {
const updateObject = { ...stringToDictionary(flags.values), Id: sObjectId };