Schedule a Salesforce transfer
The BigQuery Data Transfer Service for Salesforce lets you automatically schedule and manage recurring load jobs from Salesforce into BigQuery.
Limitations
Salesforce transfers are subject to the following limitations:
- The BigQuery Data Transfer Service for Salesforce only supports the Salesforce Bulk API to connect to the Salesforce instance, and only supports the transfer of entities that the Salesforce Bulk API supports. For more information about supported entities, see 'Entity is not supported by the Bulk API' error.
- The minimum interval time between recurring transfers is 15 minutes. The default interval for a recurring transfer is 24 hours.
- The BigQuery Data Transfer Service uses Salesforce Bulk API v1 to connect to the Salesforce endpoint to retrieve data.
Before you begin
The following sections describe the steps that you need to take before you create a Salesforce transfer.
Salesforce prerequisites
Create a Salesforce Connected App with the following configurations:
- Generate a Security Token for the app.
Allow the OAuth username and password flow by doing the following:
- In the connected app, click Setup.
- In the Quick find field, search for OAuth and OpenID Connect Settings, and then enable the username and password flow setting.
Enable self-authorization for a user for the connected app by doing the following:
- In the connected app, click Setup.
- In the Quick find field, search for Manage connected apps.
- Click Edit on the connected app that you are using for the transfer run.
- Under OAuth policies, click the Permitted Users menu and select All users may self-authorize.
- Under IP Relaxation, select Relax IP restrictions.
You must also have the following Salesforce information when creating a Salesforce transfer:
Parameter name | Description
---|---
clientId | ClientId or Consumer Key of the Salesforce connected application.
clientSecret | OAuth Client Secret or Consumer Secret of the Salesforce connected application.
username | Username of the Salesforce account.
password | Password of the Salesforce account.
securityToken | Security Token of the Salesforce account. This security token is a case-sensitive alphanumeric code that is appended to the password. The security token is required when you access the Salesforce APIs from outside the Trusted IP range of your Salesforce configuration.
BigQuery prerequisites
- Verify that you have completed all actions required to enable the BigQuery Data Transfer Service.
- Create a BigQuery dataset to store your data.
- If you intend to set up transfer run notifications for Pub/Sub, ensure that you have the pubsub.topics.setIamPolicy Identity and Access Management (IAM) permission. Pub/Sub permissions are not required if you only set up email notifications. For more information, see BigQuery Data Transfer Service run notifications.
Required BigQuery roles
To get the permissions that you need to create a transfer, ask your administrator to grant you the BigQuery Admin (roles/bigquery.admin) IAM role.
For more information about granting roles, see Manage access.
This predefined role contains the permissions required to create a transfer. To see the exact permissions that are required, expand the Required permissions section:
Required permissions
The following permissions are required to create a transfer:
- bigquery.transfers.update on the user
- bigquery.datasets.get on the target dataset
- bigquery.datasets.update on the target dataset
You might also be able to get these permissions with custom roles or other predefined roles.
Set up a Salesforce data transfer
To create a Salesforce data transfer:
Console
In the Google Cloud console, go to the BigQuery page.
Click Data transfers > Create a transfer.
In the Source type section, for Source, choose Salesforce.
In the Transfer config name section, for Display name, enter a name for the transfer.
In the Schedule options section:
In the Repeat frequency list, select an option to specify how often this transfer runs. To specify a custom repeat frequency, select Custom. If you select On-demand, then this transfer runs when you manually trigger the transfer.
If applicable, select either Start now or Start at set time and provide a start date and run time.
In the Destination settings section, for Dataset, choose the dataset you created to store your data.
In the Data source details section, do the following:
- For Custom Domain, enter a custom login domain if applicable. If your Salesforce login URL is login.salesforce.com or test.salesforce.com, leave this field blank.
- For Salesforce Url, select the suffix that your Salesforce login URL ends with. If you select my.salesforce.com or sandbox.my.salesforce.com, these values are appended as a suffix to the custom domain that you provided in the Custom Domain field. For example, if you provided the custom domain mydomain and selected my.salesforce.com, the login URL is mydomain.my.salesforce.com.
- For Username, enter the username of the Salesforce account.
- For Password, enter the password of the Salesforce account.
- For Security token, enter the security token of the Salesforce account.
- For Client ID, enter the Consumer Key of the Salesforce connected application.
- For Client secret, enter the Consumer Secret of the Salesforce connected application.
- For Salesforce objects to transfer, click Browse to select the objects to transfer to the BigQuery destination dataset. You can also manually enter any objects to include in the transfer in this field.
In the Service Account list, select a service account associated with your Google Cloud project. The selected service account must have the required roles to run this transfer.
If you signed in with a federated identity, then a service account is required to create a transfer. If you signed in with a Google Account, then a service account for the transfer is optional.
For more information about using service accounts with data transfers, see Use service accounts.
Optional: In the Notification options section, do the following:
- To enable email notifications, click the Email notification toggle. When you enable this option, the transfer administrator receives an email notification when a transfer run fails.
- To enable Pub/Sub transfer run notifications for this transfer, click the Pub/Sub notifications toggle. You can select your topic name, or you can click Create a topic to create one.
Click Save.
bq
Enter the bq mk command and supply the transfer creation flag --transfer_config:

bq mk \
    --transfer_config \
    --project_id=PROJECT_ID \
    --data_source=DATA_SOURCE \
    --display_name=NAME \
    --target_dataset=DATASET \
    --params='PARAMETERS'
Where:
- PROJECT_ID (optional): your Google Cloud project ID. If --project_id isn't supplied to specify a particular project, the default project is used.
- DATA_SOURCE: the data source, salesforce.
- NAME: the display name for the transfer configuration. The transfer name can be any value that lets you identify the transfer if you need to modify it later.
- DATASET: the target dataset for the transfer configuration.
- PARAMETERS: the parameters for the created transfer configuration in JSON format. For example: --params='{"param":"param_value"}'. The following are the parameters for a Salesforce transfer:
  - connector.authentication.oauth.clientId: ClientId or Consumer Key of the Salesforce connected application.
  - connector.authentication.oauth.clientSecret: OAuth Client Secret or Consumer Secret of the Salesforce connected application.
  - connector.customDomainName (optional): the Salesforce custom login domain, if applicable. Leave empty to use the default login domain login.salesforce.com.
  - connector.authentication.username: the username of the Salesforce account.
  - connector.authentication.password: the password of the Salesforce account.
  - connector.authentication.securityToken: the security token of the Salesforce account.
  - assets: the path to the Salesforce objects to be transferred to BigQuery.
For example, the following command creates a Salesforce transfer in the default project with all the required parameters:
bq mk \
    --transfer_config \
    --target_dataset=mydataset \
    --data_source=salesforce \
    --display_name='My Transfer' \
    --params='{"assets":["Account"], "connector.authentication.oauth.clientId":"1234567890", "connector.authentication.oauth.clientSecret":"ABC12345", "connector.customDomainName":"MyDomainName", "connector.authentication.username":"[email protected]", "connector.authentication.password":"abcdef1234", "connector.authentication.securityToken":"a1hghbb44lnl465lbl75b"}'
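If you build the --params value in a script, serializing a dictionary avoids JSON syntax mistakes such as trailing commas or misspelled keys. A minimal sketch in Python, using only the parameter keys documented on this page; every credential value here is a placeholder:

```python
import json

# Salesforce transfer parameters. The keys are the documented
# --params keys for a Salesforce transfer; all values are placeholders.
params = {
    "assets": ["Account"],
    "connector.authentication.oauth.clientId": "1234567890",
    "connector.authentication.oauth.clientSecret": "ABC12345",
    "connector.customDomainName": "MyDomainName",
    "connector.authentication.username": "[email protected]",
    "connector.authentication.password": "abcdef1234",
    "connector.authentication.securityToken": "a1hghbb44lnl465lbl75b",
}

# json.dumps always emits a valid JSON object, which you can pass
# to bq mk as --params="$PARAMS_JSON".
params_json = json.dumps(params)
print(params_json)
```

The resulting string can be substituted directly for PARAMETERS in the bq mk command above.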
API
Use the projects.locations.transferConfigs.create method and supply an instance of the TransferConfig resource.
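The request body mirrors the bq flags shown earlier. A hedged sketch of what that body might look like, built with the Python standard library; field names follow the TransferConfig REST resource, and all values, including the schedule, are placeholders you would replace with your own:

```python
import json

# Candidate request body for projects.locations.transferConfigs.create.
# Top-level field names follow the TransferConfig REST resource;
# "params" uses the Salesforce keys documented on this page.
# All values are placeholders.
transfer_config = {
    "destinationDatasetId": "mydataset",
    "displayName": "My Transfer",
    "dataSourceId": "salesforce",
    "schedule": "every 24 hours",
    "params": {
        "assets": ["Account"],
        "connector.authentication.oauth.clientId": "1234567890",
        "connector.authentication.oauth.clientSecret": "ABC12345",
        "connector.authentication.username": "[email protected]",
        "connector.authentication.password": "abcdef1234",
        "connector.authentication.securityToken": "a1hghbb44lnl465lbl75b",
    },
}

body = json.dumps(transfer_config)
print(body)
```

You would POST this body to the transferConfigs.create endpoint for your project and location with an authenticated HTTP client.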
Pricing
There is no cost to transfer Salesforce data into BigQuery while this feature is in Preview.
Troubleshoot transfer setup
If you are having issues setting up your transfer, see Salesforce transfer issues.
What's next
- For an overview of the BigQuery Data Transfer Service, see Introduction to BigQuery Data Transfer Service.
- For information about using transfers, including getting information about a transfer configuration, listing transfer configurations, and viewing a transfer's run history, see Working with transfers.
- Learn how to load data with cross-cloud operations.