Data Migration: Using Salesforce Data Move Utility (SFDMU) for Commerce Cloud (B2B/D2C) Data

This post shows how to export and import Commerce Cloud data from one Salesforce org to another, making it easy to set up a developer sandbox.

Introduction

In this post, I will show how to use the Salesforce Data Move Utility (SFDMU) to export Commerce Cloud data such as Products, Categories, Catalogs, Entitlements, and Pricebooks from one org (for example, production) to another (for example, a developer sandbox). This is useful when you want to seed a developer sandbox with production data.

Salesforce provides sandbox refresh functionality that refreshes a sandbox and updates its metadata from the source org, such as production. One drawback of a sandbox refresh is that it doesn’t copy any actual data (records) to the target org unless the sandbox is a Full Copy or Partial Copy sandbox. It is possible to move records with the Data Loader application, but doing so becomes very complex when the records have complicated relationships. The Commerce Cloud data model is exactly that: the commerce data spans many objects with multiple relationships between them.

Developers work in their own environments, so they should be able to easily export and import commerce data from one org to another, since working with the latest data makes debugging easier.

What is SFDMU

The SFDX Data Move Utility (SFDMU) is an alternative to Data Loader for data migration. One of its biggest benefits is that it automatically takes care of the relationships between objects. SFDMU is an sfdx plugin, so it is quite easy to install. Follow the link for installation instructions.

Official links

How to configure SFDMU

We need to tell SFDMU which records to export from the source org and import into the target org. This configuration lives in the export.json file. Check the official documentation for all the properties available for configuration. Below, I have summarized a few of the important ones.

| Field | Example | Description |
| --- | --- | --- |
| operation | Upsert | Type of operation to perform |
| externalId | Name | Field to be used as the external ID for Upsert operations |
| query | SELECT Id FROM ProductCatalog | SOQL query selecting the records to move |
| deleteOldData | true | Delete old target data before importing |
| bulkApiVersion | 1.0 | Bulk API version |
| concurrencyMode | Serial | Concurrency mode for bulk operations |
| allOrNone | false | All-or-none mode for the REST API |

Note: We are using Bulk API v1 because it allows us to set the concurrency mode to Serial. Bulk API v2 does not support this mode; it always runs in parallel, which causes issues when data needs to be imported in a specific order.
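
To show how these properties fit together, here is a minimal, illustrative export.json sketch. It is not taken from the sample repository, and the split between script-level properties (bulkApiVersion, concurrencyMode, allOrNone) and object-level properties (query, operation, externalId, deleteOldData) is my assumption here, so verify it against the SFDMU documentation.

{
  "bulkApiVersion": "1.0",
  "concurrencyMode": "Serial",
  "allOrNone": false,
  "objects": [
    {
      "query": "SELECT Id FROM ProductCatalog",
      "operation": "Upsert",
      "externalId": "Name",
      "deleteOldData": true
    }
  ]
}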

I have added a sample repository that supports exporting and importing the following commerce data into our target org.

  • WebStore
  • ProductCatalog
  • ProductCategory
  • Pricebook2
  • PricebookEntry
  • Product2
  • ProductCategoryProduct
  • WebStoreCatalog
  • ProductAttributeSet
  • ProductAttribute
  • ProductAttributeSetProduct
  • CommerceEntitlementPolicy
  • CommerceEntitlementProduct
  • WebStoreBuyerGroup
  • WebStorePricebook
  • BuyerGroup
  • BuyerGroupPricebook
  • CommerceEntitlementBuyerGroup

Prerequisites

Before importing the data, we need to prepare the target org. For this blog post, I will be using a newly created developer sandbox refreshed from the production org. After your sandbox is ready to use, you need to manually create the commerce store: a sandbox refresh does not create the store out of the box, but it can be created with just a few clicks.

  1. Login to your newly created sandbox.
  2. Navigate to the “Commerce” application.
  3. Navigate to the “Stores” tab and click on the “Create a Store on Existing Site” action.
  4. It will ask you to select a store type and enter a store name.
  5. Select an appropriate store type and enter a store name. The store name should match the store name configured on production.
  6. After entering a store name, click Next and select the appropriate Site. The site name usually matches the store name.
  7. Click Next and your store should be created.
  8. Repeat the steps for all the stores you want to create.

Now we have successfully created a store. The next step is to configure the SOQL queries for the Product2, ProductAttribute, and ProductAttributeSetProduct objects. Every project has different custom fields on the Product2 object, and these custom fields are also used to create VariationParent and Variation products. As an example, in my current project we use the Size__c and Color__c fields to create variation products based on color and size.

So, you would have to modify the SOQL queries below to accommodate the custom fields for your project. If needed, you can also modify the queries for other objects to support your project’s needs.
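
For illustration only, a Product2 entry in plan-copy-commerce-data/export.json could look roughly like the sketch below. The Size__c and Color__c fields are the project-specific placeholders mentioned above, and using StockKeepingUnit as the external ID is just an assumption; adjust both to match your own org.

{
  "objects": [
    {
      "query": "SELECT Id, Name, StockKeepingUnit, IsActive, Size__c, Color__c FROM Product2",
      "operation": "Upsert",
      "externalId": "StockKeepingUnit"
    }
  ]
}

The ProductAttribute and ProductAttributeSetProduct queries would be adjusted in the same way to include whatever attribute fields your variations use.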

The next step is to start the export and import process.

Export and Import

The steps outlined below are generic and can be used to import data into a new or an existing sandbox.

First, make sure the SFDMU sfdx plugin is installed. After the plugin is installed, we need to authorize the source and target orgs, as SFDMU works with orgs that are already authorized and connected.

# Authorize the production org (source), or any other org you want to use as the source.
sfdx force:auth:web:login -a sourceorg -r https://login.salesforce.com

# Authorize the developer sandbox (target).
sfdx force:auth:web:login -a targetorg -r https://test.salesforce.com

Next, download the latest code from the repository. The repository has the following folders:

  • plan-delete-custom-pricebook-entries
    • This holds the configuration to delete any existing custom pricebook entries. It is mandatory to delete custom pricebook entries before deleting the standard ones.
  • plan-delete-standard-pricebook-entries
    • This holds the configuration to delete any existing standard pricebook entries.
  • plan-delete-non-variationparent-entitlements
    • This holds the configuration to delete any existing product entitlements for variation or simple products. It is mandatory to delete the entitlements for variation products before deleting the entitlements of the VariationParent products.
  • plan-delete-variationparent-entitlements
    • This holds the configuration to delete any existing product entitlements for VariationParent products.
  • plan-copy-commerce-data
    • This is the main configuration to export all the records mentioned previously and import them into the target org.

Because certain objects, such as pricebook entries and entitlements, need to be deleted in a specific order, it was necessary to move the configuration into separate folders. SFDMU supports the concept of objectSets: in the standard setup (a single export.json) it is not possible to specify the same object multiple times with different operations (insert/delete/upsert/update), and objectSets overcome that limitation. Unfortunately, in my tests I was not able to use this feature to combine the deletion of pricebook entries and product entitlements in the required order, so as a workaround I used separate folders.
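
As a rough sketch (assuming SFDMU's Delete operation behaves as described in its documentation, i.e. it removes the target records returned by the query), the export.json inside plan-delete-custom-pricebook-entries might look something like this:

{
  "objects": [
    {
      "query": "SELECT Id FROM PricebookEntry WHERE Pricebook2.IsStandard = false",
      "operation": "Delete"
    }
  ]
}

The other delete folders follow the same pattern, only with different queries (standard pricebook entries, non-VariationParent entitlements, VariationParent entitlements).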

We need to run the following commands in the specific order shown below to start the export/import process.

Note: If you are importing data into a newly created environment, you can skip the deletion of the pricebook entry and product entitlement records.

# 1st, delete all the existing custom pricebook entries from the target org.
# Note that we are using the org aliases "sourceorg" and "targetorg".
sfdx sfdmu:run -p ./plan-delete-custom-pricebook-entries --sourceusername sourceorg --targetusername targetorg --verbose

# 2nd, delete all the existing standard pricebook entries from the target org.
sfdx sfdmu:run -p ./plan-delete-standard-pricebook-entries --sourceusername sourceorg --targetusername targetorg --verbose

# 3rd, delete all the existing product entitlements for variation or simple products from the target org.
sfdx sfdmu:run -p ./plan-delete-non-variationparent-entitlements --sourceusername sourceorg --targetusername targetorg --verbose

# 4th, delete all the existing product entitlements for VariationParent products from the target org.
sfdx sfdmu:run -p ./plan-delete-variationparent-entitlements --sourceusername sourceorg --targetusername targetorg --verbose

# 5th, run the main export and import.
sfdx sfdmu:run -p ./plan-copy-commerce-data --sourceusername sourceorg --targetusername targetorg --verbose

While running the export/import process, certain warnings may be logged in the console. The SFDMU utility simply logs these warnings and continues. You can inspect them later and, if needed, tweak the export/import configuration. The logs are written to the respective plan folders, along with a target folder that holds CSV files of the imported data. These CSV files also contain an error column with the reason why a particular record was not imported, and a separate CSV file is created for each object operation.

If everything goes fine, you should see the data from the source org copied to the target org. After this, you need to do any project-specific configuration required for a functioning commerce store. In my case, I additionally had to manually configure the following:

  • Store integrations (Taxes, Shipping, Inventory, Prices, Payments)
  • Run search indexes
  • Create buyers
  • Publish the store via Experience Builder

Copy data to Production org

Can we use this utility to move data from a Full Copy sandbox to a production org?

Theoretically, yes, the utility can be used to move the data the other way around, but I don’t feel confident using SFDMU to move data to a production org. In my numerous tests, I repeatedly saw warnings about records not being imported. The warnings were mostly related to race conditions where data was imported in the wrong order. The number of records impacted was not high, around 1-5% of the total records being imported, but I would still suggest that you use this utility to move data to production at your own risk.

As an alternative, if the source of truth for product data is an external system such as an ERP, you can leverage that connection to move the data to the production org.