Getting started with Salesforce integration patterns | MuleSoft

Overview

The Salesforce Connector enables developers to create applications that react to common Salesforce events such as adding, changing, or deleting objects. With the Salesforce Connector, you can connect directly to the Salesforce APIs without needing to write any code. The only credentials you will need for the Salesforce Connector are a Salesforce developer account and a Salesforce security token. In many business applications, you will need to connect Salesforce to existing databases, ERP systems, and custom applications. These integrations can be easily built and executed using MuleSoft’s Anypoint Studio.

As a developer, you should be familiar with five common Salesforce integration patterns when integrating Salesforce with MuleSoft. You can read more about these patterns in the Top 5 Salesforce Integration Patterns whitepaper. In this tutorial, we will cover all five:

  1. Migration
  2. Broadcast
  3. Aggregation
  4. Bidirectional synchronization
  5. Correlation

Sign up for free to build your first Salesforce integration

The first step is to sign up for Anypoint Platform for free and create an account. If you already have an account, sign in.


To follow along with the assets used in this project, download the JAR file and import it into your Anypoint Studio project. To import it, go to File -> Import -> Packaged Mule application (JAR).

Migration

(Screenshot: the migration flow in Anypoint Studio)

Data migration is when you move a specific set of data from one system to another. This migration pattern applies to numerous Salesforce integration use cases, such as migrating data from a legacy ERP system to Salesforce or consolidating CRM systems. It is intended to handle large volumes of data, and you can process records in batches using a Batch Job.

In the above scenario, we have set up a flow that listens for requests to our HTTP endpoint. Once the endpoint is hit, the flow selects values from a database and inserts each of those values as a new lead in Salesforce. Let’s walk through how this was made and how you can build the same integration in your own Anypoint Studio project.

First, go to File -> New -> Mule Project, then open the Mule Palette. Add the HTTP module to your project and drag the HTTP Listener into your flow. Set the port to 8081 and the path to /salesforce. Next, go back to the Mule Palette and add the Database module to your project. Drag the Select connector into your flow.
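If you would like to see what Anypoint Studio generates behind the canvas, the listener portion of the flow looks roughly like the sketch below. This is only an approximation; Studio produces the exact XML for you, and names such as HTTP_Listener_config and migrationFlow are placeholders.

<http:listener-config name="HTTP_Listener_config">
    <http:listener-connection host="0.0.0.0" port="8081" />
</http:listener-config>

<flow name="migrationFlow">
    <!-- Triggers the flow on any request to http://0.0.0.0:8081/salesforce -->
    <http:listener config-ref="HTTP_Listener_config" path="/salesforce" />
    <!-- The Database Select, Transform Message, and Salesforce steps described below go here -->
</flow>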

Click on the Database connector and set up your connector configuration by clicking the green plus. Select MySQL Connection in the Connection dropdown at the top. Then click the Configure button and select Add recommended libraries to automatically assign drivers to your connector.

(Screenshot: the Database connector configuration in Anypoint Studio)

Once you add the JDBC Driver, add the following database credentials:

  • Host: congo.c3w6upfzlwwe.us-west-1.rds.amazonaws.com
  • Port: 3306
  • User: mulesoft
  • Password: mulesoft
  • Database: congo

Next, click the Test Connection button to verify that the connector can successfully connect to the database. Click OK. Then, in the Query field, add the following MySQL query:

  • SELECT * FROM contacts;
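In the generated XML, the database configuration and the Select operation look roughly like the following sketch. The configuration name Database_Config is a placeholder, and Studio fills in the exact attributes when you complete the connector dialog.

<db:config name="Database_Config">
    <!-- MySQL connection using the credentials listed above -->
    <db:my-sql-connection host="congo.c3w6upfzlwwe.us-west-1.rds.amazonaws.com"
                          port="3306" user="mulesoft" password="mulesoft" database="congo" />
</db:config>

<!-- Inside the flow, immediately after the HTTP Listener -->
<db:select config-ref="Database_Config">
    <db:sql>SELECT * FROM contacts</db:sql>
</db:select>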

Next, add the Transform Message Connector to your flow and copy and paste the following DataWeave code:

%dw 2.0
output application/json
---
payload map (item, index) -> {
    FirstName: item.FirstName,
    LastName: item.LastName,
    Email: item.Email,
    Company: item.Company
}

Next, add the For Each Connector to your flow. This will iterate through every value in the database and add a new lead for each database entry.

Next, go back to your Mule Palette and add the Salesforce module to your project. Drag the Salesforce Create connector inside the For Each scope. Set up your Salesforce Config and include your username, password, and security token. Once you Test Connection to verify everything is working, click OK. Under Type, select Lead, and under Records enter: [payload]
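A rough sketch of the XML this step produces is shown below. The Salesforce_Config name is a placeholder, and the exact child elements may differ slightly from what Studio generates for your connector version.

<foreach doc:name="For Each">
    <!-- Creates one Lead per database record; [payload] wraps the current item in a list -->
    <salesforce:create type="Lead" config-ref="Salesforce_Config">
        <salesforce:records><![CDATA[#[[payload]]]]></salesforce:records>
    </salesforce:create>
</foreach>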

Next, add a Batch Job to the flow, and drag the Create connector into the Batch Step scope. This will process the records asynchronously, limiting the number of API calls made to Salesforce.
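The Batch Job scope looks roughly like the sketch below; the job and step names are placeholders. Each record flows through the Batch Step asynchronously, and if you want to send records to Salesforce in groups you can also place the Create operation inside a Batch Aggregator within the step.

<batch:job jobName="migrateLeadsBatch">
    <batch:process-records>
        <batch:step name="createLeadsStep">
            <!-- The same Salesforce Create operation, now running inside a Batch Step -->
            <salesforce:create type="Lead" config-ref="Salesforce_Config">
                <salesforce:records><![CDATA[#[[payload]]]]></salesforce:records>
            </salesforce:create>
        </batch:step>
    </batch:process-records>
</batch:job>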

Nice job! You have just finished your first flow. When you right-click and run your project, then make a POST request to http://0.0.0.0:8081/salesforce using Postman, each entry in your database will be inserted as a new lead. Log in to Salesforce and, in the main navigation, go to Sales, then Leads to view all of the leads inserted from the database.

(Screenshot: the imported leads in Salesforce)

Broadcast

(Screenshot: the broadcast flow in Anypoint Studio)

The broadcast pattern moves data from a single source system to multiple destination systems in real-time. This is considered to be a “one-way sync” and is optimized for processing records as quickly as possible. Broadcast patterns are also used to keep data up to date between multiple systems.

In the above screenshot, we have a flow that executes when Salesforce detects that a new lead has been added. When that happens, the flow transforms the message payload and writes that message to two local CSV files. Once both of those actions execute, it prints to the console that the flow has executed successfully.

The Scatter-Gather Connector requires both actions to complete before it moves to the next step in the flow. This is very powerful if you need to write to different ERP systems, databases, etc., and then continue the flow execution once the data has been sent to all of those systems. Let’s walk through how we created this integration:

To start, drag the On Modified Object Connector from the Mule Palette onto the canvas. This will create a new flow. Click on the On Modified Object Connector and select Lead under Object type. Next, add a Transform Message Connector to your flow and paste in the following DataWeave code:

%dw 2.0
output application/json
---
[payload] map {
    "FirstName": payload.FirstName,
    "LastName": payload.LastName,
    "Company": payload.Company,
    "Email": payload.Email
}
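Behind the canvas, the trigger for this flow is a polling Salesforce source. The sketch below shows the general shape of the XML; the element and attribute names are approximate, and Studio generates the exact source configuration (including the polling frequency) for you.

<flow name="broadcastFlow">
    <!-- Polls Salesforce for Lead objects that have been added or modified since the last check -->
    <salesforce:modified-object-listener objectType="Lead" config-ref="Salesforce_Config">
        <scheduling-strategy>
            <fixed-frequency frequency="5000" />
        </scheduling-strategy>
    </salesforce:modified-object-listener>
    <!-- Transform Message (the DataWeave above), then Scatter-Gather with two File Writes -->
</flow>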

Next, in the Mule Palette, under Core, select the Scatter-Gather Connector. The Scatter-Gather component sends the same message to multiple message processors in parallel, and the flow waits until all of them have executed successfully before continuing. In this case, we are going to add two File Write Connectors to the Scatter-Gather. For each File Write, add the following DataWeave code under Content:

%dw 2.0
output application/csv header = false, headerLineNumber = 0
---
payload map (payload01, indexOfPayload01) -> {
    FirstName: payload01.FirstName as String,
    LastName: payload01.LastName as String,
    Company: payload01.Company as String,
    Email: payload01.Email as String
}

Under Path, enter the folder location where you would like these files to be created. End the path with Accounts.csv for the first Write and Accounts2.csv for the second. Make sure you select APPEND under Write Mode.
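Wired together, the Scatter-Gather and the two File Write operations look roughly like the sketch below. The folder path is just an example (use whatever location you chose above), and Studio wraps the Content expression for you.

<scatter-gather>
    <route>
        <file:write path="/path/to/output/Accounts.csv" mode="APPEND">
            <!-- Content is the CSV DataWeave shown above -->
            <file:content><![CDATA[%dw 2.0
output application/csv header = false, headerLineNumber = 0
---
payload map (payload01, indexOfPayload01) -> {
    FirstName: payload01.FirstName as String,
    LastName: payload01.LastName as String,
    Company: payload01.Company as String,
    Email: payload01.Email as String
}]]></file:content>
        </file:write>
    </route>
    <route>
        <!-- Second write runs in parallel; same Content, different file name -->
        <file:write path="/path/to/output/Accounts2.csv" mode="APPEND">
            <file:content><![CDATA[... same DataWeave as the first route ...]]></file:content>
        </file:write>
    </route>
</scatter-gather>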

That’s it! Now when you add a new lead to Salesforce, this flow will execute and create two matching CSV files on your machine with the lead information appended. You can use similar logic to broadcast this data to other systems in parallel using Scatter-Gather.

Aggregation

(Screenshot: the aggregation flow in Anypoint Studio)

Aggregation is the simplest way to extract and process data from multiple systems into one application. With the aggregation pattern, developers can easily query multiple systems and merge data to output to their target system. Common use cases for aggregation include merging CSV files together and sending the desired output to Salesforce.

In the above screenshot, we have a flow that executes when our HTTP endpoint is hit with a POST request. When the flow executes, it queries two CSV files located on different servers, merges the data together, and uploads the resulting entries as new leads in Salesforce. The first CSV file contains fname, lname, company, email, and uuid. The second CSV file contains uuid, annualrevenue, and phone. The data transformation uses the UUID as a key to add Phone and AnnualRevenue to the proper JSON objects built from the first CSV file. Essentially, we are merging the two data sets to create one single output.

To develop this integration, we first drag an HTTP Listener onto the canvas. Then we drag in a Scatter-Gather Connector, followed by two HTTP Request connectors. Each HTTP Request is set to the GET method, and the URLs are:

https://mulesoft.s3-us-west-1.amazonaws.com/userdata.csv

https://mulesoft.s3-us-west-1.amazonaws.com/moreuserdata.csv

Then drag two Set Variable Connectors into the flow, set each one equal to payload, and name them csv1 and csv2.
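One way to wire this up is sketched below, with each Set Variable placed inside its own Scatter-Gather route; that placement is an assumption of this sketch rather than something dictated by the pattern, and the rest of the XML is likewise approximate.

<scatter-gather>
    <route>
        <!-- Fetch the first CSV file and keep it in a variable for the transform that follows -->
        <http:request method="GET" url="https://mulesoft.s3-us-west-1.amazonaws.com/userdata.csv" />
        <set-variable variableName="csv1" value="#[payload]" />
    </route>
    <route>
        <!-- Fetch the second CSV file -->
        <http:request method="GET" url="https://mulesoft.s3-us-west-1.amazonaws.com/moreuserdata.csv" />
        <set-variable variableName="csv2" value="#[payload]" />
    </route>
</scatter-gather>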

Next, drag a Transform Message as the next component in the flow. Add the following DataWeave code to the Transform Message:

%dw 2.0
import * from dw::core::Arrays
output application/json
---
vars.csv1 map (item, index) -> {
    FirstName: item.fname,
    LastName: item.lname,
    Email: item.email,
    Company: item.company,
    AnnualRevenue: vars.csv2.AnnualRevenue[indexOf(vars.csv2.uuid, item.uuid)],
    Phone: vars.csv2.Phone[indexOf(vars.csv2.uuid, item.uuid)]
}

This code looks up each record’s UUID in the second CSV file and uses the resulting index to pull the corresponding AnnualRevenue and Phone values into the merged record. We are essentially matching records by UUID, then inserting the corresponding values where appropriate.

Finally, add a For Each Connector, and inside of the For Each add the Salesforce Create Connector. Under Type, select Lead, and under Records enter: [payload]

Add a Batch Job to the flow, and drag the Create connector into the Batch Step scope, just as in the migration example.

Nice job! You have now successfully created an aggregation integration between multiple systems and Salesforce.

Bidirectional synchronization

(Screenshot: the bidirectional sync template flow in Anypoint Studio)

Bidirectional sync is the act of uniting two or more data sets from two or more different systems so they behave as one system that recognizes the existence of the different data sets. This type of integration is useful when different tools or systems, each needed in its own right and for its own specific purpose, must accomplish different functions on the same data set. When you apply bidirectional sync to Salesforce, you can use Salesforce as the primary system of record and synchronize it with a secondary system such as a database or ERP system. Bidirectional sync enables each system to perform optimally while maintaining data integrity across both synchronized systems. It also provides the flexibility to add and remove systems modularly without worrying about losing data.

In the above screenshot, the flow bidirectionally syncs accounts between a Salesforce instance and a database instance. The flow fetches data from accounts that have been added or modified in either Salesforce or the database. For each of those accounts, the integration triggers an insert or an update depending on whether the object already exists in the target instance, taking the last modification of the object as the one that should be applied.

To try this template yourself, check out the file hosted on Anypoint Exchange, which you can download and run right in your Anypoint Studio project.

Try it out in Exchange

Correlation

Correlation and bidirectional sync are very similar, but the patterns have one critical difference. Whereas bidirectional synchronization will create new records if they are found in one system and not the other, correlation only synchronizes records that already exist in both systems. The correlation pattern is not discerning about where objects originated; it agnostically synchronizes objects as long as they are found in both systems.

Correlation is useful for cases in which two groups or systems want to share data, but only for records that both systems already have, representing the same items or contacts in reality. The correlation pattern is most useful when extra data is more costly than beneficial, as it scopes out the “unnecessary” data. For example, hospitals in the same health care network may want to correlate patient data for shared patients across hospitals, but it would be a privacy violation to share patient data with a hospital that has never admitted or treated the patient.

With the correlation pattern, the most important consideration is the definition of the term “same” across records. This definition can vary by industry, and the consequences of an unclear definition vary as well. For example, in targeting offers to customers, matching on name may be close enough; however, in a hospital, relying on a name could have serious consequences if two patients have the same name and different courses of treatment. Consider what can occur, for both correlation and bidirectional sync, when the definition of “same” is too strict, too lax, or accurate.

Conclusion

Thank you so much for reading this tutorial on how to build integrations to connect to Salesforce. We hope that covering each integration pattern with a working example helps paint the picture of how powerful the MuleSoft platform is, and how easy it is to integrate with the Salesforce platform using MuleSoft. If you want to read more introductory developer content on MuleSoft, please visit the developer tutorials homepage.
