Your Data Cloud Dictionary
Wondering how to navigate the Salesforce Data Cloud interface? Data streams, data ingestion, data lake objects, and so on... what do all these terms mean?
I gotcha! Here is the CDP dictionary detailing what each term means, in the order of execution, with some bonus notes!
Let's start.
1) Data Sources :- Data stored in different systems like Amazon S3, Salesforce CRM, Marketing Cloud, Azure, Snowflake, a website, or a mobile app, that we want to pull into Salesforce CDP.
2) Connector :- A way to allow two or more systems to communicate with each other.
Data Cloud can connect to the home org (the current Salesforce org where Data Cloud is provisioned) or to an external org, including sandboxes.
Notes:
- As of today, the Salesforce connector syncs data into Data Cloud every hour and performs a full refresh every other week.
- Data Cloud's access to Salesforce objects/records is exactly the access held by the connector user/integration user. Use permission sets in Salesforce to widen that user's access; 'View' permission will suffice (see the sketch after these notes).
- A History object/audit log cannot be brought into Data Cloud from Salesforce.
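One quick way to sanity-check that permissions note: log in as the integration user and list which objects it can actually query. Here is a minimal Python sketch using the simple-salesforce library (the credentials are placeholders); whatever this user cannot see, Data Cloud cannot see either.

```python
# A minimal sketch, assuming the simple-salesforce library and placeholder
# credentials, to check which objects the connector/integration user can read.
from simple_salesforce import Salesforce

# Log in AS the integration user that the connector uses (placeholders below).
sf = Salesforce(
    username="integration.user@example.com",
    password="password",
    security_token="security-token",
)

# The global describe lists every object; 'queryable' tells us whether this
# user can read it, i.e. whether Data Cloud could pull it in.
for sobject in sf.describe()["sobjects"]:
    if sobject["queryable"]:
        print(sobject["name"])
```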
3) Data Streams :- The means of getting datasets from a source system into CDP. Each external dataset creates one data stream, and each data stream creates one data lake object.
Data stream categories:-
a) Profile :- Select this if the data is about individuals. Segments can be created only on profile data.
b) Engagement :- Select this if the data is behavioral or engagement data, like events captured from a website. Records in this category always carry a timestamp field (see the sketch after this list).
c) Other :- Choose this if your data falls into neither of the above categories, e.g., product details.
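To make the Engagement category concrete, here is a hedged Python sketch of streaming one website event into Data Cloud. The tenant endpoint pattern, source/object names, and field names are illustrative assumptions modeled on a typical Ingestion API setup; your connector configuration defines the real ones. Note the mandatory timestamp field.

```python
# A toy sketch of streaming one engagement record into Data Cloud.
# Endpoint pattern, source/object names, and fields are assumptions.
import datetime
import requests

TENANT = "your-tenant.c360a.salesforce.com"  # hypothetical tenant endpoint
SOURCE = "website_connector"                 # hypothetical source API name
OBJECT = "page_view"                         # hypothetical object name

event = {
    "customer_id": "0031t00000AbCdE",        # ties the event to a profile
    "page_url": "https://example.com/pricing",
    # Engagement data always carries a timestamp (event time):
    "event_time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
}

resp = requests.post(
    f"https://{TENANT}/api/v1/ingest/sources/{SOURCE}/{OBJECT}",
    headers={"Authorization": "Bearer <access-token>"},  # OAuth token placeholder
    json={"data": [event]},
)
resp.raise_for_status()
```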
4) Data Lake Object :- Objects in Data Cloud that are created when data streams bring in data from external sources. DLOs store the data from the external source.
5) Data Explorer :- The interface in CDP for viewing the records present in each data lake object, data model object, or calculated insight.
6) Data Transforms :- Operations on the records of data lake objects, such as appending fields, splitting one field into several, changing case, or merging DLOs, so that the records can later be mapped correctly to Data Cloud's data model objects. Example: a data lake object has a single address field, but you need to divide it into separate fields like city, pin code, and country for the final mapping (see the sketch below).
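Transforms are built in Data Cloud's own interface, but the address example boils down to logic like this plain-Python sketch (the field names and the comma-separated format are assumptions for illustration):

```python
# A minimal sketch (plain Python, not Data Cloud's transform syntax) of the
# address example: splitting one DLO field into city, pin code, and country
# so each piece can be mapped to its own data model field.
def split_address(address: str) -> dict:
    """Assumes a 'city, pin code, country' layout; real data needs real parsing."""
    city, pin_code, country = [part.strip() for part in address.split(",")]
    return {"city": city, "pin_code": pin_code, "country": country}

record = {"name": "Asha Rao", "address": "Mumbai, 400001, India"}
record.update(split_address(record.pop("address")))
print(record)
# {'name': 'Asha Rao', 'city': 'Mumbai', 'pin_code': '400001', 'country': 'India'}
```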
7) Data Modelling :- The process of mapping source data from DLOs to data model objects. Data Cloud provides standard objects like Account, Contact, Brand, and Case, which can be selected so that fields from the data lake object are mapped onto them. You also have the option to create custom fields on data model objects (a mapping sketch follows the notes below).
IMP :
One data model object can be linked to multiple data lake objects, though this is not recommended. It's better to merge the two data lake objects with a data transform into a third data lake object and map that one to a single data model object.
One data model object can be linked to other data model objects. This is highly recommended, as it makes querying easy.
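Conceptually, data modelling applies a DLO-field-to-DMO-field mapping to every record. In Data Cloud this is configured point-and-click; the sketch below just shows the idea, with assumed field names:

```python
# An illustrative sketch of what data modelling does conceptually: apply a
# DLO-field -> DMO-field mapping to each record. Field names are assumptions.
FIELD_MAP = {              # DLO field      -> DMO (Contact) field
    "cust_first_name": "FirstName",
    "cust_last_name": "LastName",
    "cust_email": "Email",
}

def map_to_dmo(dlo_record: dict, field_map: dict) -> dict:
    return {dmo: dlo_record[dlo] for dlo, dmo in field_map.items()}

dlo_record = {"cust_first_name": "Asha", "cust_last_name": "Rao",
              "cust_email": "asha@example.com"}
print(map_to_dmo(dlo_record, FIELD_MAP))
# {'FirstName': 'Asha', 'LastName': 'Rao', 'Email': 'asha@example.com'}
```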
8) Data Spaces :- Data spaces are like business units within Data Cloud. For example, a company's USA region and UK region can have separate data spaces; likewise, records of different subsidiaries can be stored in different data spaces.
Create a data space and add criteria to bring records from selected data lake objects into it.
9) Segments :- Once our data is properly formatted and stored in the system under the desired data space, we can create segments on it. A segment is a group of records grouped together based on criteria (see the sketch below).
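Data Cloud builds segments in a drag-and-drop UI, but the underlying idea is just "profiles matching criteria". The field names and criteria in this sketch are assumptions for illustration:

```python
# A conceptual sketch of segmentation: a segment is the set of profile
# records that match some criteria. Fields and values are assumptions.
profiles = [
    {"email": "asha@example.com", "country": "India", "lifetime_spend": 1200},
    {"email": "ben@example.com",  "country": "UK",    "lifetime_spend": 90},
    {"email": "cara@example.com", "country": "India", "lifetime_spend": 40},
]

# Criteria: customers in India who have spent more than 100.
high_value_india = [
    p for p in profiles
    if p["country"] == "India" and p["lifetime_spend"] > 100
]
print(high_value_india)  # only asha@example.com matches
```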
10) Activation Targets :- A link to an external system where we want to use the segments created in Data Cloud. We can use the created segments to send data to different platforms such as Google Ads, Marketing Cloud, Amazon S3, Facebook, or Data Cloud itself.
While creating an activation target, we select which platform it will send data to.
11) Activations :- Created from a segment's page, an activation acts as the link between a Data Cloud segment and an activation target. Here we select the activation target and give our segment the name by which it will be identified in the other system (see the sketch below for what activated data landing in S3 might look like).
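For the Amazon S3 target, an activation effectively writes the segment members as a file into a bucket. Data Cloud does this for you once the target is configured; this boto3 sketch only illustrates the shape of the output, and the bucket, key, and fields are assumptions:

```python
# A toy sketch of what an activation to an Amazon S3 target amounts to:
# segment members written as a CSV into a bucket. Names are assumptions.
import csv
import io
import boto3

segment_members = [
    {"email": "asha@example.com", "first_name": "Asha"},
    {"email": "ben@example.com",  "first_name": "Ben"},
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["email", "first_name"])
writer.writeheader()
writer.writerows(segment_members)

s3 = boto3.client("s3")  # assumes AWS credentials are configured locally
s3.put_object(
    Bucket="my-activation-bucket",        # hypothetical bucket
    Key="segments/high_value_india.csv",  # hypothetical key / segment name
    Body=buffer.getvalue(),
)
```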