","\f Preface xix","When we wish to draw your attention to a particular part of a code block, the relevant
lines or items are set in bold:","Any command-line input or output is written as follows:"," ","Bold: Indicates a new term, an important word, or words that you see onscreen. For
instance, words in menus or dialog boxes appear in bold. Here is an example: \"To do this,
navigate to the User Settings menu and select Developer Settings.\""," Tips or Important Notes"," Appear like this.","\fxx Preface","Get in touch
Feedback from our readers is always welcome.
General feedback: If you have questions about any aspect of this book, email us
at [email protected] and mention the book title in the subject of
your message.
Errata: Although we have taken every care to ensure the accuracy of our content, mistakes
do happen. If you have found a mistake in this book, we would be grateful if you would
report this to us. Please visit www.packtpub.com/support/errata and fill in
the form.
Piracy: If you come across any illegal copies of our works in any form on the internet,
we would be grateful if you would provide us with the location address or website name.
Please contact us at [email protected] with a link to the material.
If you are interested in becoming an author: If there is a topic that you have expertise
in and you are interested in either writing or contributing to a book, please visit
authors.packtpub.com.

Share Your Thoughts
Once you've read Automating Salesforce Marketing Cloud, we'd love to hear your thoughts! Please click here to go straight to the Amazon review page for this book and share your feedback. Your review is important to us and the tech community and will help us make sure we're delivering excellent quality content.

Section 1: Automation Theory and Automations in SFMC

In this first section, you will get an understanding of the main ideas behind automation
theory and the automation-related components inside of SFMC.
This section contains the following chapters:

 • Chapter 1, What Is Automation?
 • Chapter 2, SFMC Automation Tools
 • Chapter 3, SFMC Automation Best Practices

1 What Is Automation?

Automation has become so ingrained in our society that we almost take it for granted. It is part of our daily activities and has helped us achieve many wonderful and innovative inventions. The odd thing, though, is that despite automation being such a day-to-day presence, not many people can actually explain what it means or where it came from. To get us all in the right mindset, we will first go over what exactly automation and automation theory are.

In this chapter, we will cover the following topics:

 • Automation theory: A dive into the history of automation, the theory behind why
automation is useful, and how it should be used
• Defining automation: Solidifying the meaning behind our focus in the book by
defining exactly what an automation is
• Automation opportunities and pitfalls: A broad overview of the different benefits
and potential risks that go along with automation
• Concepts of automation: Basic concepts related to automation and the areas
they affect
 • Implementation best practices: Some basic best practices to follow when implementing automation
• Always Be Documenting (ABD): A case for why you should be documenting each
automation and solution you create

With this knowledge, we will be prepared to move forward and fully digest the automation
capabilities and possibilities related to Salesforce Marketing Cloud, as discussed further in
this book.","Automation theory
In some ways, automation theory is self-feeding: each advancement becomes a building
block for further, larger, and more significant advances. This means that before we dig
into the exact meaning of automation, we should look at the history of automation and
automation theory over the years. This rich history built the automation theory and
capabilities that we know and love today, and it offers great insight into the future of
automation as well.
From the development of the first computer, all the way to recent advances in AI,
automation theory has had a significant and life-altering impact on humanity and history.
Each one of these developments and inventions has helped forge the way for further
automation and efficiency in an exponential way. These advances have created a steady
push forward to remove all the manual aspects of any activity or process to help improve
the speed, efficiency, and possibilities of the results.
Heck, we could probably even go all the way back to the very beginnings of mankind's
rise to find the roots of automation theory. The very first automation could have been
something as simple as the repeating loop our roaming ancestors took when hunting
and/or gathering, or even utilizing a tool or their environment to remove repetitive actions in
their day-to-day life. Automation does not need to involve technology as we view it today!
Without automation theory, the computer, which changed the entire world we live in
today, would never have existed. And that is only the most recent example of how automation has
changed our lives. Look at the world's first moving assembly line for mass production
created by Henry Ford (https://corporate.ford.com/articles/history/
moving-assembly-line.html). Without that innovation, our entire world would be
irrevocably different.
Automation theory has helped advance and shape technology and theories throughout the
years, including such recent advances as the following:

 • Control functions (electronic digital computers)
 • Programming languages and capabilities (computers and machines)
 • A vast array of sensor technology (light, electromagnetic, kinetic, and so on)
 • Advanced mathematical theory of control systems (evolved during WWII)
 • Artificial intelligence (robotics and learning systems)

These advances have set the stage for the highly interconnected and digital world that
we experience today. The simultaneous development and maturation of many
technologies, in particular information technologies, have enabled vast levels of
interdependence and communication between disparate systems that were
previously impossible. While not a new feature of this development, automation has
recently taken center stage in discussions about the future of business, communication,
production, and many other aspects of our lives.
To help understand this, I wanted to provide a quick visual representation of automation
theory as we just described it.

 Figure 1.1 – Visual representation of automation theory

Now that we have a strong understanding of the history of automation and how it has affected our lives, let's figure out what exactly it is.

Automation definition
The dictionary defines automation as the technique of making an apparatus, a process,
or a system operate automatically (Merriam-Webster: https://www.merriam-
webster.com/dictionary/automation). That's a bit generic for our purposes, so
let's consider it in an IT context to make it more applicable to our domain.
Automation is the use of repeated instructions to create a process that replaces previously
used human intervention in order to accomplish a task. There, that's a bit better, right?
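To make that definition concrete, here is a deliberately tiny sketch in Python (the subscriber records and the message text are invented for illustration): one set of repeated instructions replaces the human step of writing each message by hand.

```python
# A manual, repetitive task: greeting each new subscriber by hand.
# Automation replaces it with repeated instructions applied to every record.

def build_welcome_message(subscriber: dict) -> str:
    """Produce the same message a person would have typed by hand."""
    return f"Welcome, {subscriber['first_name']}! Thanks for subscribing."

# The 'automation': one set of instructions, repeated for every subscriber,
# with no human intervention per record.
def automate_welcomes(subscribers: list[dict]) -> list[str]:
    return [build_welcome_message(s) for s in subscribers]

messages = automate_welcomes([
    {"first_name": "Ada"},
    {"first_name": "Grace"},
])
```

Whether the list holds two records or two million, the instructions are the same: that is the "repeated instructions replacing human intervention" at the heart of the definition.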
It has become a buzzword in our modern era, and its application has re-shaped the ways
that we interact and live within our world. From home appliances to robotic rovers on
other worlds, automation has found its way into the fabric of our technical and social
world. In a world driven by speed, connectivity, and efficiency, it's not hard to see why it's
become such a central focus of many organizations and technologies today.
One of the more obvious consequences of this transformative concept, and one relevant
to this book, has been the dramatic shift in the ways that both business and individuals
communicate and interact with each other in emerging digital spaces. As the rate at which
aspects of our lives occur within these spaces increases, so too have the expectations for
instantaneous communication and services tailored to the personalized needs of individuals.
This has presented challenges for sure, but has also created tremendous opportunities for
organizations that have embraced this digital transformation and have sought to redefine
their business and processes to operate in this space.
This digital transformation has been critical for businesses to meet emerging customer
demands and expectations, and has become a requirement to compete in this new global,
digital world. This does not come without cost, however, and businesses today must
be more cognizant than ever before of how their organization uses technology, both
internally and externally.
One of the key components and performance multipliers in digital transformation is
automation. Incorporating automation as a central feature in business processes, aside
from introducing an innovative mindset to an organization, introduces efficiency in costs
and performance that can have dramatic impacts when well planned and thoughtfully
applied. In that vein, let's move onward to learn about the different opportunities and
pitfalls that come along with automation.

Automation opportunities and pitfalls

Before diving further into the concept of automation, it's helpful to consider exactly why it can be such an important component of business processes and what risks are associated with its implementation.

First, automation is essential for reducing both financial and productivity costs associated
with repetitive, time-consuming, and error-prone tasks. These could range from manual
data-entry processes to complex monitoring or communication activities. In addition to
its impact on these sorts of tasks, when well planned, it can introduce a single standard
of quality that can be replicated and adhered to over time.

Opportunities
Let's take a deeper look into the benefits that can be extracted from incorporating
automation into your business processes. First, I wanted to start with a quick visual to
show the benefits, and from there, we will dive deeper into each aspect.

 Figure 1.2 – Visualization of the benefits of automation

As you can see from each of the bars in the visual, there are a ton of great opportunities available. Now, let's take a deeper dive into these benefits.

Increased efficiency
One of the simplest benefits of incorporating automation is that it increases the efficiency
with which tasks or events can be completed. When repetitive or time-intensive tasks are
eliminated, it frees up resources to do more innovative and impactful work. This provides
an immediate cost benefit as resources can be allocated to other impactful projects while
also reducing the strain on employees, all without a loss in productivity.
Its impact in a customer-facing context can be even more beneficial. As customers demand
real-time engagement and services, being able to meet them in the moment is critical.
Whether it's automated communication for a password reset request, or a one-click online
checkout, creating sustainable automated processes that can deliver on customer requests
immediately is a massive benefit and has become an expectation in our digital world.

More reliable solutions

When it comes to any solution or project, consistent and reliable quality is an important factor for success. With manual solutions, where human input and decision making are ever-present and necessary for the completion of a task, there are risk factors that will inevitably impact the overall stability and reliability of the performance. Humans, while our capacities for attention and skill are prodigious, are subject to factors that reduce work quality and performance. Whether it's a poor night's sleep, long periods of intense focus, or multi-tasking, attention to detail and performance will vary across individuals and circumstances.

This limitation provides an opportunity for automation. A well-defined, automated process suffers no degradation of attention, nor does it experience performance dips due to fatigue or distraction. In an interconnected global space, having an automated system that can operate 24 hours a day to meet challenges and complete tasks is important to a process that needs to be responsive on demand.

Expanded opportunities
While not initially apparent, automation can actually improve efficiency and
productivity across teams in an organization by exposing the nature of your current
business processes. Before a system can be automated, those underlying processes must
be thoroughly analyzed, both to identify opportunities where automation is feasible
and to define what it's meant to accomplish.
Exposing these underlying procedures encourages conversations about the current
goals and future state of the program, which can help improve overall quality and
foster innovation.
In addition to this, automation can work as a centralizing force for many disparate
processes that may otherwise be isolated and poorly understood. By combining data and
workflows into a centralized system that is capable of both producing and receiving input
or completing a task, you can act on events automatically and extend the capabilities of
your solutions while increasing productivity.
Pitfalls
Now that we've covered just a few of the benefits that automation can bring to an
organization, it's important to also consider the potential risks or downsides that
can come from incorporating this as well.

High initial investment

It has been said that nothing in life is free, and automation, unfortunately, is no exception. First, there are the obvious financial costs that come with implementing automation solutions. Whether it's purchasing cloud storage, training teams to learn new skills and adjust to new workflows, or just the amount of planning and development needed to implement the solution, the decision to automate a process should not be taken lightly.

Secondly, it is important to consider the opportunity and productivity costs that take away from other current or planned initiatives. Attention is a finite resource, and draining it for a new initiative comes at the expense of other priorities.

When you carefully consider the amount of planning, development, testing, documentation, and training that can go into a proposed automation, it can become daunting to undertake for some, and perhaps impractical for others.

Less ability to pivot quickly

In a perfect world, all your solutions and initiatives are performing so well that it's time to just set it and forget it, right? Er, maybe not.

Even with proper planning and strategies when developing automated solutions, unexpected shifts can leave your well-defined process at a disadvantage. Perhaps it's an urgent marketing need, or a new legal requirement that must be implemented to stay compliant, but automation can make you inflexible at an inopportune time.

By their nature, automated solutions are likely narrow and well-defined, which can leave them vulnerable to sudden shifts in goals that invalidate the existing implementation.

You get out what you put in

It seems self-explanatory, but your solution is only as good as the strategy and the team behind it.
Failing to adequately account for edge cases, resourcing, stable deployment processes, or any number of other factors in the project life cycle can have a significant impact on the overall performance of the automated solution.

While it's been noted that automation can increase the stability and quality of your
workflows, it should be remembered that you get out what you put into it; poorly
designed automation can cascade issues much more dramatically than a manual process can.
Now that we've taken a look at some of the general advantages and disadvantages of automation, let's explore some select core concepts so we can get a better grasp of what exactly automation looks like.

Concepts of automation
Many individual concepts within automation help define what it encompasses technically.
In this section, let's focus on a few that are more applicable to common business scenarios
or solutions within Software as a Service (SaaS).
To help visualize these concepts, please reference the following diagram:

 Figure 1.3 – Visualization of the concepts of automation

Now that we have seen the visualization, let's take a deeper dive into exactly what these concepts mean.

Infrastructure
This is an important piece of the automation puzzle, and one not to be taken lightly.
Thankfully, most SaaS platforms will largely take care of this issue upfront. You begin
with a provisioned, powerful cloud environment that can scale to your needs and contains
robust APIs for interacting both internally and externally with the platform. Problem
solved? Yes, but also no.
First, let's explore exactly what an API is. API is an acronym for Application Programming
Interface: a piece of intermediary software that allows two applications to talk to each
other. APIs are essentially messengers for your applications. I like to view an API as
a phone call between two friends: you can share information, receive information, make
plans, cancel plans, and more over a call. Although you each have phones, voices, and so
on, without the connection provided by the call (the API), you could not exchange these
messages short of direct, in-person interaction. Now, back to the SaaS infrastructure.
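To ground the phone-call analogy, here is a minimal, self-contained sketch (the "API", its action names, and the payloads are all invented for illustration): one function plays the application exposing an API, and a second application places the call and reads the structured response. In a real integration, the same exchange would travel over HTTP.

```python
import json

# One application exposes an 'API': it accepts a request and returns a response.
# In a real integration this would sit behind an HTTP endpoint; here it is an
# in-process function so the request/response exchange is easy to see.
def subscriber_api(request: str) -> str:
    payload = json.loads(request)
    if payload.get("action") == "get_status":
        # The 'answer' to the phone call: structured data the caller can use.
        return json.dumps({"email": payload["email"], "status": "subscribed"})
    return json.dumps({"error": "unknown action"})

# The second application 'places the call' without knowing the internals.
request = json.dumps({"action": "get_status", "email": "ada@example.com"})
response = json.loads(subscriber_api(request))
```

Neither side needs to know how the other works internally; they only agree on the shape of the call and the response, which is exactly the role the API plays.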
While most SaaS platforms do resolve a lot of issues around provisioning and
configuration, the platform may be only one piece of your larger workflow. Perhaps there is a business
need for automating some communication with an external service at dynamic points
in a customer life cycle? While some built-in tools can provide us with a way to track
milestones in a customer journey, integrating custom services may call for
more complex solutions that rely on outside integrations.
Maybe you want to utilize your SaaS for individual functions within a larger automated
workflow for your organization and not as a standalone product. Considering the
road that your solutions run on is an important step in planning and designing
a technical solution.
When selecting the environment that will house a given component of your solution,
consider how well it integrates with the other platforms you've defined. Knowing how
your systems are going to talk to each other is a key step of the planning phase and can
drive the direction of your technical solution.

Triggers
Triggers are the events that flag an activity or process for some response or action by
your automated solution. They can be initiated by both humans and machines and are the
primary method by which your automated solution is compelled to complete its tasks.
A common form of trigger in most SaaS platforms is the API entry point. Usually, an API
call into the platform carries information that, once the call is received (the trigger to
take an action), is used to create a new action or process, such as returning data from
the platform to the requestor.
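As a rough sketch of that pattern (the event types and handler names are invented for illustration), a trigger is simply an incoming event that the automation routes to the action it compels:

```python
# A trigger is an incoming event; the automation maps it to an action.
# Event types and handlers here are invented for illustration.

def send_password_reset(event: dict) -> str:
    return f"reset link sent to {event['email']}"

def send_confirmation(event: dict) -> str:
    return f"confirmation sent to {event['email']}"

# The routing table: which trigger compels which action.
HANDLERS = {
    "password_reset_requested": send_password_reset,
    "form_submitted": send_confirmation,
}

def on_trigger(event: dict) -> str:
    handler = HANDLERS.get(event["type"])
    if handler is None:
        return "ignored: no handler for this event"
    return handler(event)

result = on_trigger({"type": "form_submitted", "email": "ada@example.com"})
```

The trigger itself carries the information (here, the event type and email) that the automation needs in order to decide what to do next.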
Another example might be a form hosted on a website, and integrated with an external
service or custom solution. The user triggers the event by submitting the form on the
website, and an automated solution is called by that event to take some action (say, send
an email confirmation).
\f12 What Is Automation?","These types of events can take many forms and differ in both type and the complexity of
the action being triggered. With a webhook, as in our form submission scenario above, the
action fires automatically as soon as an event takes place in our environment.
Using something like an API, for instance, requires data to be requested and for
a response to be generated based on the validity of that request. The main takeaway is
that there is a wide range of possible trigger events, but the core concept remains
consistent.

Data collection
Data collection often involves a form being completed, but it can also involve data
being automatically extracted or received from a system. For most automated processes,
the validity and robustness of your data store can define the scope or functionality of
the solution.
While there are use cases where a method of data ingestion or collection is not needed,
and the trigger itself is the only required source of action, most processes will require
some form of data collection to generate an automated solution.
Planning how your data will be ingested and stored should be a priority when developing
the flow of your automated solution. Ensuring your data store has a consistent structure
and schema, and that the data you're storing is properly formatted and reliably populated,
are both key to ensuring that you can retrieve it reliably when needed and that it will
integrate well with your solution.
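One lightweight way to enforce that consistent structure is a validation step run before each record reaches the data store. Here is a minimal sketch with an illustrative schema (the field names are assumptions for the example, not a real platform schema):

```python
# A minimal validation step: reject records that don't match the expected
# schema before they reach the data store. The schema is illustrative only.
EXPECTED_SCHEMA = {"email": str, "first_name": str, "opt_in": bool}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is clean."""
    problems = []
    for field, expected_type in EXPECTED_SCHEMA.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected_type):
            problems.append(f"bad type for {field}: {type(record[field]).__name__}")
    return problems

clean = validate({"email": "ada@example.com", "first_name": "Ada", "opt_in": True})
dirty = validate({"email": "ada@example.com", "opt_in": "yes"})
```

Records that fail the check can be quarantined or logged rather than silently corrupting the store, which keeps downstream retrieval reliable.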
Important in this regard will be data validation, particularly when capturing or importing
data from user-input or external services. Your solution will only be as good as the data
you collect for it, so take care to make sure it's as clean and consistent as possible.

Information routing
Information routing involves moving data between people or systems where business
rules and logic dictate where data needs to travel next. This is a common concept that
can be found in solutions both internal and external to most SaaS platforms. When
implementing a solution with complex data flows containing multiple sources
or destinations, understanding the routing necessary to pull everything together into
a final source requires careful consideration of both your individual components and
how they depend on and interrelate with one another. Sequencing, timing, and monitoring
are especially critical and can be the difference between a successful or failed program.
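The idea can be sketched in a few lines (the rules and destination names are invented for illustration): business logic inspects each record and decides where it travels next.

```python
# Business rules decide where each piece of data travels next.
# Rules and destination names here are purely illustrative.

def route(record: dict) -> str:
    """Return the name of the next system this record should be sent to."""
    if not record.get("email_valid", False):
        return "error_queue"          # bad data never reaches downstream systems
    if record.get("is_new_customer", False):
        return "welcome_journey"      # new customers enter the onboarding flow
    return "crm_sync"                 # everyone else syncs to the CRM

destinations = [
    route({"email_valid": True, "is_new_customer": True}),
    route({"email_valid": True, "is_new_customer": False}),
    route({"email_valid": False}),
]
```

Keeping the rules in one place like this makes the sequencing visible and auditable, rather than scattered across the individual components.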
For external solutions or those that involve multiple systems integrating within the same
core initiative, it becomes even more important to understand the information routing
occurring within the solution. In this scenario, you're accounting for the flows across various
services that are subject to failures, timing inconsistencies, and misconfigured trigger
events, among others.","Activity tracking
How can we define the success or failure of a program if we don't have a reliable way of
tracking its performance against our defined goals? Tracking what happens from end to
end allows a process to be audited and measured in order to improve the solution or
to highlight points of failure that make it ill-suited for automation.
When integrating with external systems, this becomes even more critical, as there are
more points for failure that must be accounted for and tracked reliably at the request level.
Taking a proactive approach to logging, in addition to constructing your solution for ease
of reporting and accountability, can help you catch errors before they propagate and
surface obvious points for remediation and revision.
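As a minimal sketch of that proactive approach (the step names and step bodies are invented for illustration), each step of a process can record its outcome so an end-to-end run can be audited later:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("automation")

audit_trail: list[tuple[str, str]] = []  # (step, outcome) pairs for later review

def run_step(name: str, func, *args):
    """Run one step of the process, recording success or failure end to end."""
    try:
        result = func(*args)
        audit_trail.append((name, "ok"))
        log.info("step %s succeeded", name)
        return result
    except Exception as exc:
        audit_trail.append((name, f"failed: {exc}"))
        log.error("step %s failed: %s", name, exc)
        raise

run_step("import_data", lambda: ["row1", "row2"])
run_step("send_messages", lambda rows: len(rows), ["row1", "row2"])
```

Because every step writes to the same trail, a failed run shows exactly where the process broke, rather than leaving you to reconstruct events after the fact.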
With these core concepts in hand, let's take a look at some common best practices to
consider when building automated solutions.

Implementation best practices

As we've indicated previously, while automation can be a powerful multiplier for efficiency and productivity, there are no guarantees. Poor planning or implementation can lead to solutions that fail to meet the need at hand or, worse, negatively impact your initiatives as a whole. While many important aspects of the automation development life cycle can dictate the overall success of the project, by far the most important step is planning and strategy.

Moving forward without a clear idea of both your objectives and the detailed solutions to meet them can lead to runaway scope, poorly designed systems, and unnecessary complexity. Let's take a look at a few best practices to keep in mind during this crucial step of the automation process.

Starting small
While it may be enticing to jump head-first into automating large and complex portions
of your processes, the universally accepted best practice is to start small and target those
items that will return the highest value for the lowest level of effort. Work with related
teams to identify processes that are time-consuming or error-prone, particularly those
that are both simple and repetitious.
Finding routine, manual processes that are low-risk and high-visibility ensures that you're
both providing real value with your solution while not impacting your current initiatives
in the event that something goes awry. Processes with low exception rates, that do not
require human intervention, should be considered in this regard as well.
Ideally, you should consider those processes that are stable, well documented, and with
well-defined risk and cost. Getting buy-in from internal teams is important as well, so
looking for tasks where the savings from automation can be quantified and measured
can drive support and adoption to get your solution into production.
Try to avoid the Rube Goldberg effect, where an automated solution becomes so complex
that no one but its creator has a hope of understanding its flow or intent. Processes that
are atomic, reusable, and simple on their own are ideal.

Understanding the process

The term flying blind has a negative connotation for a reason. Approaching your planning and strategy without an adequate understanding of the systems or processes that you will be automating is a recipe for failure. Understanding how a given process is structured end to end, including how it operates both internally and with external systems, is crucial to delivering a successful solution on time. A failure to account for some factor during your planning stage can derail a project, or kill it completely, if your solution is no longer viable or runs over the projected costs.

Understand the process well, including why each of its steps is performed in its current sequence. It is important to know why things are performed in their current state, as they might not be suitable for automation, or only partially suitable. It can be counter-productive to automate four tasks when only two of them make sense and the other two could potentially even increase workload via automation.

Rather than fully automating in bulk, automating 'helpers' as part of a process can be vastly more efficient than forcing a full process to be automated. Sometimes, checks and balances are required that simply cannot be automated. Trying to force them into an automated process could greatly increase the risk, increase the level of effort on other teams to do the manual checking amid the automation, and so on.

Sticking to the objective

While automating data management, email deployment activities, or custom workflows and processes, you must always stick to the original objective outlined in your planning and strategy process.
The introduction of a new business need or functionality, or a new feature within the platform, can be alluring to attach to a project still in development, but it is important to stay focused on the initial needs and requirements outlined at the beginning.

By staying on track with your initial scope, you eliminate the possibility of new additions
either slowing down your implementation or derailing it altogether. Adding additional
functionality in the middle of a project introduces unnecessary risks and has the potential
to change the very objective of the solution you've previously outlined. So, while it might
be tempting to turn your next data management process into a full-fledged machine
learning solution, it's best to avoid this and to look for areas to enhance or branch for
future projects.
These are just a few of the very important considerations and practices to take into
account when getting started with the planning and strategy portion of the automation
project life cycle. Obviously, there are other important practices to keep in mind during
the development and testing phases of your project that have been touched on earlier in
this chapter. One more that we should call out here is testing.
While you may think that your well-planned and developed project is immune to failure,
you must identify likely exceptions and develop a testing strategy before you even begin
your development. Having a solid testing plan and execution can reduce the error rate of
your automation processes and can help drive more adoption and innovation across other
processes in your organization.
Now, let's cover another best practice that is critical to the development and continued
success of an automation project: documentation.

ABD – Always Be Documenting

You've written well-thought-out and concise code, so you don't really need to document it, right? Wrong. We're all familiar with the experience of reviewing the code comprising a feature, where the nature of its structure, or even its intent, isn't immediately obvious. Worse, we may not know how it integrates with other systems or parts of the solution. By not documenting your code and configuration, and how they integrate with other parts of your solution, you've limited the group of people familiar with it to a single person, or small team, while introducing the risk that someone may unwittingly impact your solution or the services it relies on.

The primary reason that documentation is ignored is time and conflicting priorities. Development doesn't occur in a vacuum, and it's not often that we have the time to stop everything and focus on documentation for our solutions. Apart from design and development, we also have to consider unit testing, user acceptance criteria, quality assurance, and code reviews (among others).

Documentation is easily pushed to the side for other priorities and not considered essential. In reality, it can be one of the most important factors for the final, and continued, success of a solution.

Regardless of the task you are automating, it is very likely that you or your team will
have to revisit the solution at some point in the future. The purpose of certain blocks of
code, exceptions that need to be considered, or even the configuration itself might have
faded with time in the author's mind, and those new to the project may be completely
lost. By not documenting your process thoroughly, you incur extra costs whenever
revisiting the solution requires time and effort to unravel its meaning or purpose. Worse
still, you add risk by missing a key component that, while common knowledge during
your development phase, has since been forgotten.
As developers, when approached with a new system or project, our first inclination is
likely to seek out and review the documentation. Next time someone wants to understand
your solution, you can simply point them to your documentation. It saves time and effort
for you and gives them a helpful reference so that they can self-learn without being
dependent on your time and availability.
In addition to this, documentation can make you both a better developer and a better team member. When working on an automated solution, it can be very easy to develop tunnel vision around a specific configuration or block of code and lose sight of how it fits holistically into the overall structure of the project. The process of creating documentation during development ensures that you always keep the purpose of your project in focus, and it shapes the way that you create the individual components of your solution. The result is also an easy reference for yourself, or your colleagues, that can aid collaboration among team members and increase both the stability and quality of your solution.

Documentation best practices
Now that we understand some of the advantages of documentation, let's take a look at some best practices that will make it more useful to those reading it.
First, understand who the audience is for your documentation. Will it be other developers, or are you providing supplementary documentation for marketing or business teams? Including detailed code blocks and descriptions of their functionality may be critical for developers, but it's sure to drive away non-technical members of your team. Structure your documentation to match the expectations of its intended audience in order for it to be both readable and effective.
Secondly, create a short but descriptive summary of your solution that explains the
main purpose and intent of the project as a whole. This will help readers grasp the purpose of the solution that you've automated as well as its relevance to their work or other business processes. Also, be sure to provide a description of the main components of your solution, noting any dependencies that may exist within your project. If your solution relies on APIs or third-party libraries, be sure to include their versions in your documentation as well.
Be generous with your coding examples when writing documentation for developers. A detailed explanation of what a given block is meant to accomplish, and how to both use and test its functionality, can save everyone time and effort when reviewing individual components of your solution. It will also make your code more readable, as its intent is clearly stated, leaving your solutions less reliant on inline comments or naming conventions to express it.
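To make this concrete, here is a small, hypothetical example of the kind of self-documenting block described above: the docstring states the intent, arguments, failure mode, and a runnable usage example, so a reviewer never has to reverse-engineer the body (the function and its names are invented for illustration):

```python
def normalize_email(address: str) -> str:
    """Normalize an email address for deduplication.

    Lowercases the address and strips surrounding whitespace so that
    'USER@Example.COM ' and 'user@example.com' compare as equal.

    Args:
        address: The raw email address as captured from a form or import.

    Returns:
        The normalized address.

    Raises:
        ValueError: If the address does not contain exactly one '@'.

    Example:
        >>> normalize_email("  USER@Example.COM ")
        'user@example.com'
    """
    cleaned = address.strip().lower()
    if cleaned.count("@") != 1:
        raise ValueError(f"not a plausible email address: {address!r}")
    return cleaned
```

Notice that everything a colleague needs — purpose, inputs, outputs, and a testable example — lives alongside the code itself.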
In summary, while documentation can be seen as a distraction or as non-critical, it plays a key role in both the development and sustainability of your solution going forward. By empowering yourself and your team with a clear and detailed reference, you bank future savings of time and effort and ensure that your solutions are more stable and of higher quality.

Summary
We have now reached the end of our first chapter! It has been a lot of fun so far, and I hope it was the same for you. You should now have a solid understanding of not only what automation and automation theory are, but also the general concepts associated with them, along with a general idea of the opportunities and pitfalls of automation and best practices for implementing it. Finally, we emphasized the high importance of documentation for automation and its associated best practices.
With these new skills and knowledge, we are ready to move forward to the next chapter, where we will begin our journey into exploring Salesforce Marketing Cloud and the automation tools available inside it.
2
SFMC Automation Tools
Every SaaS worth its salt has automation capabilities. They have moved from being a cool feature to being necessary for success. Any good system will offer a variety of options to help optimize workflows or create efficiencies in them. Whether they are baked-in and user-friendly, or more developer-focused and demanding of technical skills and knowledge, there are almost always some automation capabilities. These could be a ton of different
things, such as the following:

 • A drag-and-drop scheduling tool
 • A canvas-based pathing and logic tool
 • Server scripting capabilities
 • Widget functionality
 • API capabilities

Each one of these gives you the ability to interact with the platform outside of general UI
capabilities, allowing for efficiencies and automation. This statement is pretty much true
regardless of whether those platforms focus on data, messaging, or one of the million
other services that can be offered.
For this book though, we will be concentrating on automation in relation to just one
specific platform – Salesforce Marketing Cloud. We will explore many aspects of this
platform, such as the following:

 • Salesforce Marketing Cloud: A brief history of the platform and details on what it is
and what it does
 • Multi-channel and cross-channel marketing: A look at what multi-channel and cross-channel marketing are and how they differ
• Automation tools in Marketing Cloud: A dive into the built-in tools inside of
Marketing Cloud
• Journey Builder overview: A broad overview of what Journey Builder is in regard to
Salesforce Marketing Cloud
• Automation Studio overview: A broad overview of what Automation Studio is in
regard to Salesforce Marketing Cloud
• Comparing Automation Studio and Journey Builder: A review and comparison
between Journey Builder and Automation Studio

Learning about these tools and aspects will provide us with a strong base to build out our
automations. The more we learn about where the platform came from, the history of the tools, and the details of how each tool works and interacts with the others and with the platform itself, the more we will be able to do in the future.
Let's not get too far ahead of ourselves yet, though. Instead of thinking about the future, let's dive a bit deeper into the past and learn about Salesforce Marketing Cloud and where it came from.

Salesforce Marketing Cloud
Salesforce Marketing Cloud (SFMC) is a Salesforce platform that concentrates on multi-channel messaging and 1:1 customer journeys. Marketing Cloud originated back in December of 2000 as a company called ExactTarget. The tool was originally focused solely on email marketing messaging and started fairly small and simple. From its beginnings as ExactTarget to the current Salesforce-owned iteration, Marketing Cloud has gone through many forms and changed direction more than once.

The ExactTarget years
The platform grew steadily over time, as it was a stable and versatile option with customer service above and beyond what was offered elsewhere. ExactTarget became a hit and continued to grow year after year. In late 2009, ExactTarget went international and established an office in London.
and iGoDigital among others in 2012 – increasing revenue and capabilities exponentially.
These acquisitions gave ExactTarget more capabilities in multi-channel marketing and
integrations. The platform shifted from just email to a more marketing hub-focused
system, offering a lot of different capabilities and functionalities that placed them as
a market leader.
This got ExactTarget a lot of attention, and bigger companies started recognizing its potential. This is when Salesforce got involved, purchasing ExactTarget in 2013. This is where the name Salesforce Marketing Cloud originated, as the tool was rebranded and changed to fit within the Salesforce ecosystem. The official rebranding happened in 2014.

Salesforce acquisition
As Salesforce worked to integrate the tool within its structure and model, there were
a lot of shifts away from the way things were in ExactTarget. Through these changes,
users gained capabilities such as a new pseudo-CMS capability via Content Builder,
new capabilities with data extensions, and integrations with other Salesforce Clouds
and products!
This platform grew from just a single focused email marketing messaging tool into
a world-class, top-of-the-market platform for building and managing personalized
multi-channel journeys and messaging. This included implementing SMS/MMS
capabilities, mobile push notifications, social media, digital advertising, and more.
These changes led Marketing Cloud further and further from being just an email
marketing platform and instead turned it into a full-service marketing tool that could
effectively ingest, implement, execute, and report on all aspects of digital marketing. This
vision and innovation have led SFMC to become a leader in the market and one of the
most popular marketing tools available. Now that we know a brief history of the tool, let's
move on to a look at multi-channel and cross-channel marketing inside the platform.

Multi-channel and cross-channel marketing
Before digging into Marketing Cloud's foray into multi-channel and cross-channel marketing, I want to make sure we are all aware of what they are. Without that knowledge, the platform's capabilities don't make as much sense or show the same level of impact, so here are some quick insights into each.

Multi-channel marketing
Multi-channel marketing leverages multiple communication channels and mediums, such as the following:

 • Email marketing
• Mobile SMS/MMS and push
• Direct mail
 • Social media

Leveraging multiple channels allows you to unify the numerous interactions your
customer has with the marketing messaging they are receiving. Almost all messaging services now offer multi-channel marketing, but even just a couple of years ago, this was a very strong selling point and something that only a few platforms had the capability to do (and do well). The following diagram gives a good visual representation of what we are
talking about with regard to multi-channel marketing.

Figure 2.1 – Visualization of multi-channel marketing

Now that we have a good visualization, let's take a deeper dive into why you would want to use multi-channel marketing.

Why use multi-channel marketing?
Now that we know what multi-channel marketing is, the question we need to ask is why is it important? What sort of value could you receive for all the added effort and organization required to utilize multiple different channels for your marketing campaigns?
Some of the major benefits are as follows:

 • Expanded reach through the utilization of multiple channels.
 • The ability to communicate with customers through their preferred channels.
 • The combination of channels helps to unify messaging and perception.
 • Increased engagement through broader reach with a unified strategy.

Now that we have a strong understanding of what multi-channel marketing is and why we
would use it, let's move on to the similar, but distinct, cross-channel marketing.

Cross-channel marketing
Cross-channel marketing is essentially people-based marketing. This is because it focuses
on a combination of various marketing channels integrated together to form a more
cohesive customer journey for your target audience. Through this integration and
cross-pollination of messaging, you are able to let the customer control how they receive
your messaging. This is basically the next step after multi-channel marketing as it takes
that and further integrates and inter-connects messaging. Here is a good visualization
of cross-channel marketing and it helps to differentiate multi-channel marketing from
cross-channel marketing.

Figure 2.2 – Visualization of cross-channel marketing

As to how this visualization helps to show differentiation, you will notice that in
multi-channel marketing, it all stems from the same place, but then sends the messages
out across multiple mediums without them being connected in purpose, whereas the
cross-channel marketing diagram shows that all the marketing mediums are being used
collaboratively to engage the subscriber, rather than engaging separately. Now that
we have learned what cross-channel marketing is, let's look at some of its uses.","Why use cross-channel marketing?","So, we now get what cross-channel marketing is and how it is different from multi-channel","marketing – but what does that mean for the benefits? What benefits does it offer and how","are they different from what multi-channel marketing offers?","Here are the major benefits of cross-channel marketing:"," • Optimized and holistic customer journeys"," • Deeper personalization and customer preference for a 1:1 experience"," • Increased engagement through more relevant messaging and touch points"," • All messaging on one platform for easy access for your marketers"," • Efficiencies and time saving for implementation and updating","With these benefits, I feel like we have found a good understanding of cross-channel
marketing and are ready to move forward with the story of Marketing Cloud as it evolves
beyond just email.

Marketing Cloud evolves beyond email
As with many companies in this market, Salesforce Marketing Cloud started as an Email Service Provider (ESP) to help with your email marketing needs. While email is a highly efficient and profitable messaging channel, there are other channels that can help capitalize on and optimize customer communications. By focusing on email alone, you lose a ton of potential touchpoints that may be preferable or more effective methods of connecting with some customers.

For these reasons, Marketing Cloud grew into multi-channel marketing through
acquisitions and new services/improvements being developed, such as the following:

 • Radian6
• Buddy Media
 • iGoDigital
 • CoTweet

This allows you to utilize the same platform for pretty much all of your marketing
messaging needs. The issue is that these capabilities were still pretty siloed; although you could create and use them on the same platform, they were not fully integrated in a way that let you utilize them easily in tandem.
After some further improvements to the platform, they were able to build a more cross-channel approach inside their two built-in automation tools. Cross-channel marketing is very different from multi-channel marketing, as it goes a step beyond offering each service, integrating every medium together to form a single, unified messaging journey/flow.

Times They Are A-Changing
As time goes on, the platform continues to grow, and there is likely to be more and more added to it. At the time of writing, I have no way of knowing what that will be. So, I highly recommend looking through Marketing Cloud release notes and official announcements for any changes or new features released after this book has been published to ensure you are working with the most recent and up-to-date information.

Now let's start to dig into the platform and see what makes it such an impressive tool.

Automation tools in Marketing Cloud
Part of what helped Marketing Cloud grow to be so powerful and so capable was the innovative way they baked in automation tools. While most platforms preferred their automation capabilities to be more developer-friendly and less marketer-friendly, Marketing Cloud did both. They worked to give marketers capabilities similar to those of developers and technical experts, without the high-level prerequisite technical knowledge or skill, while also further enabling technical users to create custom solutions and integrations within the platform.

As these tools grew in capability and popularity, Marketing Cloud realized that these were
items that needed to be bundled into every single enterprise, rather than having them
as add-ons or optional. This way, companies could hit the ground running with some of
the strongest and most capable automation tools available and not have to worry about
further purchase orders or additional costs after the initial contract.
There are many different avenues that you can use to automate in SFMC. You can custom-build automation via third-party integrations and connections, raw API calls, and internal scripting using proprietary languages, as well as via marketer-focused internal automation tools. A good portion of the more customizable options are developer-oriented, requiring a heavy technical skillset, but to counter this, Marketing Cloud also includes two very robust automation tools that are more marketer-friendly and require much less technical skill to utilize well.

Marketing Cloud tools
For this part, we will be keeping the focus on the baked-in automation capabilities of SFMC. Inside SFMC, there are two very distinct and powerful tools that can help us automate and make our lives easier, with little to no technical skill required to operate them efficiently. These tools are the following:

 • Journey Builder: Focused on 1:1 journey mapping and custom interactions.
 • Automation Studio: Batch-oriented activities to increase performance, efficiencies, and capabilities.

Each of these tools is pretty well renowned and extremely powerful in different ways.
Automation Studio tends more towards the general idea behind automation, hence the
name. Although it does not require a technical skillset to be utilized, the more technical
knowledge you have, the more powerful Automation Studio becomes.
Journey Builder tends to be more trigger-based and 1:1, focused on pathways and the integration of multiple channels and activities. It also requires much less technical knowledge and skill to utilize, making it less reliant on technical resources. This opens it up to non-technical marketers with little to no reduction in capabilities, allowing for a shorter ramp-up and a lower prerequisite skillset.
In the next section, I will be going in-depth about what each of these tools is and will give
further details on them. Each of these tools will greatly help to make your marketing more
efficient and effective.
Journey Builder overview
It's no surprise, in our current digital world, that customers both want and expect companies to respond in real time to their needs. Organizations that can deliver on these requests are at a significant advantage as they can build both brand loyalty and reputation while strengthening their own bottom line.
So, how can we provide customers with real-time responses and engagement in the moment? Enter Journey Builder. Journey Builder is an SFMC solution for providing complex 1-to-1 marketing experiences across channels that can both meet and anticipate customer needs so that organizations can deal with their customers individually in the moment.
So, just how does this solution enable 1-to-1 real-time marketing experiences? To understand that, we need to dive into the details of just what Journey Builder is and what it does. First, let's take a look at the types of journeys that are configurable within the platform.

Single Send journeys
The first type of journey that can be configured in Marketing Cloud is the Single Send journey. This journey type allows for creating single, one-time communications for email, SMS, and push marketing channels. The primary feature to note with this journey type is that it is limited to a single instance of an activity, so repeated communications are not possible. Another consideration is the type of audience that can be utilized within this journey type. While email and SMS both allow only data extension entry sources, push communications require an audience configured in Mobile Connect.

Use cases
Single Send journeys can be a viable option when data is already easily available or aggregated within Marketing Cloud and the communication needs are both immediate and relatively simple. While there are other solutions that can provide this
same functionality, such as Automation Studio, some users may find the centralization
of the content, entry source, and scheduling options more intuitive with the Journey
Builder interface.
Transactional Send journeys
The Transactional Send journey type functions, more or less, as a UI implementation of the Transactional Messaging API. Unlike the raw API, which offers no configuration or development settings within the UI, the Transactional Send journey type allows non-technical resources to set up the necessary API endpoints that developers can use to immediately send a transactional email message from the journey. While the entry source must be configured as a Transactional Messaging API endpoint, and only email messages are supported, there are some benefits to this approach over more traditional transactional sends in Email Studio. First, the Transactional Messaging API executes on a newer messaging system than traditional methods, which allows it to be both more performant and able to handle more throughput consistently compared to traditional triggered sends in the platform. A slight difference, due to this, is that the ability to select a priority for a message has been deprecated. With this new approach, all messages are deployed as soon as they are able to be, so you can be assured that every transactional journey is reaching your customer as fast as possible.

Use cases
This journey type is ideal for any scenario where you need to enable real-time transactional messaging with a reduced need for technical knowledge to set it up. Within minutes, marketers can configure this journey to accept entries and enable development teams to start integrating sends directly with the journey.
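As a rough sketch of what the developer side of that integration might look like, the snippet below builds the URL and payload for the Transactional Messaging API's `POST /messaging/v1/email/messages/{messageKey}` pattern. The subdomain, keys, and recipient values are placeholders for illustration, and a real call would also need an OAuth access token in the request headers:

```python
import json

def build_transactional_send(subdomain: str, message_key: str,
                             definition_key: str, contact_key: str,
                             email: str, attributes: dict) -> tuple[str, str]:
    """Build the URL and JSON body for a transactional email send.

    Follows the Transactional Messaging API shape:
    POST https://{subdomain}.rest.marketingcloudapis.com
         /messaging/v1/email/messages/{messageKey}
    """
    url = (f"https://{subdomain}.rest.marketingcloudapis.com"
           f"/messaging/v1/email/messages/{message_key}")
    body = json.dumps({
        "definitionKey": definition_key,  # identifies the journey's send definition
        "recipient": {
            "contactKey": contact_key,
            "to": email,
            "attributes": attributes,  # values merged into the message content
        },
    })
    return url, body

# Hypothetical values for illustration only:
url, body = build_transactional_send(
    "mc123", "order-1042", "password-reset", "C-001",
    "jane@example.com", {"FirstName": "Jane"})
```

Here `message_key` is a caller-supplied identifier for tracking the individual send, which is why it appears in the URL rather than the body.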
Another added benefit is the set of metrics integrated directly into the UI, which allows users to monitor important send statistics such as ongoing error counts and the depth of the current queue in the journey. Whether it's password reset requests or order confirmation email sends, marketers will find this to be an easy method to get
their transactional real-time communications into production.

Multi-Step journeys
The final journey type, and arguably the most powerful and important, is the Multi-Step journey. Unlike the previous two journey types, the Multi-Step journey allows for maximum flexibility and control in regard to both the entry and activity types that your journey implements. Journeys can be triggered from a variety of sources, such as data extensions, API events, and real-time changes in Sales and Service Cloud, among others.
In addition to this flexibility in the entry source compared to the other journey types,
Multi-Step journeys allow true cross-channel marketing capabilities rather than a more
defined experience outlined in the other types. By allowing users to build journeys as
complex as their business needs require, and in the channel their customers prefer, this
has become the de facto journey type for most organizations and use cases.
Use cases
Multi-Step journeys are particularly useful for complex or sequenced journeys that need to perform actions across channels or time periods, or that require more advanced integration for entry or decision points across the journey. This type allows for deep integration with the core
Salesforce platform that isn't possible in the other journey types, with users able to read,
create, and modify their CRM data in real time. This type also comes pre-configured with
several templates that capture the most common use cases that organizations encounter,
such as welcome series or abandoned cart journeys.
Now that we have some basic understanding of the various types of journeys that can be
created and configured within the Marketing Cloud, let's take a look at some of the global
configurations that aid marketers in both configuring and monitoring their lifecycles.
It should be noted that while many of these configurations apply solely to the Multi-Step
journey type, understanding their function can help inform both your selection
of journey type as well as the flow of your customer experience.

Journey Builder configuration overview
When building a multi-channel marketing journey, one of the most important distinctions to be aware of is the two types of data that can be actioned. This distinction can make or break your journey implementation and has a large impact on decision-making as to how your flow is structured and how any data dependencies are executed. Let's take a look at each data type and how they differ.

Types of data
One of the most important considerations for marketers, and developers alike, is how
and what data will drive their journey processes from entry to exit. Understanding
exactly what data you need, and how your internal and external processes can support
it, underlies every touchpoint in the journey lifecycle and can help drive strategy and
technical discussions even in the planning phase of your projects. In Marketing Cloud,
there are two data types that must be understood in order to effectively plan and build
journeys: contact data and journey data.

Journey data
When creating the entry source for your journey, you need to define the exact data fields
and values that will be necessary to meet the criteria for entry. In addition, perhaps you have some immediate action that will occur, such as an email being sent, where some critical data is needed to carry out that activity effectively. With such a small window for an external integration or data process to pass data to Marketing Cloud, it's important that we have a way of retrieving those values the moment a contact has entered the journey. This is where the concept of journey data is key.
Journey data is a snapshot of data that is captured upon journey entry and is available to
be retrieved and compared at any stage within the journey. While you may access these
values at any time, data will remain static for the contact throughout the experience in the
journey and will reflect the values as they were captured on entry. This allows us to action
activities immediately on journey entry as well as utilize these values for later touchpoints
in the flow. Unfortunately, since these values are static, they aren't well suited to
accommodate certain use cases. Let's say we want to exit a contact who made a purchase
from our website after they had entered the journey. How can we action their purchase
data when our journey data for their purchase history is no longer valid? For that,
we'll turn to contact data.

Contact data
Contact data differs significantly from journey data in that it can be evaluated in real
time and is accessible across all journeys within Marketing Cloud. This is data that has
been linked to a Data Designer attribute group. Data Designer is a tool within Marketing
Cloud that allows you to organize and relate data within your Marketing Cloud instance
that resides in lists or data extensions. This can then be used within tools such as Journey
Builder as a real-time source for contact data in the platform. A possible use case is one
where we want to determine whether a customer has decided to complete an abandoned
cart purchase and thus might need to exit a journey built for that purpose. By linking
our purchase data source to Contact Builder, we can action purchase data that has been
altered since the customer's entry into the journey.
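A minimal sketch can make the distinction concrete: journey data is a copy frozen at entry, while contact data is read live from the linked source each time it is evaluated. The store and field names below are invented for illustration, not platform objects:

```python
# A mutable store standing in for real-time "contact data" (names invented).
contact_store = {"C-1": {"cart_status": "abandoned"}}

def enter_journey(contact_key: str) -> dict:
    """Snapshot the contact's values at entry -- this frozen copy is the 'journey data'."""
    return dict(contact_store[contact_key])

journey_data = enter_journey("C-1")

# Later, the customer completes the purchase; only the live store changes.
contact_store["C-1"]["cart_status"] = "purchased"

print(journey_data["cart_status"])          # still "abandoned" (entry-time snapshot)
print(contact_store["C-1"]["cart_status"])  # "purchased" (evaluated in real time)
```

An exit decision based on the snapshot would wrongly keep this contact in the abandoned-cart flow; only the live contact data reflects the completed purchase.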
Understanding these two data types and their uses underlies much of what we can
build within Journey Builder. Everything, from entry sources to our flow and exit
criteria, is driven by one of these types of data, and knowing which to use in the
correct circumstance is fundamental to a working journey. With the knowledge of this
component in hand, let's take a look at contact entry configuration settings within
Journey Builder.","Contact entry
In addition to the type of data needed to act on in a journey, another important
consideration is the possible flows that a subscriber can take during the journey lifecycle.
Understanding your journey's purpose and both entry and exit flows is important in
deciding the possible paths that a customer can take in your journey flow.
In some scenarios, a welcome journey, for instance, we might want our customer to enter our journey only a single time, as the messaging and intent are no longer applicable to their needs after the journey has been completed. In another, where a single customer may unintentionally meet your journey entry criteria multiple times during the lifecycle, a way to prevent duplicate contacts from entering the journey is desirable. Finally, we may want to allow customers to enter and exit our journey flow at any stage and time. In this case, having no restrictions on our entry criteria is key.
To accomplish these scenarios, we can use the contact entry mode configuration within
the Journey Builder settings. There are three possible selections for this configuration, as follows:

 • No re-entry
• Re-entry only after exiting
 • Re-entry anytime

Each one of these three possible selections correlates to a scenario described previously.
We will now go into more detail about what each of these selections is.

No re-entry
In this mode, once a contact has entered our journey for the first time, they will not be
permitted to re-enter this journey again regardless of whether they meet the qualifying
criteria for our journey or are present in our entry audience.

Re-entry only after exiting
With this selection, a contact will be permitted to re-enter the journey flow, but only if that contact is not already within the current journey. This prevents a contact from being in multiple portions of the journey simultaneously, which might lead to a confusing and poor customer experience for some use cases.

Re-entry anytime
This configuration removes any restrictions on the contact entry source. In this mode, a contact is eligible for entry at any time, regardless of their past or current entry points into a journey. Because of this, it is possible for a customer to exit or enter at any time, and they can be in multiple stages of the journey lifecycle simultaneously.
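The three modes boil down to a simple eligibility check. The sketch below is only an illustration of the rules described above, not how Journey Builder implements them internally:

```python
from enum import Enum

class EntryMode(Enum):
    """The three contact entry modes described above."""
    NO_REENTRY = "no re-entry"
    REENTRY_AFTER_EXIT = "re-entry only after exiting"
    REENTRY_ANYTIME = "re-entry anytime"

def may_enter(mode: EntryMode, has_entered_before: bool,
              currently_in_journey: bool) -> bool:
    """Decide whether a qualifying contact may (re-)enter the journey."""
    if mode is EntryMode.NO_REENTRY:
        # A single entry ever, regardless of later qualification.
        return not has_entered_before
    if mode is EntryMode.REENTRY_AFTER_EXIT:
        # Re-entry is fine, but never while still inside the journey.
        return not currently_in_journey
    # REENTRY_ANYTIME: no restrictions at all.
    return True
```

For example, a contact who has already exited a welcome journey would be rejected under `NO_REENTRY` but accepted under the other two modes.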
An errant configuration with these settings could lead to unfortunate circumstances that
impact the overall health and customer experience of your journey. Knowing how your
data integrates into your journey, and what the intent of your project is, can help inform
which of these selections is the most appropriate for your use case.
On that note, let's take a look at some of the tracking and auditing processes built into
Journey Builder that can help us stay informed and updated on the overall health and
performance of our journey.

Tracking and auditing
Another highly important part of Journey Builder is its tracking and auditing capabilities. These capabilities retain the same smooth, user-friendly interface as the rest of the tool, allowing for dashboards and drag-and-drop utilization instead of technically heavy implementations and modifications.

Goals
Using a journey goal can help you quantify the exact desired outcome that your journey is seeking to accomplish. This could be for the customer to make a purchase in a given period, engage with a specific set of product content, or any other metric that can be used to gauge the success or failure of your journey's intent. In Journey Builder, this can be configured directly within the goal configuration module.
To configure a goal, ensure that your data source, which will determine the exact criteria
that will trigger the completion or failure of a goal, is linked to the Data Designer
attribute group and available for use as contact data in Journey Builder. On the primary
configuration screen, simply enter your filtering criteria that define your goal, and select
your target completion of this goal as a percentage of the total journey population or in
absolute numbers.
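As a rough illustration of the arithmetic behind a percentage-based goal, the following SQL sketch computes goal attainment for a hypothetical purchase goal outside the UI. The Journey_Entry_Log and Purchases data extensions, and their fields, are stand-ins for your own journey logging and goal source data, not system objects:

```sql
-- Percentage of journey entrants who purchased within 14 days of entry
SELECT
    COUNT(DISTINCT e.SubscriberKey) AS TotalEntries,
    COUNT(DISTINCT p.SubscriberKey) AS GoalCompletions,
    100.0 * COUNT(DISTINCT p.SubscriberKey)
          / NULLIF(COUNT(DISTINCT e.SubscriberKey), 0) AS GoalPercent
FROM Journey_Entry_Log e
LEFT JOIN Purchases p
    ON p.SubscriberKey = e.SubscriberKey
   AND p.PurchaseDate BETWEEN e.EntryDate AND DATEADD(DAY, 14, e.EntryDate)
```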
In addition to defining the goal criteria and metric baseline, you can also configure this to
automatically exit a contact once they have met your goal criteria. This ensures that you
can stay focused on the intent of your messaging and allow the best possible customer
experience once they have met the journey's intent.
Exit criteria
Similar to the goal functionality, this feature allows marketers to define a set of criteria
that determines when the journey is no longer viable for a contact, at which point they are exited. This
differs from goals in that it does not count towards your goal completion rate and allows
you to accommodate a more robust set of scenarios where contacts may perform some
action that, while it doesn't meet the stated intent of the journey, means they no longer derive
meaningful value from the messaging and intent of the current flow. Exit criteria will be
evaluated for each contact after they've exited a wait activity and, if they meet the criteria
defined in your configuration, they will leave the journey at that point.

Journey health
Journey health allows you to more closely monitor the status of your journey, both
individually and globally, in order to gain a more comprehensive view of the overall
performance and stability of your implementation. With this module, marketers are
able to view the overall status of their goals, with the performance data and goal
criteria surfaced in a convenient widget. In addition to this, users are also given visibility
of the historical audience counts of the journey, including the number of contacts that
are currently active within the journey at a given time. This is especially helpful during
auditing when a rough snapshot of the current audience can indicate issues impacting
customers before they are noticed retroactively. Additionally, there is a widget showing
the current exit criteria of the journey, as well as the number of contacts that have met the
journey exit criteria and have been removed.
There are two other important features within journey health that are very useful when
assessing the current state of your implementation and working quickly to mitigate
issues. The first of these is the alerts widget, which contains data related to the number
of contacts who have exceeded their defined wait times within a wait step in the journey.
When a contact enters a wait step, Journey Builder calculates the appropriate end time
and holds the contact until that limit has passed, after which they proceed to the next step
of the journey. While this process has a reliable degree of stability, occasionally Journey
Builder can encounter delays in processing contacts for the remainder of the journey and
some may remain in the wait step longer than the pre-defined exit point. By using the
alerts portion of journey health, we can gain visibility of the size of this group and assess
the overall impact on the journey goal and performance.
The other feature available in journey health is the contact path, which allows marketers
to search for the exact path taken in a journey for a given contact key. This is especially
critical in journeys that use complex segmentation, multi-path customer flows, or A/B
testing as it provides end users with a quantifiable way of viewing the path an individual
customer has taken, which might not otherwise be readily available or understood in a journey with
sufficient complexity. In addition to this feature, it is also possible for users to remove
a given contact from the journey directly from the UI. This ensures that marketers can
respond quickly to customer complaints or errant miscues that might have an overall
impact on their journey goals and performance.

Journey analytics
In addition to journey health and goals, Journey Builder also comes equipped with
a convenient dashboard to monitor the performance of the overall messaging effectiveness
of your journey. With this widget, users can view important key metrics for their email
messages such as opens, clicks, and unsubscribes, both as global counts and as a percentage
of the number delivered within your journey. Likewise, the performance of SMS
messages can be measured with regard to deliverability and click performance, both as
counts and as percentages.
While these are welcome additions for accurately assessing the performance of a journey's
content, they leave out some key metrics that we might be interested in, such as conversion
rate, related web traffic, or other engagement data that factors into a successful
implementation. In addition to our base analytics dashboard, there are additional data
views within Marketing Cloud that can be used to extract journey status information as
well as provide a convenient method for tying together your customer data to provide
a more comprehensive view of your performance beyond the standard capabilities within
the analytics dashboard. Journey Builder also features a powerful integration with Google
Analytics 360, which can directly integrate your journey messaging and reporting with
Google Analytics in order to provide a complete view of the customer lifecycle across
channels. While this feature is a premium service, there is also a free Google Analytics
360 integration that opens up some of these tracking and reporting features to all users
with Journey Builder. Also important to note is the support for tagging and reporting on
the GA4 property framework, which is available in both premium and free offerings on
the platform. While GA4 parameters are generally configured within Parameter Manager,
app events will need to utilize the Firebase SDK in order to be configured. Using these
features, users are able to configure relevant parameters for tracking and can easily view
conversion metrics, web engagement data driven from journey messaging, as well as other
key metrics used to track the performance of the journey.
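As a sketch of the data view approach, the following SQL Query activity counts journey email sends per journey version and activity over the last 30 days. The `_Sent`, `_JourneyActivity`, and `_Journey` system data views are real, but the join on `TriggererSendDefinitionObjectID` is the commonly used (not officially guaranteed) way of tying send events back to journey versions, so validate it against your own account before relying on it:

```sql
-- Journey email sends by version and activity, last 30 days
SELECT
    j.JourneyName,
    j.VersionNumber,
    ja.ActivityName,
    COUNT(s.SubscriberKey) AS SentLast30Days
FROM _Sent s
INNER JOIN _JourneyActivity ja
    ON s.TriggererSendDefinitionObjectID = ja.JourneyActivityObjectID
INNER JOIN _Journey j
    ON ja.VersionID = j.VersionID
WHERE s.EventDate > DATEADD(DAY, -30, GETDATE())
GROUP BY j.JourneyName, j.VersionNumber, ja.ActivityName
```

The same pattern extends to the `_Open` and `_Click` data views for engagement counts.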
Now that we have a detailed idea of the overall configurations that are possible within
Marketing Cloud journeys, let's take a look at the available event types and activities to
really assess the capabilities of this tool.

Journey entry events and activities

It goes without saying, but you can't generate real-time 1-to-1 marketing journeys for your customers if they have no method of getting into your lifecycle. As discussed in the preceding sections on journey types, the availability of certain methods of entry will be limited depending on the type of journey that meets your use case. Let's take a look at the entry sources for journeys in Journey Builder.

Figure 2.3 – Examples of the entry sources available in Journey Builder

As you can see in the screenshot, there are a few different types of entry sources available within Marketing Cloud. While we won't analyze all possible journey entry types here, let's take a look at a few of the most commonly used ones in order to gain a clearer picture of how we can inject people into a customer journey.

Data extension
The data extension entry source will certainly be a familiar concept to anyone who has
used Marketing Cloud to prepare data or send communications in non-journey contexts.
This entry source type allows marketers to use any sendable data extension in Marketing
Cloud in order to inject contacts into the journey in a batched process. While the batched
process does make this entry type less than ideal for use cases that require a real-time
injection method, it is arguably the most powerful entry source in Journey Builder due
to its ability to aggregate data across sources and channels for use within the journey. In
some scenarios, having all of the data necessary to reach customers available and working
in concert at any given moment is not feasible. For those use cases, being able to
aggregate all of these different sources on a timeframe that meets your requirements is
critical, and it is easily achieved by combining and segmenting them together in a single
sendable data extension.
To use this entry source type, first set up a sendable data extension within Marketing
Cloud that contains all of the necessary data attribute fields that you'll want to action
as journey data within your flow. Then, set up your data processes to load the desired
audience into this data extension. This can range from a simple ad hoc import into your
data extension or complex API integrations and automations that manage the flow in and
out of your process. After you've created your data extension, and have outlined your data
processes, you're ready to configure your data extension source. Note that, in email and
SMS Single Send journeys, this entry source activity is already included on the canvas but,
for Multi-Step journeys, you need to drag the entry source from the menu and onto the
journey canvas to begin the setup.

Figure 2.4 – View of the data extension entry source
data extension that you've set up with your journey data. Aside from simply selecting this
within the source configuration, you'll also be presented with the (optional) ability to filter
contacts. This allows you to utilize contact data on journey entry to further limit your
entry criteria and apply the filter to your selected journey audience. Simply navigate to
your appropriate data source linked to the Data Designer attribute groups and select any
number of filtering conditions necessary to accomplish your use case.
With the journey data source configured, we now need to set the schedule to determine
when the data extension should be evaluated to admit new contacts in the journey.
First, there is the option to Run Once. This essentially mimics the functionality of the
Single Send journey within the Multi-Step journey by allowing you to inject your journey
audience as a one-time, ad hoc source into your flow. The second option is to set up your journey
to evaluate the data extension audience for entry at a date-time-based interval. Similar
to Automation Studio scheduling, this allows you to run your entry source at intervals
that range from hourly to yearly. Finally, there is the Automation Schedule Type. This
will only be enabled when your entry source data extension is being actively used within
at least one automation in Marketing Cloud. To enable this feature, simply create a SQL
Query, data filter, or any activity that targets and populates the source data extension of
your journey as a step in your automation and save the configuration before returning to
Journey Builder. After that, the schedule type will become enabled and you can now select
your automation as the entry source for this journey. Once the journey has been activated,
a Journey Audience activity will automatically be appended to your automation flow and,
when run, will evaluate your data extension audience for entry into the journey.
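For example, the SQL Query activity that both populates the entry data extension and enables the Automation schedule type could look something like the following sketch. The source and target data extension names and fields here are hypothetical, and the activity's target action would typically be set to overwrite or update, depending on the Contact Evaluation method you intend to use:

```sql
-- Populates the journey entry data extension with customers
-- who placed an order in the last 24 hours
SELECT
    c.SubscriberKey,
    c.EmailAddress,
    c.FirstName,
    o.OrderNumber,
    o.OrderDate
FROM Customer_Master c
INNER JOIN Recent_Orders o
    ON o.SubscriberKey = c.SubscriberKey
WHERE o.OrderDate >= DATEADD(HOUR, -24, GETDATE())
```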
If you've selected the Automation or Recurring schedule types for your journey, you will
be required to configure the Contact Evaluation method as well. In this configuration,
we'll need to let Journey Builder know how we want our contacts to be evaluated for
journey entry. The first option is to evaluate only new records that have been added to the
entry data extension since the last execution of our journey. This process is ideal for data
processes that will add new rows to a data extension, rather than overwriting or deleting
existing rows. Unless your data process to populate your entry source data extension uses
methods that overwrite or delete the data source, this is the preferred Contact Evaluation
method as it is more performant than the alternative. If your processes do overwrite the
data extension source, then you'll need to select the other configuration option, Evaluate
all records. With this evaluation type, all records in the data extension are evaluated for
entry into the journey when your scheduling event is triggered. This will result in a slower
processing speed compared to only evaluating net new records in your source, but it is the
right choice for use cases where an overwrite process makes the most sense.
API event
For this entry event type, we're creating a configuration that developers can use to
automatically inject contacts into our journey via the Marketing Cloud REST API in
real-time rather than waiting for batched processes. This allows for use cases where
some real-time action is desired to interact with a customer or perform some automated
function within Journey Builder. It should be noted that this event differs in both
functionality and configuration from the Transactional Send and Multi-Step journey
types. Events created in one of these configurations cannot be applied to the other as they
use different systems to manage the events and different API routes in order to trigger
them. See the following for a screenshot of the API event entry source in Journey Builder.

Figure 2.5 – Example of the API event entry source

Though these two API entry types do differ substantially in how they are integrated, they share several similarities with regard to their configuration. When configuring an API entry event source, you'll need to have a sendable data extension that will be the written source of the records that qualify for injection into your journey. In the Multi-Step journey, you can select any configured sendable data extension while, with the Transactional Send journey, you can only select those data extensions that have been configured to be a Triggered Send type. These data extensions will contain the needed references for your journey data within the journey flow and can serve as historical records that can be utilized for quick visibility or error investigations.

In addition to the data extension source, you'll also need to configure an API event key. This key uniquely identifies your entry event into the journey and allows developers to integrate external systems in order to inject customers into the right journey with the right data structure. With this API key configured, you're ready to start testing journey entry API calls to your entry source.

Salesforce events
In addition to both data extension and API entry sources, another common and powerful
source type is Salesforce data. Unlike the previous two, this configuration source requires
Marketing Cloud users to have a Sales and Service Cloud instance connected to their
Marketing Cloud business unit via Marketing Cloud Connect. Also, among users connected
to their Salesforce instance with the connector, only those whose Marketing Cloud user
record has been integrated with a Salesforce user can create or edit
Salesforce event sources. This can be easily done within the administrative menu under
the Salesforce Integration option menu.

Figure 2.6 – Example screen for setting up the Salesforce event entry source
This event source is quite powerful in that it allows Journey Builder to immediately action
events that happen in the core CRM system. Whether it's the addition of a new lead, an
updated object record, or a community-based audience, this event type provides immense
benefits for any organization that is well integrated with the Salesforce ecosystem. There
are three primary distinctions within Salesforce entry sources that allow us to action
different items on the platform:

1. The first is the Salesforce Campaign event, which allows users to configure
an event source that is triggered whenever a contact has been added to a Sales
Cloud campaign.
2. The second is the Salesforce Community event type, which can inject new members
for a Salesforce Community into their journey when they are added in Salesforce,
making it ideal for a welcome communication.
3. The third, and most powerful, is the Salesforce Data event. This event gives users the
ability to select a wide range of criteria, across objects, on data that exists within the
Sales and Service Cloud platform.
Within the Salesforce Community and Salesforce Campaign event types, users can select
either the campaign or community that they would like to inject and then configure the
appropriate data fields from related objects that they will use to perform some action
in the journey. In the Salesforce Data event, the configuration goes much deeper. Users
will first select some object in Sales and Service Cloud that they want Journey Builder to
evaluate for entry into the journey. It's important during this step to select the object on
which most of your entry criteria are based or one that has a strong data relationship with
all other objects used in your entry criteria. Once you've selected the object that you'd like
to serve as the basis for your journey event, you'll be required to select the appropriate ID
of the object that you will inject into the journey. This value will become the contact and
subscriber key for your records in Marketing Cloud, so it is important that you select
a value that is consistent with your other Marketing Cloud journeys and sending processes
in order to prevent duplicate contact records or other impacts on your sending
and reporting.
Next, we'll need to let Journey Builder know what event type it needs to listen for and the
criteria that should drive the journey injection. Records can be evaluated when they are
either created or updated and, while optional, specific criteria on who should be evaluated
based on field values in the primary object record can also be included to further filter
down the intended audience. After this has been configured, users also have the option
to filter the journey entry audience by object field values that are related to the primary
object selected for this event. This is separate from the previous step in which only values
on that specific object record are available for defining further filters.
Finally, after users either skip or configure the filtering criteria widget, there is the ability
to add any relevant data from related objects to inject as journey data whenever a record
qualifies for the journey.
While there are other journey event types present in Marketing Cloud that may provide
a more applicable solution for specific use cases, the three listed previously should
encompass most of the scenarios that a business would encounter in order to inject
customers into their journey lifecycle. With this knowledge in hand, and potential
customers ready to inject into our journey, let's take a look at the actual journey building
blocks you can use to reach customers.
Journey Builder activities

While understanding how to inject contacts into our journeys is a critical component of developing within Journey Builder, what good is a journey if we don't have activities configured to interact with customers or automate business needs? An activity in Journey Builder is a configuration on the journey canvas that corresponds to some specific function that can be applied to contacts within a journey. In Transactional Send and Single Send journeys, the activities that a journey can contain are restricted to those related to messaging, and they cannot be deleted or removed from the canvas when selecting your messaging type. Multi-Step journeys, however, allow users to add and configure any collection of available activities for their account. This provides an excellent method of creating custom journey flows that meet specific business use cases and create additional value.

Activities within Journey Builder are grouped by their function type in the UI in order to generate a logical pairing for end users. These groups are classified as Messages, Advertising, Flow Control, Customer Updates, Sales & Service Cloud, and Custom. Let's take a look at each group in turn in order to gain a bit more insight into their functionality.

Messaging
Unsurprisingly, messaging activities are a collection that focuses on capabilities for
sending both transactional and commercial content to customers across channels. These
channels include email, SMS, MobilePush, LINE, and in-app messaging (among others).
These activities can be configured to action any user data within Marketing Cloud in
order to deliver personalized content to contacts at any stage within the journey lifecycle.

Advertising
Advertising activities allow you to automatically configure and deploy advertising
campaigns across various sites and channels. Users can incorporate effective Facebook
ad campaigns, adjusting for the audience and ad spend, to directly market to customers
in the channel that they prefer. While this activity does require Advertising Studio,
it's certainly a powerful form of communication for customers who may eschew
traditional messaging channels for a more social environment.
Flow control
Flow Control activities are those that allow custom routing or segmentation directly
within the Journey Builder UI. Activities such as decision and engagement splits are
critical for creating journeys that can adjust the given path for a contact based on their
data or engagement behavior. For instance, perhaps we want our loyal customers to
receive a different experience in the journey, or perhaps even a different set of activities.
By utilizing splits, we can segment out our loyal customers to provide them with a more
personalized journey flow that can lead to higher engagement and brand appreciation.
Other examples of journey splits are random splits, which segment out contacts randomly
into weighted percentages for each path, and Path Optimizer, which allows users to
conduct A/B/n tests on up to 10 path variants and automatically route
contacts into the more highly performing journey flows.
In addition to journey splits, wait activities are a key component of Flow Control and
journey workflows generally. These activities essentially hold a contact in place until some
given event or schedule has been completed. These can range from hardcoded date times
to activities that wait for an API event to be triggered before contacts proceed with the
remainder of the flow. When constructing multi-day journeys, or those where a special
event requires exact communication touchpoints, wait activities are the preferred method
of ensuring that customers reach your other activities exactly when they are meant to.
Finally, we have Einstein activities. These activities use aggregate, personalized data along
with machine learning to allow users to target their customers by preference, affinity
for the brand, and engagement history. With the Einstein Scoring and Frequency splits,
marketers are able to send customers down varying paths within a journey based on
metrics such as their engagement-likelihood persona or their
current email saturation status, derived from their historical and current data trends. The
Einstein Send Time Optimization (STO) activity is also a common implementation
in Journey Builder and uses over 90 days of email or MobilePush data, and over 20
individual weighting factors, in order to generate an ideal send-time at the contact level.
This greatly enhances the opportunities for marketers to anticipate customer interactions
and deliver messages at a time that works for the individual customer.

Customer updates
This group actually consists of only a single activity, called Update Contact. The Update
Contact activity allows the user to update or modify any sendable data extension within
Marketing Cloud when a contact enters this step. After selecting the desired data extension,
any field within it can be updated with a static value from the journey, making it ideal for
logging and reporting purposes, among others.
Sales and Service Cloud

The Sales and Service Cloud activities group provides the functionality to create or update Sales and Service Cloud object records from either journey or contact data. This allows for the automation of several items within the platform, such as converting leads; closing, creating, or updating tasks and requests; creating journey logging records on custom objects; and many other scenarios that could be used to meet business use cases. For organizations that use the core Salesforce platform, having this functionality within Journey Builder can completely transform their marketing implementations and is one of the strongest use cases for Journey Builder.

Custom
Finally, we have custom activities within Journey Builder. Unlike the previous groups,
there is no pre-configured functionality or activity type that the user can automatically
drag onto the journey canvas and utilize. Rather, this activity allows developers to extend
the functionality of a journey by providing a platform of completely custom integrations
and activities that are only limited by the external implementation itself. We will cover
these more extensively in a later chapter, but their ability to extend Journey Builder to
meet any use case makes them a powerful tool in the marketing toolkit.
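To give a sense of what these look like, a custom activity is described to Journey Builder by a config.json file hosted alongside the external application. The structure below is a trimmed sketch based on the documented custom activity format – the key and URLs are placeholders, and a real configuration requires additional properties such as user interface endpoints and schema definitions:

```json
{
  "key": "my-custom-activity",
  "type": "REST",
  "lang": {
    "en-US": { "name": "My Custom Activity" }
  },
  "arguments": {
    "execute": {
      "url": "https://example.com/activity/execute",
      "inArguments": [{ "contactKey": "{{Contact.Key}}" }],
      "outArguments": []
    }
  },
  "configurationArguments": {
    "save": { "url": "https://example.com/activity/save" },
    "publish": { "url": "https://example.com/activity/publish" },
    "validate": { "url": "https://example.com/activity/validate" }
  }
}
```

When a contact reaches the activity, Journey Builder calls the execute endpoint with the resolved inArguments, which is how the externally hosted code receives journey data.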
Now that we've covered Journey Builder activities and the entry events that drive them,
we're ready to start revolutionizing our marketing approach to customers, right? Well, not
so fast. Before we can fully utilize the power and customization of Journey Builder,
we need to be aware of some of the limitations and considerations when building
a customer journey.

Journey Builder considerations and best practices

There is much to consider when using Journey Builder. The range of capabilities and usages alone creates subsets of subsets of considerations and best practices. There are a few major considerations, however, that are universal across every usage of Journey Builder. These considerations, combined with some basic best practices, are listed in the following section.

Considerations
Although, as stated, there are hundreds if not thousands of considerations for Journey
Builder, I wanted to concentrate on just a few major ones that apply in almost
every single usage of Journey Builder.
Throughput
Journey Builder was built upon the triggered send definitions that existed previously in
Marketing Cloud. This is great as it's a strong way to deliver 1:1 messaging to your audience.
The drawback is that it does not allow for batching, so the greater the volume pushed
in at once or at high frequency, the larger the queue and the longer the delay before sending.
Journey Builder has been increasing its capabilities in this area over the last couple of
years. It initially was absolutely terrible and near worthless for any sort of volume. With
the introduction and push towards Single Send journeys, they increased the capabilities
and got it to a point where it was strong enough for a good portion of email sends. That
being said, if you plan to send emails out in the millions on a regular basis, expect major
delays in email sends, potentially spanning hours. This can be a deal-breaker for some in
utilizing Journey Builder for these campaigns.

Scripting capabilities
Journey Builder does not offer a way for significant customization or scripting. There are
no SSJS Script or SQL Query activity types that can be used. This limits Journey Builder
in some ways and can wind up making it a less desirable tool for your automation. This is
not to say that there are no custom capabilities; the Custom Journey activity offers up
a ton of capability and possibility that can help get you to where you need to be. The major
issue with Custom Journey activities, though, is that they need to be hosted externally,
are written in JavaScript, and require strong developer knowledge to implement. With
these being hosted externally, the processing draw on Marketing Cloud is reduced, keeping
the system from being overwhelmed by the frequent runs of the same script for each
subscriber that passes through.

Tracking and analytics

Journey Builder does offer very beautiful dashboards and good basic tactical tracking and analytics data. It even offers information on goals and exits, along with interaction and performance-based metrics. What it does not offer, though, is logging capabilities or custom reporting. These are not necessary for every journey and may not be relevant for some usages, but they can be a major consideration on complex or highly visible journeys where you need custom KPI tracking, monitoring of where people are in the journey, and so on. There are ways to implement these capabilities, but that implementation would mostly be through Automation Studio and utilization of some features of Journey Builder in ways they were not intended to be used, which requires a fairly high level of technical and development skill.

Best practices
Now that we have looked at the considerations, I want to give a few best practice tips for
the utilization of Journey Builder. These are broad and not specific because, similar to
considerations, there are hundreds of different best practices depending on usage
and context.

Journey logging
Although, as stated earlier, this is not a native feature nor one that's easy to implement,
it is highly recommended to implement it everywhere you can. If you do not have journey
logging and later need it, there is not really a good way to get that information after the
fact. So, I always go by the philosophy that I would rather have information I do not need
than need information I do not have.

Pre-planning and heavy consideration

This is the bit that I see so many people fail on. Before implementing any journey, you should dedicate a significant amount of time to taking into account future needs and integrations, to ensure that what you set up will be able to run continually for an extended period with few updates or edits. The more you edit and create new additions on top of your original journey, the more you create a risk of failure for the journey, as well as the potential to unintentionally skew your historical data. The more consistent you can make your journey, the higher the quality of the output.

Journey monitoring
Although there are some built-in features or extra services to help with this process,
these are far from comprehensive. I always recommend building an automation that
collects the status, sends, opens, clicks, and queues for each of the email sends
inside of your journeys to ensure these are running as expected. I also highly recommend
keeping track of entry and exit numbers to ensure these are as expected and are filtering
as expected. That is it for the overview of Journey Builder. Now, we will move on to an
overview of the other major automation tool in Marketing Cloud, Automation Studio.

Automation Studio overview

The general principle behind automation is to reduce manual actions and create efficiency within your processes and environment. Automation Studio can therefore interact with and affect almost every part of Salesforce Marketing Cloud in some way, shape, or form. So, with that being said, what exactly is Automation Studio?

Automation Studio is a batch-driven performance enhancement platform. It allows you to utilize custom activities and actions in a highly performant and efficient manner. Automation Studio is very different from the 1:1 interaction focus of Journey Builder. But, an important thing to note is that different does not mean worse. They are both very powerful in what they do and are very complementary to each other.

This focus on bulk processing allows for more volume to be accomplished in a single run, which can help reduce the processing required and increase performance speeds, reducing the total runtimes of actions. As a note, though, bulk processing is not the only capability of Automation Studio; it also allows for server-side scripting and other types of integrations through automated exports/imports and similar custom solutions.

Inside of Automation Studio, there are two different types of automation triggers available – Scheduled Automations and File Drop Automations. The automations are separated based on the trigger type that is going to be used to initiate the automation's activities. These two types are very complementary to each other and cover almost every possible scenario you will face. They comprise a recurring, predictable option as well as a more real-time, reactionary option.

Scheduled automations
Scheduled automations are usually the type that first pops into your mind when you think of automation, as they use a recurring trigger type. Scheduled automations are repetitive tasks or sets of tasks that occur at specific times and/or recurrences while performing specific actions and activities. This is the bread-and-butter automation type and, honestly, most of the automations you work with will fall into this type.
These automations work off a trigger based on a specified schedule or recurrence: you can set them to run anywhere from yearly down to hourly, control the time period they run for, and set the total number of times they can run. This makes for a very clean, efficient, and predictable automation. By utilizing this, you can plan other activities and actions around these automations, as you know the precise time each will run and what it will do.
A great benefit of this predictability and cleanliness is that this is the friendliest automation type for non-technical resources. These automations tend to be intuitive and easy to pick up, understand, and use. For instance, the user interface wizard inside the system is very descriptive, making it easy to figure out how to set up each aspect. It is a drag-and-drop interface as well, which helps reduce the needed user inputs or actions.
Scheduled automation user interface
The main part that makes a scheduled automation unique is the setup or user interface. This offers many options and filters. The following screenshot is an example of this setup screen:

Figure 2.7 – An example of the setup screen when creating a scheduled automation
As you can see from the preceding screenshot, there are a lot of options that can be used to set up your scheduled automation. The schedule options available are very easy to set and are fairly self-explanatory in terms of how they work and what you need to do to implement or edit them. All of the available options are contained in dropdowns or date pickers that let you select what recurrence (repeat) type you want (if you want any recurrence) and then the numeric value of that type. For example, you can choose Hourly and then set the number to 2, which means the automation will repeat every 2 hours. You can then also set an ending via Date, Count, or No End Date.
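As a rough mental model of how such a schedule expands into run times, the following Python sketch computes the occurrences for an every-2-hours recurrence that ends by count. The function name and signature are hypothetical illustrations, not an SFMC API.

```python
from datetime import datetime, timedelta

def next_run_times(start, every_n_hours, max_runs):
    """Sketch of an 'Hourly, repeat every N' schedule that ends by Count.

    Hypothetical helper for illustration only - not an SFMC API.
    """
    return [start + timedelta(hours=every_n_hours * i) for i in range(max_runs)]

# Hourly recurrence of 2, ending after a count of 3 runs:
runs = next_run_times(datetime(2024, 1, 1, 9, 0), every_n_hours=2, max_runs=3)
print([r.strftime("%H:%M") for r in runs])  # ['09:00', '11:00', '13:00']
```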
After you have set the schedule and saved it to the automation, this opens up quite a few
other cool features:

 • Two manual Run Once options:
   One on the initial dashboard.
   Another on the individual automation's canvas.
 • Active/Paused options for your schedule:
   Active – The schedule is running and recurring as dictated.
   Paused – The schedule is not active, so the automation is not running.
 • Easy editing, creating, deleting, and updating (you need to pause to make edits):
   Editable Schedule
   Automation Activities
   Automation Name
   Automation Properties, and more

Use cases
So now that we know all the cool things about scheduled automations, why would we want to use them? The majority of use cases for scheduled automations revolve around data imports and manipulation or bulk campaign message sends, but that is certainly not all that they do! The following are a few examples of uses, but there are far too many possibilities for me to list them all here.
A good, solid example of a common use case for scheduled automations is to filter
or query a master data extension to create a segmented sendable audience and then send
an email for an annual anniversary, birthday, or reminder to take action.
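If that segmentation were handled in a SQL Query activity, the query itself is plain SQL. The following Python/sqlite3 sketch mimics the idea against a hypothetical Master_Subscribers table; note that real SFMC SQL uses a T-SQL flavor (for example, DATEPART) rather than sqlite's strftime, so treat this as illustrative only.

```python
import sqlite3

# Hypothetical master data extension with a BirthDate field; the table and
# field names are illustrative, not SFMC system objects.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE Master_Subscribers "
    "(SubscriberKey TEXT, EmailAddress TEXT, BirthDate TEXT)"
)
con.executemany(
    "INSERT INTO Master_Subscribers VALUES (?, ?, ?)",
    [("001", "ann@example.com", "1990-06-15"),
     ("002", "bob@example.com", "1985-01-02"),
     ("003", "cat@example.com", "1979-06-15")],
)

# The segmentation query - in SFMC, this SELECT would live in a SQL Query
# activity targeting a sendable data extension instead of a cursor.
birthday_today = con.execute(
    """
    SELECT SubscriberKey, EmailAddress
    FROM Master_Subscribers
    WHERE strftime('%m-%d', BirthDate) = '06-15'
    ORDER BY SubscriberKey
    """
).fetchall()
print(birthday_today)  # the two subscribers born on June 15
```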
Another common use case is for custom activities or programs, where you use things such as Script activities or SQL Query activities to create custom solutions, reports, or capabilities. For instance, you could use SQL queries on a schedule to create a Journey Builder log based on the corresponding data views and intermittent Update Contact activities inside the journey.
My final example of a scheduled automation use case is that it is very handy for creating
custom reports and tracking/analytics data. By using SQL queries, data or tracking
extracts, scripts, and similar activities, you can create and combine any sort of subscriber
level and contextual data to fit your needs. An example of this is a report on your journey
email sends that shows opens, clicks, bounces, and so on of each email send job.
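As a hedged sketch of such a report, the following Python/sqlite3 snippet joins simplified stand-ins for the _Sent, _Open, and _Click data views into a per-job rollup. The schemas are heavily reduced and the syntax is sqlite rather than SFMC's T-SQL flavor, so this shows the shape of the query, not production code.

```python
import sqlite3

# Simplified stand-ins for the _Sent, _Open, and _Click data views; the real
# data views have more columns and use T-SQL syntax.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE Sent  (JobID INT, SubscriberKey TEXT);
CREATE TABLE Opens (JobID INT, SubscriberKey TEXT);
CREATE TABLE Clicks(JobID INT, SubscriberKey TEXT);
INSERT INTO Sent   VALUES (1,'a'),(1,'b'),(1,'c'),(2,'a');
INSERT INTO Opens  VALUES (1,'a'),(1,'b'),(2,'a');
INSERT INTO Clicks VALUES (1,'a');
""")

# Per-send-job rollup - the kind of result a scheduled SQL Query activity
# could write into a reporting data extension.
report = con.execute("""
    SELECT s.JobID,
           COUNT(*) AS Sends,
           (SELECT COUNT(*) FROM Opens  o WHERE o.JobID = s.JobID) AS Opens,
           (SELECT COUNT(*) FROM Clicks c WHERE c.JobID = s.JobID) AS Clicks
    FROM Sent s
    GROUP BY s.JobID
    ORDER BY s.JobID
""").fetchall()
print(report)  # [(1, 3, 2, 1), (2, 1, 1, 0)]
```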
Considerations for scheduled automations
Along with all the good, there are a few limits or considerations that should be taken into account when dealing with scheduled automations. As with all things, there is contextual and quirky behavior that you will need to learn about through usage and research, as it is not officially documented anywhere. The following sections cover a couple of caveats and benefits I have found over the years and wish that someone had told me when I first started.

No queueing
Scheduled automations do not queue, so if you start an automation while another run is still in progress, the subsequent run will force the previous run to stop before it starts, or it will simply error out/skip. The user interface has some validations in place around the Run Once options to help prevent this, but there are still a few instances where this can be an issue. For instance, if you schedule the automation to run every hour and the automation takes over an hour to run, then the part of the automation beyond the 1-hour runtime will never execute, as the next scheduled trigger will
force it to stop and start over, or it will miss every other run due to the overlap.

No recurrence below hourly
As stated, Marketing Cloud and Automation Studio are not intended to be super-powered, highly capable ETL or automation tools; instead, they are a messaging and communication platform that is focused on the enablement of custom capabilities and actions. This means that a lot of the higher-powered or process-heavy capabilities you can see in other tools are not here. One of those is the ability to have a high-frequency automation (more than once an hour), a continuous automation, or a listener. I defend this because the number of processes and amount of server power needed to enable this on top of everything else would sky-rocket the price of a tool that is already a big investment, as well as potentially making the tool less intuitive and much more complex.

Skip next occurrence
This is a very under-utilized and underrated ability. Through this button, you can automatically skip the next occurrence without needing to pause or stop the automation. This can be very helpful when you need to make a quick change to something and don't want to have to remember to come back and turn the automation back on after your changes are made. The number of times I have forgotten to switch an automation back to Active after pausing it for changes is staggering over my career, and it has made me very nervous about ever pausing any existing, running automations.

Bulk sending throughput
One of the major benefits of using an automation over a journey for bulk sends is the throughput available. What can take Automation Studio 20 minutes could take Journey Builder 2 hours or more. As time has gone on, this gap has shortened, but it is still something to consider.
There are quite a few other considerations to determine which is the right tool to use to send your email, but this is one I like to make sure is mentioned, as it can have a significant impact if not recognized.

Snapshot of email at scheduling
The biggest benefit I have seen of using Automation Studio scheduled automations is the ability to take those bulk emails you have been scheduling via the wizard in Email Studio and instead make a Send Email activity (also called a user-initiated email), connecting the audience, properties, and email assets into a single object. You then pop that into an automation and schedule the automation.
How is this different? Well, when you directly schedule the email via the wizard, the email is preprocessed and a snapshot of all the content and scripting is made. Because of this, if you make any changes to the email after this point but prior to sending, those changes will not be reflected in the send. You would need to cancel the scheduled email and reschedule it. If you use a Send Email activity inside of a scheduled automation, the email is not snapshotted until the activity is initiated inside the automation (at send time), allowing all content and scripting changes to be included.

File drop automations
File drop automations were previously called triggered automations because they are triggered when a file (meeting certain criteria) is dropped onto the SFMC SFTP. The neat thing about this type of automation is that it can run in near real time (reliant on your data drop timing) and that runs can be queued up to execute after the initial run is completed.
File drop automations have a lot of unique and great features and capabilities that separate them from scheduled automations.
File drop automations not only allow for immediate, real-time running capabilities, but also offer a flexibility and versatility in file naming conventions and the usage of substitution strings that are not available anywhere else.
As a note, though, there are a few considerations and requirements that need to be taken into account when utilizing a file drop automation. They will help you to optimize your usage and make sure you create the most efficient and powerful solution for your needs.

Salesforce Marketing Cloud SFTP
To utilize the file drop automation, you will need to have access to the Marketing Cloud Enhanced FTP, which is usually enabled by default in all new accounts, but you may need to talk with your account executive if you do not see it available in your enterprise. Although FTP shows in the name of the business rule, it is in fact an SFTP, providing that added level of security.
The reason you need this enabled and accessible is that file drop automations will only register file drops on the Marketing Cloud SFTP and not on any third-party FTP services. Even though you can use imports, exports, and file transfers with multiple third-party locations via file locations, this is not the case for file drop.

Important considerations and tips
As with everything, there are certain contextual considerations and thoughtful approaches around the tool. They need to be weighed up when you are planning and building your strategy for your automation usage and structure. Most of them are not limitations or drawbacks, but more along the lines of ways to optimize your usage of the platform and ensure it is future-facing.

Default directories
It is not recommended to use the default Import or Export folders for this activity; instead, create custom subdirectories. This is because the more listeners there are on a folder (also called a directory), the more complex things become, and the higher the risk of something breaking, failing, or otherwise working incorrectly.
By utilizing sub-folders instead, you are able to separate out and organize your triggers
into separate locations, keeping them more focused and efficient. These folders are very
easy to create and work with through any FTP tools, such as FileZilla, to access your
Salesforce Marketing Cloud SFTP and interact with it.

Multiple file drop listener events
Although, with a naming convention defined, you can put multiple listener events in a single folder, only one automation can run per file. So even if your file meets all the criteria for each of the listener events, only the first one will be run; they cannot be triggered from the same file.

Run Once
Inside of scheduled automations, there is the option to use Run Once on an automation
to manually test it in its entirety or to test parts of it. Inside of file drop automations, there
is no Run Once option nor any equivalent method. It instead uses Active and Inactive.
To test an automation, you would need to drop a file while it is Active and run it the same
as if it were live action.

Use cases for file drop automations
The use case for file drop automations usually stems from the immediacy and real-time capabilities they have, but this is not the only reason to use this type of automation. A file drop automation can also be used to run something more often than the scheduled automation limit of a 1-hour recurrence. It can also queue up files, so that even if the automation takes 2 hours to run, files dropped before it completes will not disrupt the running instance, but will instead queue up new runs to start after that instance is complete.
One sample scenario for a file drop automation is when you need to receive and import a compressed file upon drop, so that SFMC is using the most up-to-date information possible via a file drop integration with another data platform. Through the file drop, you would be able to trigger the automation once the SFMC SFTP receives the file; it would then go through the correct activities, such as File Transfer, Import, and so on, to correctly manipulate the file and get it imported to the right place.
Another sample scenario is if you have a time-sensitive reminder or notice email that needs to go out once you drop the sendable data onto the SFMC SFTP. Basically, you would have the file drop with the correct settings in place. This would initiate the automation, which would then import that data into the correct data extension and run a user-initiated send that uses that data extension (DE) as the sendable audience, sending out the email once the data is done being loaded in.

Your file drop setup options
The file drop automation is focused solely on a file drop. This trigger can be as honed or as broad as you like.
Through different options, such as selecting specific folders and naming conventions, you can make sure it will only run based on the exact file drop you want it to. This can be separated into two sections:

 • FileName Pattern: You can set it to match specific filename patterns inside of a folder.
• No FileName Pattern: You set it to run on every file loaded into a specific folder.
Here is an example of what the File Drop Setup screen looks like:

Figure 2.8 – An example showing the setup window for a file drop automation
As you can see from the preceding screenshot, there is a clear selection that can be made
from the options (No Filename Pattern or Use Filename Pattern) as well as a folder and
subfolder visual pathway that you can use to select specific directory locations in your
SFTP to target.

Filename patterns
Filename patterns are used to set the filename filter or naming convention to be used
to decide which files dropped to the corresponding folder will trigger your automation.
Files that meet the filename requirements (case insensitive) are processed as Automation
Studio parses them. This means that each file is loaded and read within the context of
Automation Studio's file parser.

Note
The automation engine will not include any file extensions (anything after the
dot, ., in the filename) so do not include this in your filter or it will fail.
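Conceptually, the matching described in the note can be sketched as follows. This is an illustrative model only (using a contains-style match, with the operators covered next), not the actual SFMC matching code.

```python
# Illustrative sketch of the note above: the engine matches patterns against
# the filename with its extension removed, case-insensitively. This is a
# conceptual model, not SFMC's real implementation.
def matches_contains(filename: str, pattern: str) -> bool:
    base = filename.rsplit(".", 1)[0]          # drop anything after the last dot
    return pattern.lower() in base.lower()     # matching is case insensitive

print(matches_contains("Daily_Feed.csv", "feed"))      # True
print(matches_contains("Daily_Feed.csv", "feed.csv"))  # False - never include the extension
```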
Operators available for filename patterns
To allow for versatility and to account for nearly every possible need when working with dynamically named files, the file drop automation filename pattern allows you to choose between three operators. These three should cover your needs and let you make as complex or as simple a solution as you need:

 • Contains – Similar to LIKE in SQL and just looks to see if those characters exist in
that order in any file (for example, LIKE '%FNP%').
• Begins With – Similar to SQL's LIKE with no wildcard at the front (for example,
LIKE 'FNP%').
• Ends With – The reverse of Begins With. SQL's LIKE with no wildcard at the end
(for example, LIKE '%FNP').","Here is a screenshot of what is displayed on the platform for reference."," Figure 2.9 – Filename pattern options"," Note on Wildcards in Filename Patterns"," Wildcards that are available inside of the Import activities and File Transfer"," activities are not available inside of the file drop filenaming patterns. The"," field for file drop automations is a literal string, meaning that if you type in"," myfile_%%month%%, it will look for literally myfile_%%month%% and"," not myfile_01 as it would inside the import of file transfer activities."," There are no wildcards (that I am aware of) that work in this field, so you"," would need to account for this when setting up your naming convention. For"," example, if your files begin with YYYYMMDD_myfile, then you would need"," to do an Ends With _myfile as the variable part is not definable inside the"," file drop automation.","\f Automation Studio overview 55","Now that we know how to create filenaming patterns, what happens when multiple files
match the criteria?

Multiple filename patterns in the same folder
I wanted to make a note on what will happen if you have two filename patterns in the same folder and the dropped file matches both patterns:

 • Only a single automation will be triggered, not both.
 • The filename pattern that is matched first is the one that will be triggered.

For example, say we have two filenaming patterns, CONTAINS Bootcamp and
CONTAINS OrderDetails. These are both in the same folder. If we drop a file named
Bootcamp_OrderDetails.csv, what would happen?
The answer is that the automation with the filename pattern of CONTAINS Bootcamp
would be triggered and the other (CONTAINS OrderDetails) would be ignored
despite matching on the filename because Bootcamp comes before OrderDetails.

No filename pattern
If you choose to utilize the option to forgo a filename pattern, it basically creates a very
broad stroke of possibilities to trigger the automation. To help contain this, each folder
can only have one file drop associated with it that does not have a filename pattern
assigned. By locking the folder for other listener events, you prevent issues like the ones
we discussed when talking about multiple filename matches on the same folder.
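That first-match behavior can be sketched as follows, under the reading that the pattern appearing earliest in the filename wins. The listener list and helper function are purely illustrative, not SFMC internals.

```python
# Sketch of first-match-wins for multiple listener events on one folder.
# Assumption: the pattern found earliest in the (extension-stripped,
# lowercased) filename is the one that fires.
def first_matching_listener(filename, listeners):
    base = filename.rsplit(".", 1)[0].lower()
    hits = [(base.find(pattern.lower()), name)
            for name, pattern in listeners
            if pattern.lower() in base]
    return min(hits)[1] if hits else None  # earliest match in the filename wins

listeners = [("Bootcamp automation", "Bootcamp"),
             ("OrderDetails automation", "OrderDetails")]
print(first_matching_listener("Bootcamp_OrderDetails.csv", listeners))
# Only "Bootcamp automation" fires, even though both patterns match
```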
This also means that any and every legitimate file dropped into this folder will cause
your automation to run. Whether it is blank.txt or myRealFile.csv, it will trigger
the listener and begin running your automation. Depending on the purpose of the
automation, this can cause errors or the potential for unintended runs, and so on.
You will need to account for this in your setup and planning.

Benefits of file drop automations
When you mention file drop automations, there are a couple of big features that need to be considered outside of just the real-time capabilities. For instance, file drop automations can offer queueing and special substitution strings that are not available anywhere else. We'll explore these capabilities next.

File queueing
File queueing lets you specify whether to create a queue of multiple triggers if a second file is dropped before the first run is completed. You can also turn this off
if you do not want the triggers to queue:

 • If you queue your runs, even if the previous run fails, it will still go through the
queue until it is finished.
• The queue will run through the triggers in the order they were triggered, but it will
use the most recent files.
• So, if you overwrite a file in a file drop for a later queued trigger, it will not use the
original data, but the new data.
• If you turn off queueing, any triggers that happen during a currently running
instance will cause that instance to stop and restart based on the new file trigger.

File drop substitution strings
Inside file drop automations, there are custom substitution strings (similar to personalization strings) that can be used in File Transfer activities or Import activities. These strings allow you to reuse the name of the file that triggered the automation (via filename pattern or folder drop) in subsequent steps inside the automation, and there is also the option of date/time substitution strings. This should help ensure that the activities inside this automation type utilize exactly the right file.

Overall features and capabilities of Automation Studio
One of the main features of Automation Studio is its data management capability. This includes Extract, Transform, Load (ETL) processes, such as segmentation or data manipulation, as well as its relational database capabilities and object interaction.
One of the major uses of Automation Studio relates to its data manipulation and relational database capabilities. This can include the manipulation of sendable or relational data as well as the creation of custom reporting or tracking. You can also utilize Automation Studio for custom solutioning using Script activities and many of the other utility-type activities available.

ETL processes
ETL is the commonly used process of extracting data from one or more sources, transforming it, and loading it into a destination system. This destination system then represents the data differently from the original source(s) or inside of a different context.
ETL also makes it possible to migrate data between a variety of sources, destinations, and
analysis tools. This makes it a critical aspect in producing business intelligence and in the
execution of broader data management strategies.
ETL allows you to do things such as segmenting your data in-platform from Filter
activities, SQL Query activities, or even in Script activities. This can allow you to shape
your audiences in ways that allow you to minutely target and effectively communicate
your messaging. It also allows you to manipulate and form data for analytics
and reporting.
That being said, SFMC is not designed to be an ETL system and should not be utilized for major or frequent ETL needs. It is optimized for quick segmentation needs and more tactical-type manipulations, but once you get to transforming or otherwise interacting with large volumes of data or high-frequency interactions, it tends to fall flat. To this extent, there are limitations that need to be considered for these types of activities within the platform, and you should consider performing these actions before bringing the data into SFMC, or inside an external tool that is designed for them. This will make your
process much more stable and efficient.

Relational database capabilities and object interaction
Through actions such as the ETL capabilities described previously, in SQL Query activities or Script activities, you can take data that is related via a foreign key and combine or otherwise interact with it. This can allow you to form a master marketing database, if you want, that pulls all relevant data into a single location to be used as sendable data. You would just utilize a SQL Query activity that joins the data based on these keys and then outputs the resulting data into your targeted marketing database data extension.
Automation Studio also allows you to manipulate and form custom relational objects that are completely separate from all other data in the platform. These are called data extensions. Through scripts, you can do a ton of things with data extensions – including creating one, filling in its rows, manipulating/adding/deleting fields, and even copying or deleting it. This can give you full access to automate container management in your enterprise. You can then link these relational objects for future needs via scripting and lookups based on the foreign keys.

What the heck is a foreign key, you may ask? I know what a primary key is, one of the
unique identifiers for an object that enables define, add, or update capabilities – like the key to my house – but a foreign key (shrug)? A foreign key is a field or column that provides a link between the data in each relational database table or object. This is what you match
or join on, to utilize a SQL reference, to ensure that data is connected in the correct way.

Custom reporting and tracking capabilities
Using a lot of the previously mentioned aspects, not only are you able to segment and prepare or massage your data prior to sending, but you are also able to do all this post-sending to create your own tracking and reporting objects.
By querying built-in objects, such as data views, you are able to get interaction and tracking data and utilize it inside of SQL Query activities to build your own reports based on your custom data, integrated and infused with the platform tracking information. You can also utilize Data Extract activities and File Transfer activities to do bulk drop integrations to an analytics platform. You can also utilize the APIs and scripting languages to build your own dashboards, reporting platforms, or other displays, as well as further integrations of Marketing Cloud into other systems.

Example custom solutions and capabilities
Outside of the data capabilities, there are a ton of other actions and activities that can be handled or built inside of Automation Studio. This is honestly my favorite aspect of Automation Studio – its ability to run custom solutions and actions. The possibilities of these solutions and actions are much too far-reaching for me to list out here, but the following are some of my favorite aspects:

 • Script activities that fully build out your email campaign needs (data extension
creation, segmentation, suppression lists, emails, and user-initiated email sends
(Send Email activity)) and then build and schedule the automation to run it
• Script activities that provide helpers such as an automation dashboard or data
extension inventory report
• The ability to create, delete, update, schedule, pause, and stop other automations
• The ability to pre-assign coupon codes for campaigns
• Fire Journey Builder events or journey audiences
• Automating multi-channel sends
• Refreshing filtered lists and data extensions including mobile lists","Now that we have a strong understanding of Automation Studio and Journey Builder,
let's dig into how they affect multi-channel benefits.
\f Automation Studio overview 59","Automation Studio activities","Let's take a quick look at the guts of Automation Studio and get a quick overview","of some of the more prevalent activities available in it. These puzzle pieces are what","we lock together, like a jigsaw puzzle, to form the beautiful image that is our","completed automation.","Each one of these puzzle pieces is extremely powerful in its own right and deserves to","have a detailed explanation, but for now, we are just going to give a brief overview of each","and then go into details as we dive further into the book. Heck, I could probably sit here","and write an entire book about the nuances of every one of these activities and the pros/","cons and caveats/features of each.","Popular activities
The following is a list of all the popular activities utilized in Automation Studio. These are
not all of the activities as there are a ton, and some of them I joined together into groups
as they all do similar or related actions:

 • Send Email activities: Activities that execute a user-initiated email or Salesforce send
 • SQL Query activity: Runs an SFMC SQL query
 • Import File activity: Imports a file into a data extension or list
 • File Transfer activity: Manipulates a file in a defined file location
 • Script activity: Runs an SSJS script
 • Filter activity: Executes a filter definition (lists and data extensions)
 • Data Extract activity: Extracts data from multiple sources
 • Journey Builder activities: Fire events and journey audiences
 • Refresh (Group/Mobile List) activities: Refreshes a list or group that already exists
 • SMS/GroupConnect/Push activities: Multi-channel options for messaging
 • Utility activities: Activities such as Verification and Wait
These activities are instrumental in the strength and capabilities of Marketing Cloud
Automation Studio. Later in the book, when we explore in-platform automation
capabilities, we will learn more about how some of these activities can further enable
automation capabilities. We now have a strong understanding of both Journey Builder
and Automation Studio separately, but what about if we were to compare them?
Comparing Automation Studio and Journey Builder
Now that we know what both tools are and a good amount of detail about each of them, it's time to compare and contrast them. We know both tools are super-powerful and effective, but each is different and unique. There are strengths and weaknesses to each, as well as fun little nuances and unexpected behaviors.
This is true not just in general capability, but also in things such as planning, strategy, execution, tracking, and more! How you build a program or journey can have a great effect on the output, including things such as throughput, efficiency, integration, and capabilities. Let's dive into these differences and see what we can find.

The key differences
There are quite a few different discussions that can be had about the differences between the two tools. These differences do not necessarily mean weaknesses, nor are they otherwise negative. In many of these cases, there is no clear-cut winner; they are just different! Which one works best or best suits your needs fully depends on what your context and your goals are.
The following are a few of the key differences that I have noticed over the years dealing with both tools. Please do note that the following key differences are accurate as a general rule, not specific to your situation and context. With that factored in, there is always the potential that it may be different in some way. So please, always do your own testing and research prior to making any final decisions.

Multi-channel mayhem
While both Journey Builder and Automation Studio offer multi-channel ways to send
marketing messages, they each do so in different ways and at different levels. It is certainly
not fair to directly compare them – think apples to oranges – but we can show the
differences in usage, philosophy, capability, and so on that are inherent in each.
Messaging strategy
Journey Builder provides a stronger messaging strategy benefit due to its 1:1 messaging
focus. As each person will usually enter the journey individually, compared to large
batches in Automation Studio, it enables you to better view and act on results and
behavior in real time, instead of needing to wait for the bulk action to conclude. This
includes Journey Builder providing more possibilities to test different strategies and
optimize your messages based on engagement and interaction.

Functionality and usability
Automation Studio is focused on bulk activities and ETL. Although Automation Studio is the more powerful tool, Journey Builder is much more marketer-friendly with its drag-and-drop interface and functionality, so it requires much less technical knowledge and know-how to utilize. In Journey Builder, marketers can use the drag-and-drop functionality to pull in audience decision splits, create customized wait times, and configure firing events that allow customers to be injected from almost any data source.

Segmentation stipulations
In Automation Studio, data filters and queries, using SQL, are built and leveraged to
assist in segmentation needs, which can be much more confusing. For instance, building
segmentation and A/B testing in Journey Builder would require dragging and dropping
a split, such as a decision split, and then putting in the logic. To do this in Automation
Studio, you would need to utilize SQL Query activities, which requires SQL language
knowledge, among other activities, such as filters, user-initiated sends, and more in
a much less intuitive way.
The user-friendly approach to segmentation and filtering is great for those who are not
expecting large-volume or high-frequency interactions on the journey. The splits, filters,
and segmentation in Journey Builder are limited and will greatly slow down the journey
as the filter runs every single time on every single record. So, for instance, if you do a bulk
import of 10,000 records to your journey, your segmentation will run 10,000 times, once
for each record – whereas in Automation Studio, the SQL query or filter would only run
once and affect all 10,000 records.
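To make that contrast concrete, here is a toy sketch in plain JavaScript (not SFMC code — the function names are purely illustrative) of why a per-record split multiplies evaluations while a batch query evaluates once:

```javascript
// Toy illustration: a Journey Builder-style split fires once per entering
// record, while an Automation Studio-style query runs once for the batch.
function journeyStyleSplit(records, predicate) {
  var evaluations = 0;
  var passed = [];
  for (var i = 0; i < records.length; i++) {
    evaluations++; // the split logic executes for every single record
    if (predicate(records[i])) {
      passed.push(records[i]);
    }
  }
  return { passed: passed, evaluations: evaluations };
}

function automationStyleQuery(records, predicate) {
  // One activity execution filters the whole batch in a single pass.
  return { passed: records.filter(predicate), evaluations: 1 };
}
```

With a 10,000-record import, the first approach reports 10,000 evaluations and the second reports just one, which mirrors the throughput difference described above.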
Throughput and volume

Automation Studio offers a level of throughput and processing power that is well above
and beyond what is available in Journey Builder. With this throughput and power, you can
send millions of records in much less time than you could in Journey Builder. This can
help ensure your audiences get the messaging you want within the timeframe you want to
send it to them. Another thing of note is that the more complex a journey gets, the more
this affects throughput and processing locally in the journey.

Is mimicry really the best compliment?

Journey Builder is built on triggered email messages, which is great, but for the most part,
this means that what you build in Journey Builder you can mimic in Automation Studio.
So, with a lot more technical effort and planning/setup, you can build a better-performing
version of pretty much any journey. Now, that comes with the caveat that Automation
Studio would likely need a lot of duct tape and bubble gum coding and setup to fully
mimic it, so the value of the performance is not likely to be worth the level of effort and
risk to build it in Automation Studio.

Reports and tracking

Lastly, there is the comparison of reporting and analytics capabilities. Inside Journey Builder,
you have a more real-time view of your analytics and tracking. This includes built-in
goal reporting that does not exist in Automation Studio. This, combined with real-time
tracking, such as what you see in triggered sends, gives insights in a very different and
more agile way than what you get in Automation Studio.

That being said, there is not really much you can do on custom reporting,
automated exporting of your tracking data, or deep dives through
micro-analysis. Journey Builder is very much just an out-of-the-box tool when it comes
to reporting.
The closest it can provide to allowing for custom reporting is the ability to
update contact or send information to Sales or Service Cloud.

Automation Studio, on the other hand, provides all kinds of possibilities for custom
reporting, extracts of data, and deep dives into data. Through SQL Query activities,
you can build your own datasets and manipulate, join, or otherwise transform them to
the form you want them to be. From there, you can even either put all this into an email
or utilize it on a page and distribute that link via an email, all inside a single automation.
There is also the ability to utilize tracking extracts to pull all the interaction data,
subscriber data, and so on from Marketing Cloud and push it to an FTP to be
digested externally.

Which one should I use?

This is probably the question I hear the most whenever I talk to someone new to the
platform. The question sounds simple enough and therefore should have a very simple
answer, right? Well, kind of. The answer is: it depends.

The answer to this question is based completely on the context, personal preference, skill
level, use case, business goals, environment, and integration considerations. With all these
factors and more, finding which one is the best for your use case is a bit of an adventure in
and of itself.

Now, in some cases, the answer is not really all that important and you can just go
with personal preference and be done with it, but at other times, this decision can have
a significant impact on your future capabilities and overall efficiency.
To this extent,
I always highly recommend putting in some cursory analysis prior to each build to see
if you can find evidence that you need to do a deeper-dive pro and con list.

For that cursory analysis, I have a few questions I always recommend upfront and feel
they are relevant regardless of context:

 • Will your audience require real-time interaction?
 • Is your audience entering your process via bulk imports or individually?
 • Will the person building and managing this have technical knowledge or skill?
 • Will this need to interact with Sales and Service Cloud?
 • Is there going to be some complex multi-channel marketing?

From these questions, you can build a very basic analysis of which tool would be the
better option. I highly recommend adding your own contextual questions to the preceding
list as these are just very basic questions and although they're fairly simple, they do not
always have simple answers.

With our powers combined

Sure, there can be times where you have to make the choice of Automation Studio
or Journey Builder for your needs, but I find that more often, of late, it is instead
Automation Studio and Journey Builder that truly meet your requirements. Despite
these two having the same general purpose, this does not mean they are exclusive
or competitors of each other.

The different capabilities and strategies of these two tools actually dovetail together very
well! All the weaknesses of Journey Builder are strengths in Automation Studio and vice
versa. This allows for an extremely strong process across every aspect as one tool picks up
the slack of the other through a combined effort. I wonder if there is a good analogy that
could help explain this?

The peanut butter and jelly sandwich

If you had peanut butter on your left and jelly on your right and were told to make
a sandwich, would you only choose one or the other, or would you choose both? I can
tell you that the more satisfying and delicious option is to use the combination of peanut
butter and jelly – a staple in most households with young kids.

This is not to say that utilizing each individually or with other ingredients is wrong.
You might have a peanut allergy and cannot have peanut butter. You might like Nutella
better and use that instead of one or the other, and so on. But in general, if you have
both ingredients, you would wind up with a peanut butter and jelly sandwich more often
than not.

By utilizing both, you enable yourself to provide highly targeted and highly personalized
messaging in a timely manner with efficient processing and throughput. Using the
real-time capabilities of Journey Builder helps to make you more agile in your approach,
while the bulk capabilities, powerful processing, and customization capabilities of
Automation Studio provide a strong base to make Journey Builder effective and efficient.

Automation Studio and Journey Builder interactions

Automation Studio is a great complement to Journey Builder, and together they form
a very powerful toolset. Both can pass records back and forth between each other, and
since they are fundamentally different in purpose, there is little overlap of capabilities that
is not easily addressed and assigned.

For example, let's take a look at a few different use cases on how Automation Studio
successfully interacts with Journey Builder.

Entry events for Journey Builder

Automation Studio is a great way to prepare and manipulate data for entry events in
Journey Builder, such as the following:

 • Utilizing SQL queries, you can do most of the segmentation and filtering in bulk
prior to the audience being uploaded to the entry source.
• You can bulk update relational data prior to entry to allow for journey data to be
a more viable option and rely less on contact data, which slows down throughput.
Custom actions or manipulations that are not possible in Journey Builder

Automation Studio is a great way to supplement journeys through Script activities to
perform custom actions or manipulations on journey data that are not possible in Journey
Builder, such as the following:

 • Pre-assigning coupons
• Integrating and manipulating data for exit criteria
• Providing bulk manipulations of data for segmentation and splits
 • Updating relational data for up-to-date contact data utilization

Journey reporting
Automation Studio can be an effective reporting tool on journeys:

 • SQL queries to combine tracking, custom, and subscriber data
 • Data/tracking extracts to export data for use in third-party tools
 • Script activities to build integrations or massage data for dashboards

Entry source for Journey Builder

You can utilize Automation Studio as an entry source for Journey Builder:

 • Through a few of the activities in Automation Studio, you can push records directly
into a journey by firing an event.
• You also are able to utilize Script activities to hit the REST API entry point for
Journey Builder if you need more custom conditions around it.

This enables you to have full control of the entry process and you can perform most bulk
actions prior to entering a journey as they are combined into a single program and are
not separate.
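As a rough sketch of that REST-based entry option: the payload below follows the general shape of Marketing Cloud's journey entry event endpoint (POST /interaction/v1/events). The helper name is hypothetical, and authentication plus the actual HTTP call are omitted:

```javascript
// Hypothetical helper: builds the body for a Journey Builder entry event call.
// The field names (ContactKey, EventDefinitionKey, Data) follow the REST API's
// documented payload shape; everything else here is illustrative.
function buildEntryEventPayload(contactKey, eventDefinitionKey, data) {
  return {
    ContactKey: contactKey,                 // the contact entering the journey
    EventDefinitionKey: eventDefinitionKey, // identifies the journey's entry event
    Data: data                              // journey data fields for this contact
  };
}
```

In a Script activity, you would serialize this object as the JSON body of the request, which lets you wrap your own custom conditions around exactly when and for whom the event fires.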
So, all in all, the answer to "Should I use Journey Builder or Automation Studio?" is simply
both. The best solution is usually to utilize them both to create the most efficient and
powerful solution for your needs. The power-up you get when you successfully integrate
these two tools together for your solution is wonderful. I have seen campaigns go from
fairly simple to highly personalized, dynamic, and powerful within a couple of months
just by combining the two tools instead of concentrating on just one.
Summary
You should now be very aware of not only where Salesforce Marketing Cloud came from,
but its current capabilities and multiple channel usages. With the multi-channel and cross-
channel options, Marketing Cloud has grown into a titan of the market and become one of
the most popular and powerful platforms.
Along with general Marketing Cloud knowledge, you should also be well versed in the
two existing, baked-in automation tools inside of Salesforce Marketing Cloud. Both
Automation Studio and Journey Builder are very powerful in their own right, but each has
weaknesses, specializations, and considerations. These can make a huge difference in your
path forward in your automation adventure.
You can now better determine which tool fits better where, and all the caveats and
considerations you need to take into account when implementing them. It is always better
to utilize both where possible – rather than doing an or comparison, it should be an
and consideration.
In the next chapter, we will begin exploring more of the best practices in automation in
the Salesforce Marketing Cloud. This will include general automation best practices as
well as insights into best practices for each of the tools we've discussed, as well as them
both together.
3
SFMC Automation Best Practices
Now that you know the basics of automation theory and the automation capabilities that
are baked into Salesforce Marketing Cloud (SFMC), we will dive into the best practices
for utilizing and building such automation. Whether you are using Automation Studio
or Journey Builder or some combination of the two, following these practices should help
you find the most efficient solutions for your needs.
Best practice, especially in Marketing Cloud, can be a broad term; so, for this chapter,
we will be focusing on things such as performance, efficiency, scalability, contextual
decision making, and personal preferences. That said, best practice is not always the best
solution – so, take these as guidelines and not as gospel.
In this chapter, we will cover the following topics:

 • Best practices: This section will provide a quick overview of what a best practice is
and how we should apply it.
• Creating high-quality testing: In this section, we'll dive into testing, the sandbox
options in Marketing Cloud, and my recommendations on it.
• Is efficiency greater than performance?: In this section, we will dive into efficiency
and performance, as well as explore what makes a great solution.
 • Best practice is not always best: Although best practices should be respected and
considered, they are not always the best solution.
• You (your preferences) matter: The biggest gap most people have in designing
solutions is personal context. You and your opinion are very important.

These topics will provide you with general tips and guidelines on creating, utilizing, and
planning automation inside Salesforce Marketing Cloud. Through the basics, we can delve
into testing, performance, and efficiency regarding the major considerations for automation
in Marketing Cloud. Through this and some in-depth discussions on context and personal
preference, we will be set to move forward and begin working in Marketing Cloud. Now,
let's dive a bit deeper into automation best practices in Salesforce Marketing Cloud.

Best practices
Before we get into the details around best practices, we need to become familiar with the
basics and create a base to build upon. Without a strong foundation, you will find lots of
gaps and faults in your creation and it will fail fairly easily. Think along the lines of a house
of cards on a windy day sort of situation.
So, to help set you up for success, we will begin with the basics to ensure we pass valuable
and usable information and skills across to you. When I say basic, I mean basic.
Let's begin at the very beginning. What exactly does best practice even mean?

 Best Practice
Best practice, as defined by Merriam-Webster (https://www.merriam-
webster.com/dictionary/best%20practice), is a procedure
that has been shown by research and experience to produce optimal results and
that is established or proposed as a standard suitable for widespread adoption.
Or, in other words, through trial and error, this method has shown the best
success and is the recommended approach to guide your action or usage.

Best practices focus on maintaining quality and are not a mandatory standard,
especially as they can be based on self-assessment and/or benchmarking and not just
internal dictation.
Best practice allows you to create templates or boilerplates to build off of that have proven
to be effective, reducing the level of effort required to build your actions and activities.
A great example of a best practice that is used almost universally is creating a process for all
of the data views inside Marketing Cloud to be saved to data extensions. Since the specifics,
such as date range, how long the data remains in Marketing Cloud, and where it is stored, vary
wildly depending on the context, please feel free to adjust these according to your context.
Usually, an automation process is created that targets a group of data extensions you
created to mimic the data views. Then, it uses SQL Query activities to bring the data
across. In our example, I am going to have it run daily and perform an Update action
(add/update) for the corresponding data extension.
The following are some example steps:

 1. Build out the data views with fields, data types, and the maximum characters based
on what is listed in the official documentation.
2. Build out SQL Query activities that target the corresponding data extension while
using the data view as the source.
3. Create an automation process that is scheduled to run daily early in the morning.
4. Add each of these SQL Query activities to different steps in the automation process.
5. Bonus step: Add a data extract to take a snapshot of the data view each day
for reference.

This process will allow for quicker interactions with the data views (with a maximum of
a 24-hour delay in data) as querying and filtering a data extension is much faster than
interacting with a data view. It can then be used inside of Journey Builder via attribute
groups in Contact Builder. These benefits are all amazing, but the bonus of exporting
the data views is that you can insert them into analytics platforms for even deeper and
faster dives into your data. You will also have a snapshot point of reference if you are
investigating or troubleshooting an issue.
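The query-building part of the steps above can be sketched as follows. This is plain JavaScript for illustration only; the field list shown for the _Sent data view is a small subset of the real schema, and in practice you would paste the generated SQL into a SQL Query activity targeting the mirror data extension:

```javascript
// Illustrative sketch: generate the SELECT text for a SQL Query activity that
// copies a data view into its mirror data extension. Field lists here are
// partial examples; use the full schemas from the official documentation.
function buildDataViewQuery(dataViewName, fields) {
  return "SELECT " + fields.join(", ") + " FROM [" + dataViewName + "]";
}

// For example, a query for a data extension mirroring the _Sent data view
var sentQuery = buildDataViewQuery("_Sent", ["JobID", "SubscriberKey", "EventDate"]);
```

One such query per data view, each added as its own step in the daily automation, covers steps 2 through 4 above.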
Because of these benefits, this is considered a best practice that is universal for all who use
the platform.
So, now that we know what best practices are, we have some questions. How does
a best practice get established? What level of effort, information, or results is required to
optimize them? Does a best practice ever expire or get too old to be useful? How often
should it be investigated?

Establishing a best practice

The level of effort and work that's necessary to deem a practice as the best is rarely done
before the label is added. To this extent, we should be describing most of what we label as
best practices as smart practices or recommended practices. For the sake of understanding,
though, we will continue using best practice as the label for this book.

To establish a best practice, you need to tackle this in a similar way to proving a scientific
theory. This can be done through things such as the following:

 • Published research
• Competitive research
• Think tanks
• Adoption of the practice over time
• Acceptance of the practice over time
• Proven methodology
 • The achieved value being returned from utilizing the practice

As you can see from the preceding list, proving a best practice takes a lot of time and effort
to collect and verify. This is why it is fair to say that most things lauded as best practices are
not likely proven to be so. But just because they haven't been proven to be best practices
doesn't mean that they can't be considered as such by the community. Now that
we have some level of understanding, let's look at the level of effort it takes to maintain
best practices.

Maintaining best practices

Benchmarking your processes and activities is your best bet to find and measure your
best methodologies. Benchmarking is simply the process of measuring your processes and
results against those of your industry's leaders and comparing the outcomes. Through repeated
benchmarking and research, you can maintain the effectiveness of each of your practices
and adjust and rewrite them when it's relevant.

So, we know how to maintain and review best practices, but how often should this
be done? The answer is… it depends. There are a few things to consider when you're
reviewing your current best practices and judging if you need to verify whether they are
still the best options for you. You will need to review the following:

 • How complex is the practice/methodology?
 • How fast-paced or competitive is your context?
 • Is the level of value declining compared to previous data?
 • Are your competitors outperforming you?
 • Is the increased efficiency/value higher than the cost to review and alter your
   current practice?

Your answers to these questions will determine the level of frequency and level of effort
you will need to implement to maintain your best practices. The more questions that you
answer yes to, the more often and more stringent you should make your investigations
and research.
For example, let's say the following answers are provided for our currently considered best
practice multi-channel automation solution. We have a new approach that involves more
integrations and more capabilities for higher ROI, but is it worth it?

 • How complex is the practice/methodology?
   The methodology we are using is highly complex as it has to touch multiple
   databases with different architecture – requiring multiple custom integrations.
 • How fast-paced or competitive is your context?
   The context is fairly fast-paced, but not excessively so.
 • Is the level of value declining compared to previous data?
   Yes, the effectiveness of our data is in a steady decline and becoming less valuable.
 • Are your competitors outperforming you?
   Yes, they have a fully multi-channel marketing journey set up and
   running effectively.
 • Is the increased efficiency/value higher than the cost to review and alter your
   current practice?
   Yes; our new approach will triple the ROI of each entry into the journey.

As you can see, although there is a high ROI, there is a lot of investigation that needs to
be done due to all the yes answers. For the cost and level of risks that could be involved
in such a solution, you should consider whether this is worth implementing. So, I would
hold off for at least a couple of weeks, doing some heavy investigation and testing, before
looking to implement or change anything officially.

Keep it simple, stupid (KISS)

Although keep it simple, stupid is the way most people have heard it phrased, my favorite
version of the acronym KISS is keep it straight and simple. I feel that this helps
emphasize the ideal solution, which is not just the simplest but also the most linear
solution. Sometimes, the simplest solution can also be the most convoluted solution in
other ways. By keeping it straight and simple, you should align to the goal rather than just
concentrating on simplicity. (Plus, by using this version, the acronym is not calling you
mean names and demeaning your intelligence...
the big bully!)

The strongest and most forward-facing solutions are the ones that act in the simplest and
straightest path. Although complex and highly intensive solutions and automation can
be hugely impressive, they also potentially open up significantly more risk since the more
code you have, the more likely you are to have something go wrong. Just because someone
makes this highly technical awe-inspiring code block does not mean it is elegant. It could
just be needlessly complex.

An example of KISS
A good example of the KISS theory is around over-engineering a solution. Sometimes,
a developer will get so obsessed with the technical capabilities that they will forget
about the simpler solution. For example, let's say that data is being imported into a data
extension where some manipulations are being done by SQL; we need this data to be
transferred from the data extension into All Subscribers.
As a developer, you may be in the technical mindset and thinking about utilizing SSJS and
AMPscript, or maybe even the APIs or WSProxy to do this… but there is a much simpler
solution. You extract the data on the data extension, and then import that data extension
into All Subscribers. This would not only be simpler but likely also require less processing
and incur less risk.

Complex versus elegant

Many people have a misunderstanding of what the sought-after solution is. The most
technologically advanced and innovative solution is not always the best solution. Most
of the time, the level of knowledge, skill, and capability that's required for these solutions
makes them very inefficient and adds massive resource costs. This does not mean that
the best solution needs to be dumbed down or simple, just that it needs to be something
accessible and agile. This is then the beginning of considering what is deemed elegant
versus what is deemed complex.

Complex
More complex solutions tend to have more points of failure than those that utilize a more
minimalistic approach. This is because complex solutions usually have more points of
action or activity that are susceptible to human error or other risks of failure. The more
moving parts something has, the more places that it can break. Now, complexity does
not necessarily require additional code executions – it could also refer to things such as
overly complicated instructions or overtly confusing or difficult to read coding syntax
and structure.
Example of a complex solution

Utilizing something like the following is a possible solution:

 var arr = [];
 arr[0] = "First";
 arr[1] = "Second";
 arr[2] = "Third";
 arr[3] = "Fourth";

However, this is much more complex and explicit than just doing the following:

 var arr = ["First", "Second", "Third", "Fourth"];

The reason I am confident in saying this is because it is utilizing constant or hardcoded
values and not dynamic or variable values. There is no reason to separate each value like
that to build the array when the second example provides the same information in a much
more efficient and understandable way.

Elegant
This is not to say that you should not explore creative and highly technical solutions; you
just need to make sure they remain straightforward. The best word I have found to explain
this concept is elegant.
Elegant solutions are those that provide precision and simplicity that are unmatched. To
achieve precision, you need in-depth technical knowledge and understanding to find the
absolute most efficient way to accomplish your tasks and activities. The more elegant your
automation is, the more power and return you will get from your solution. By keeping
things precise and succinct, you free up more processing to be used in other areas,
allowing for even more possibilities.
An example of elegant processing

Let's build on the complex solution example we shared previously. In this example,
let's say that instead of a numerical order, we need to sort the array in alphabetical order.
There are some ways in which you could utilize a for loop or build multiple comparison
conditionals, but you could instead use the more elegant built-in array.sort function:

 var arr = ["First", "Second", "Third", "Fourth"];
 arr.sort(function(x, y) {
   if (x < y) {
     return -1;
   }
   if (x > y) {
     return 1;
   }
   return 0;
 });
 //returns ["First", "Fourth", "Second", "Third"]

The circle of development

All this talk on best practices, elegance, and complexity reminds me of a mantra that
I have heard over and over again throughout my career and that has helped me grow to the
level I am at today:

 • How can you tell a junior developer from a developer?
   A junior developer uses very basic and simple code to achieve the solution.
A developer uses highly complex and technical code to achieve the solution.
 • How can you tell a developer from a senior developer?
   A developer uses highly complex and technical code to achieve the solution.
A senior developer uses very basic and simple code to achieve the solution.
To help clarify the meaning of this circle, take a look at the following diagram:
 Figure 3.1 – The circle of development

This is saying that you, as a developer, go in a circle of complexity as you grow. You start at
the basics in a super simple way, but as you grow, things become more and more complex
until you start to become a more experienced developer. Then, you can start reverting
to finding the most linear and simplest solutions again to ensure elegance. This relates to
automation as you may find these crazy awesome scripts that you can use to do amazing
things but find that just using a native activity or two instead would be more elegant
and effective.

Less does not always mean more (elegant)

One point to keep in mind is that less code, processes, or activities does not mean your
code is elegant. Through some shortcuts and hacks, you can greatly reduce the amount
of code or activities needed, but this can also greatly increase the required technical
knowledge, as well as the processing requirements of the code. This will make it more
likely for performance degradation to occur, as well as points of failure to increase.

Elegance includes precision and not just compaction and reduction. So, if you jump
through hoops and utilize some hacky behaviors to reduce your code base and character
count, you may be making your solution worse, not better.
your script from 90 lines of code to 40 lines of code, but this will introduce highly
complex syntax and development theories. This limits those who can utilize and
understand this code. This also introduces risk as there is likely only a very specific way
it can be set up or it will fail – meaning that only those who are highly skilled and well
informed on the solution can edit, utilize, or maintain this code without great risk of error.
This is not how an elegant solution would be defined, despite being less than half the code.
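A small illustration of that trade-off (hypothetical names, plain JavaScript): dynamically generated property names save lines but hide what the code actually produces, while the explicit version documents itself:

```javascript
var values = ["First", "Second", "Third"];

// "Clever" version: property names are created dynamically, so item0/item1/item2
// never appear literally anywhere in the code and cannot be found by searching.
var holder = {};
for (var i = 0; i < values.length; i++) {
  holder["item" + i] = values[i];
}

// Straightforward version: longer, but every name is visible and searchable.
var explicit = {
  first: values[0],
  second: values[1],
  third: values[2]
};
```

Both produce equivalent data, but only one of them can be safely handed to the next person who has to maintain the automation.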
Now that we have covered the basic best practices in Marketing Cloud regarding
automation, let's start exploring some more specific aspects. The first stop is going to be
learning how hugely important testing is for ensuring accurate and effective automation.

Creating high-quality testing

No matter how great you think your solution is based on the self-created test data and
reviewing you have done, without high-quality testing data, you can miss giant gaps in
your process. Testing and testing data are instrumental to releasing a high-quality and
performant solution.

There is no place where testing, especially high-quality testing, is more important than
building automation. To confidently set it and forget it, you need to know it is going to
work as you expect and that there are no unturned stones.

Unfortunately, there are some things inside of Marketing Cloud that make this a bit more
difficult than what you may experience in other platforms. Let's explore a few of those
woes next.

Testing woes in Salesforce Marketing Cloud

Inside of Salesforce Marketing Cloud, it is usually much harder to carry out quality
assurance (QA) and test your automation or processes due to everything being in a live
environment. This can lead to requiring you to publish and make your solution go live
to test it. This then opens the risk that people outside of your organization or testing
group can access it before it is officially launched and finalized.

 Quality Assurance
Quality assurance is a process in which a group or individual will review all the
aspects of a project or solution to validate that it is a high level of quality before
the solution or project moves forward.
possibility, but this also alters your test results from what would exist in a live
environment. So, how do we handle this?
Well, I have seen the following two methods being utilized the most to help provide
a testing environment before production implementation/launch:

 • Creating test versions in your production business unit
 • Creating a new sandbox business unit

Neither of these is a perfect solution and each has some drawbacks. But out of all the
solutions, I would say one or the other should fit your business need and help guide you to
your solution for testing. Now, let's explore these two options in more detail to share the
positives and negatives of each.","Creating test versions in your production business unit","I have found that utilizing some test automation that is separate from the live automation","you are looking to use helps remove any accidental actions during testing that can provide","production actions before the go-live date. For instance, this would require us to shift the","following aspects for our test version:"," • File Drop naming pattern"," • File Drop folder location"," • Target data extensions"," • Script target and source data extension or content"," • Email and content","My usual go-to in these situations is to duplicate the existing content, data extensions,
and so on. Then, I just prepend it with DEV-. So, if your data extension name is MyDE,
you would create your test data extension with the name DEV-MyDE. This prepending
allows you to easily see that this is a development data extension and not a live one. It also
provides the full name of what is used in production for easy reference, while differentiating enough to remove any potential issues and risks associated with testing and development.
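The DEV- prepending convention described above is simple enough to automate. Below is a minimal sketch (the helper functions are my own illustration, not part of any Marketing Cloud tooling) that derives a development asset name from its production counterpart and back again:

```python
# Minimal sketch of the DEV- naming convention described above.
# These helpers are illustrative only, not SFMC functionality.

DEV_PREFIX = "DEV-"

def dev_name(production_name: str) -> str:
    """Return the development counterpart of a production asset name."""
    return f"{DEV_PREFIX}{production_name}"

def production_name(dev_asset: str) -> str:
    """Strip the DEV- prefix to recover the production asset name."""
    if not dev_asset.startswith(DEV_PREFIX):
        raise ValueError(f"{dev_asset!r} is not a development asset name")
    return dev_asset[len(DEV_PREFIX):]

print(dev_name("MyDE"))             # DEV-MyDE
print(production_name("DEV-MyDE"))  # MyDE
```

Because the production name is preserved in full after the prefix, a search for the live asset name will surface its development copy as well, which is exactly the easy-reference property described above.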
Now, although this is my recommendation, it can very easily bog down your business unit
with a ton of assets that may no longer be relevant and cause some confusion due to there
being multiple versions. The best way I have seen this handled is to create a good, solid
folder structure to account for this.
A good example of this folder structure is to have a DEV folder inside your main project
folder that will store all the corresponding assets. Also, make sure that you mimic
this structure across each studio – for example, Email Studio, Content Builder, and
Automation Studio. From there, once all the assets have been pushed live into the correct folders and testing has been finalized, you can delete all the development assets in those DEV folders.
Why would you delete it? Well, reducing the files that all share the same name will greatly
increase your capability to search for assets within the UI. It also cuts down greatly on
clutter and unnecessary file storage. Finally, it also removes the confusion of working in
DEV and Live and seeing which one has updated versions of what. The reason this works
so well is that it will also force you to copy off of the live version each time you need a new
dev version, meaning there is no possibility of old or otherwise incorrect content being
inside it. To help explain this, we will go through an example of this process.

Example of a test version versus a live version
Let's say we are working on creating a piece of automation that imports and manipulates, via SQL Query, our customer information. Before creating the final production (or live) version, we must create a development (dev) version to test with. So, for the test version, we would start with the following:

 • Automation: DEV-CustomerImport_Auto
• Data Extension: DEV-CustomerDE
• Import: DEV-CustomerImport_Import
 • Query: DEV-CustomerImport_SQL

Then, we would do the following for the live version (once development is done):

 • Automation: CustomerImport_Auto
• Data Extension: CustomerDE
• Import: CustomerImport_Import
 • Query: CustomerImport_SQL

This would cause you to have eight different objects compared to the four that you need
for production. However, this would create a separate environment for you to work with
the code without affecting anything that's live. This offers a completely clean slate for when
you begin production, without any of the development history potentially causing issues.
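To make the eight-versus-four object count concrete, here is a small sketch (my own illustration; the asset names come from the example above) that derives the development set from the production list:

```python
# Illustrative sketch (not SFMC functionality) pairing each production
# asset from the example with its DEV- counterpart.
prod_assets = [
    "CustomerImport_Auto",    # Automation
    "CustomerDE",             # Data Extension
    "CustomerImport_Import",  # Import activity
    "CustomerImport_SQL",     # Query activity
]

# Derive the development versions by prepending DEV-.
dev_assets = ["DEV-" + name for name in prod_assets]

# Four production objects plus four development objects = eight in total.
all_objects = dev_assets + prod_assets
print(len(all_objects))  # 8
print(dev_assets[0])     # DEV-CustomerImport_Auto
```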
Creating a sandbox business unit
Another popular option is to build solutions and automation inside a separate sandbox business unit for all the testing and then port it over to the live environment once it's been fully tested and approved. This is how most development processes work and it's tried and true. However, in Marketing Cloud, it comes with caveats:

 • The sandbox is just another normal business unit:
 It still has access to All Subscribers and production data via Contacts.
It still consumes super messages at the same rate as your production business unit for emails, web page views, and mobile messages. This means that Marketing Cloud will not differentiate your test sends from your live sends, so it will charge you the same amount per send. This can add up quickly.
There is a potential added cost by requiring an additional business unit on
your account.

 • There is no easy way to lift and shift your automation or build where there is no
risk of corruption or error:
 Deployment Manager and Package Manager are options, but see the Deployment Manager and Package Manager section for more details on them.
The level of effort for implementation and maintenance can be doubled.
QA and troubleshooting timelines will be doubled due to the two environments.
It increases the risk of potential failures when you're shifting across due to
human error.
This also entails risk as you have two places to keep updated or risk having an old
version overwriting the current, correct content/action.

 • Effort is duplicated in terms of setup, data management, and administration:
 You will need to update the data architecture across both accounts in unison.
You must create all the APIs and similar packages across both and store each
for reference.
You must create test data for your sandbox that is identical to your current live
data to allow for the highest level of success in testing.
Now, that is not to say in any way this is not a viable option – I just wanted to share the
differences from the normal sandbox methods compared to how it works in Marketing
Cloud. I have seen many major companies that utilize this methodology in their
Marketing Cloud environments. It is a secure way to ensure the separation of test and
development away from live production environments; it just comes with a heavy cost
increase if utilized.

Deployment Manager and Package Manager
These two applications exist inside Marketing Cloud with the idea of being able to duplicate certain builds, such as automation, emails, and data groupings, between business units or enterprises. Although this sounds great, these tools have their limitations and, in my opinion, they have more problems than benefits. This is not to say they cannot be used or are not a solution for certain things – but like many aspects, it all depends on your circumstances and a little bit of luck. Before we get too far into that, let's look at what exactly each of these applications is:

 • Package Manager: Package Manager is a fully supported native feature of Marketing Cloud that is used to bundle items such as journeys, automation processes, content assets, data extensions, and attribute groups. These bundles can then be deployed to other business units or enterprise accounts.

 Figure 3.2 – The Package Manager interface

 • Deployment Manager: Deployment Manager is a Salesforce Labs application that is not supported by Salesforce and is instead a use-at-your-own-risk application. Deployment Manager is the precursor to Package Manager and does nearly the same things, including bundling objects and deploying them across other business units or enterprises.

 Figure 3.3 – The Deployment Manager interface

The major issue that I find with both of these applications is that the deployment part appears to be somewhat fickle, to the point that even if you do not manipulate the bundle that's created, it will still fail to deploy, which can then create more work and effort for you to manually create a part of the packaged deployment.
This can have a domino effect on other objects.
I am sure that once Marketing Cloud works on Package Manager and gets it to be a bit more stable and robust, it will be an amazing application, but until then, I would be cautious about planning around it being successful, and would instead ensure you plan backups for if it fails.
Now that we have explored the two major options inside Marketing Cloud, we will begin exploring the development and testing that occurs outside Marketing Cloud and is then brought in.

Developing and testing outside Marketing Cloud
This one is the hardest to do, but if you have the tools and capabilities for it, it can be one of the best options. Essentially, to provide the best-quality testing environment, you must build an environment outside of Marketing Cloud that utilizes the APIs and other integrations to mimic capabilities you would normally need from within the user interface, such as email previews and running automation processes.
By utilizing an outside environment, you can store your tests locally, giving you version control that does not weigh down your instance, and you can do most of the testing without even requiring anything to be hosted on the platform. This sounds quite amazing, right? I mean, why isn't this the go-to for everyone?
Well, the main reason is that the public APIs' capabilities are limited. Without building
your tool to perform some hacky behavior on the user interface to utilize the internal
endpoints or functions of the UI, your environment will be super limited. There is also the
fact that you and your team would be required to utilize this tool and do nothing directly
inside of Marketing Cloud; otherwise, it could cause inconsistencies or misalignments.
This is on top of the huge development time to build it, plus the maintenance and
training/onboarding required for someone to utilize it, which can make the awesome
value it can add seem a lot less valuable.

 Figure 3.4 – Example flow of external development and testing

I have seen some awesome tools that utilize this process, and they are very impressive. It's just hard to find a good use case that lets them provide the full value they should. This is a shame, but a lot of the cool possibilities for integration via powerful API endpoints do not seem to be a priority for Salesforce Marketing Cloud.
Now that we have covered the basics of best practices, including best practices around testing, we should start investigating some of the larger topics surrounding automation that need to be taken into account when you're building anything. Next, we will explore performance and efficiency considerations.

Performance and efficiency in your architecture

 "Effective performance is preceded by painstaking preparation."
 – Brian Tracy

I feel this statement is a great mantra to have whenever you work with any automation in Salesforce Marketing Cloud. Without proper planning and preparation, your automation is not going to perform at peak levels, especially as your needs and capabilities grow. I usually look to have equal parts planning and development to ensure that what we are looking to build is the solution we need. I ask myself the following questions about every project or automation I work on:

 • Will the solution solve the problem?
 • Is the solution your best path forward to meet your needs?
 • Does this solution consider all future scenarios?
 • Can the solution handle triple the volume and frequency that's currently expected?
 • Is the solution fully planned out and developmentally sound?

Now, these are not the only things to consider in your planning, but I find them to be the
most helpful questions to help get you moving down the right path. In this section, we will
look at each of these questions and dig into performance considerations in general.

Will the solution solve the problem?
You would be surprised how many times, while planning, we get distracted by all the bells and whistles and all the innovative solutions that are available, and stray from the original need. The solution you build may be amazing at doing quite a few things, but if it does not do what you originally needed, it is significantly less valuable – which means you have to go back and replan and redevelop it. This can lead to very messy final forms, as you will need to get out massive amounts of duct tape and bubble gum to make it work, or simply rebuild it from scratch.
By keeping your need or problem central to your planning phase, you can build a strong base solution that provides you with the flexibility to build all those cool and innovative possibilities on top of it. This allows you to retain the highest value return by ensuring you keep your eyes on the prize and solve the issue that needs solving.

Is the solution your best path forward to meet your needs?
Just because a solution can resolve your need does not mean it is the solution you should pursue. Context is highly important to the solution, and although you should ensure the problem is solved, you cannot focus simply on that as the only priority.
By resolving an issue with a solution that does not fully consider the context, you leave yourself with a siloed solution, creating gaps and disconnects from your other processes and solutions. These gaps can then lead to a significantly increased risk when you're implementing your solution. This added risk then decreases the viability of the solution, meaning it may not be a solution at all.
Let's explore the considerations around integrations and interactions to help reduce this risk and ensure you have a forward-facing solution.

Integrations and interactions
When you're building your solution, you need to consider how much interaction there will be with existing systems. With these integrations and interactions, there will likely be restrictions or certain requirements that need to be considered to ensure effectiveness.
Some of these restrictions can be things such as corporate policies. You will always need to consider things such as network security, data limitations, and encryption, and certain permissions or user restrictions will be required. This, along with technological limitations, such as governor limits and the direction/flow of solutions, can have a significant impact on your solution.
For instance, specific naming conventions can limit your options in terms of naming each activity or aspect of your automation. There could also be huge limitations if an integrated system can only accept 100 records per second; you would need to account for this when you're building your automation to ensure it never goes above that rate and overloads the system.
Next, we will dive into how skill level and training play a huge factor in your solution.

Skill levels and required training/onboarding
Not only do you need to consider the capabilities and restrictions of the existing systems and processes, but also those of the personnel. Just because a solution is available and possible doesn't mean that you can build it, nor that your team can use it properly.
By assessing your team's skill level, you can get a good feel for the level of effort it would
take to bring them up to speed on using the solution and the required level of effort for
onboarding any new users into this solution. For instance, a command-line tool built for a group of non-tech-savvy marketers is likely to require a very long training and ramp-up period, with a ton of training sessions and support. This can greatly reduce the value of the solution. Now, let's look at the effects that time and monetary costs can have on your solution.

Time, cost, and effort to build and maintain your solution
So, now that we know it will work with our existing tools and that our team can use it, we need to decide whether it's something we can build ourselves or whether the cost of having someone else build it makes sense for the value it will provide. As a previous mentor and friend always liked to say to me, You want to build something? You have three options: fast, cheap, or good. But you can only pick two of them for any project.
Sometimes, building a solution internally makes the most sense, as you can fully customize and plan it out to fit exactly what you need. But along with this comes the time, cost, and effort needed upfront to build it, as well as the time, cost, and effort to maintain and support it. These costs can add up very quickly and, depending on your available resources, this can take an extremely long time to get done.
So, by comparing this cost against the available vendors or agencies who could complete the solution for you, you can find which option makes the most sense and adds the most value to your organization. It is always worth considering all the building options when you're planning on creating a tool, because building it yourself just because you can will sometimes turn a good solution into a bad one. To make a good solution, we need to keep future scenarios in mind; otherwise, it could quickly become outdated or need to be replaced.

Does this solution consider all future scenarios?
You should always consider the future state when you're making your plans. This means that you need to consider things that you have not built yet and account for other needs or issues that do not currently exist.
If you do not prepare for the future, you will limit your capabilities and require rework or additional solutions at a later date.
In the next few subsections, we will dive into a few of the major considerations to keep in mind while you're validating that your automation/solution has future possibilities in mind.

Future integration possibilities
When you're building a solution, you should always keep in mind that it will likely have to interact or integrate with other systems – potentially, systems that do not even exist yet. To that extent, you need to create some sort of service or connector that can accept and handle these new systems.
Probably the most important part to consider is that if the service or language is old or antiquated now, it will be completely useless in the future. So, when you create things, you need to build them with newer services in mind. For instance, nowadays, you would not build a service that utilizes a SOAP API connection instead of a REST one. Sure, SOAP can still get the job done and many places do use it, but it is on its way out, and to build something on it is to put a definitive expiration date on it.

Handling capability growth and new additions
One thing to always keep in mind is that no matter how perfect or well thought out your final solution is, there are going to need to be new updates or additions to it in the future. This is inevitable and should be a major consideration in your architecture. Setting up your solution so that you can easily build on top of it will allow it to handle future needs and updates better.
Also, by considering platform and capability growth in the build, you can provide versatility for your solution.
This versatility will increase the lifetime of your solution and help it be much more valuable.

Data analytics and reporting
This is one part that tends to be left on the back burner when it comes to solutions. Without good analytics, logging, or reporting, the value of a solution significantly drops. Most of the time, you will notice that a solution is implemented first, and then a half-thought-out approach to reporting and logging is bolted on at the end, when the solution is already mostly done.
Logging is especially important. There is no better way to debug and troubleshoot a solution than to review the logs and find where it fell over. Without a log, it can easily take 10 to 20 times longer to resolve these issues, making logs an invaluable resource.

Can your solution handle triple the volume and frequency that's currently expected?
When you are building a tool, you need to go in with the idea that the volume and frequency you see now are just a drop in the ocean of what the future will hold. If you do not approach it this way, your tool will very quickly reach peak capacity and need a new solution to be built out, greatly reducing the efficiency and value of your solution.
My general rule of thumb for this calculation is to look at three times the currently expected volumes. Not the current volume, but the expected future volume you arrive at from your research. Multiply that by three and you should get your solution into a better place to ensure future capacity against future demand.
So, for instance, say that your current flow is around 1 million records per day into this process. With the efficiencies of the tool and the increase in productivity and capability, you can see it growing to 10 million over the next couple of years.
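The rule of thumb just described can be expressed as a quick sanity-check calculation. This is a trivial sketch of my own; the figures mirror this section's worked example:

```python
# Quick sanity check of the 3x capacity rule of thumb from this section.
# Size against the researched FUTURE volume, not the current volume.

def planned_capacity(expected_future_daily_volume: int, factor: int = 3) -> int:
    """Return the daily capacity to plan for: future volume times a safety factor."""
    return expected_future_daily_volume * factor

current_volume = 1_000_000    # records per day today (not used for sizing)
expected_future = 10_000_000  # researched volume over the next couple of years

print(planned_capacity(expected_future))  # 30000000 records per day
```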
At that point, my recommendation is to take that 10 million and multiply it by 3. This means that the capacity of your tool should sit at around 30 million per day.
The reason is that I have always found that the expected capability and capacity are almost always going to be much lower than the actual need, no matter how much you over-estimate. This is because, as things move forward, new variables are introduced: new programs and marketing strategies come into play, the sales context changes, and more, all of which can greatly change and affect the volume.
These factors, along with general technological updates and capability increases, can push flow beyond anything you could have believed in the initial plan. Also, I have always found that having extra processing power and not needing it is much less impactful to value than needing more capacity and not having it.

Is the solution fully planned out and developmentally sound?
This may sound like it should be common sense, and in many ways, it is, but it is something I find is most often a problem when I'm creating a solution. The issue is that with so many possibilities, considerations, contextual roadblocks, and integration variances, you can easily lose track of the core of your solution.
me to focus and keep things on the right path instead of in the stars, my solutions would
have greatly suffered. To that extent, in any and every project I work on, I have checks
written in every step that simply say something such as the following:"," • Check to make sure this makes sense"," • Don't forget to build the ship before you design the deck","It just needs to be something to draw my attention outside of my blinders and let me step
back and evaluate the solution along the way. Without this reminder, I could have become
distracted and completely missed some very important details.
A great example of this is that I wanted to build a simple import logging automation/
script inside Salesforce Marketing Cloud. Now, anyone who has looked into this will likely
be laughing at me for my usage of simple there. This process is far from simple.
Multiple REST API endpoints are needed to log every import and file from import
activities. To that extent, I went down a deep rabbit hole around File Drop automation
and imports and wound up creating a big process on logging File Drop automation and
their interactions with the FTP, instead of anything related to my initial solution. I spent
nearly a week on a solution that there was no real problem to solve for, instead of working
on the requested solution. If I had put in my checks and notes to remind me to evaluate
whether this makes sense, I could have redirected all that time and effort and got the
solution built much quicker than I did.
The method you implement to do these checks is not nearly as important as the checks
themselves. Choose the method that works best for you. I have seen people use post-it
notes, checklists, digital assistant reminders, calendar reminders, and to-do lists. My
favorite is to grab a teammate or someone who is not a major part of the project and ask
them for 10 minutes of their time to go through all the current highlights and forward
paths you are working on. This gives you an outside opinion and by forcing you to say it
out loud, you can catch gaps or unintentional fallacies in your path. To help give more
context on this, we will go through an answered example of this process.","Example of performance evaluation","The automation in question is built on Marketing Cloud, but is pulling data from","a segmentation that's been made on the company's data lake. It's sending the messaging,","followed by immediate reporting and analytics being sent back to the analytics platform","via an API.","\f Is efficiency greater than performance? 89","Let's look at the questions we went over for this example:"," Yes. Through this, we can get the target audience we want to send into Marketing"," Cloud, send the dynamic messaging to the audience, and then pass that information"," to our analytics system."," Since the segmentation solutions inside Marketing Cloud are not optimized for"," this level of segmentation and data preparation, it is more efficient and effective to"," use a data lake. This, coupled with Marketing Cloud's capabilities with dynamic"," messaging and the integration with our analytics platform, supports this as the best"," path forward."," This solution accounts for all audience sizes and for capabilities to adjust the"," integrations as needed. There should not be a future scenario that cannot be"," accounted for."," With the power of data lakes and the power of the dynamic messaging throughput"," of Marketing Cloud, we can handle volume and frequency well above our"," current needs."," We have implemented similar solutions previously with high success and this"," solution has been fully mapped out and developed.","Is efficiency greater than performance?","Now, let's look at a question that does not have an answer – some don't even feel like","they make any sense. The first thing we should do is evaluate exactly what efficiency and","performance mean. If you look up the two literal definitions, you will notice that they","are quite similar. Does this mean they are the same thing, though? 
A few people I know would vehemently say 100% yes, but I am not so certain.

Why am I not certain?
Are they related to each other? Very much so. Are they, at the core, identical? No! There is a fairly big difference between the two that is pretty well hidden in the language used to define them. In my opinion, efficiency and performance can be defined as follows:

 • Efficiency mostly focuses on reducing waste and optimizing how resources are
used. It aims to find the best way to create a solution with minimal impact or resource allocation.
• Performance focuses almost exclusively on the output. It is based on how much
you can get out of the resources you have. It is less concerned with waste and more
about the potential power you can get from your resources.","These are my definitions of these words. Through these definitions, I feel there is a better
understanding of the difference between these words. Using this understanding, let's take
a look at a solution that is performant, but not efficient:
Joe has a cabin in the woods and it gets extremely cold there during the night. He has to come
up with a solution to help keep him warm that is fairly easy to accomplish and effective.
As his cabin is in the woods and he has access to a ton of wood, he decides that the solution
will involve fire. With woodcutters, an ax, and a fireplace, it becomes obvious as to what his
solution should be.
His solution is to go out and chop down a lot of firewood and use that firewood in his
fireplace in his cabin. This will let him burn the wood inside his house, which will then be
able to retain the great amount of heat that's put out by the fire. All the waste (smoke and
ash) will go out through the chimney.
Pretty smart, right? Well, it is certainly a performant solution that outputs a lot of heat in
a way that can help keep Joe more than warm enough across each chilly night he is there.
With his abundant resources, he does not need to worry much about running out of fuel.
But this is not the most efficient way to resolve this.
By burning at least a couple of blocks of wood each night, he is going through his fuel at a fairly high rate – it is only the abundance of the fuel source that makes this viable. If he wanted an efficient solution, he would look to call in someone to insulate
and seal his home first. Even then, he could look up things such as geothermal heating
or other potential solutions that offer up to 70 or 80% higher efficiency of resources than
a wood fireplace could offer.
Is efficiency better than performance?
Yes, but also no. The answer to this question depends on what you need your solution to do. If you, like Joe, have an abundance of the necessary resource and are not worried too much about making sure that you get the most out of every piece, then performance would likely be your priority. However, if resources are scarce, it would make more sense to make sure that you get every last bit of productivity and value from your resources, regardless of whether it causes a slowdown in output.
There is no definitive answer to this question, but I would say that, more often than not, most people tend to place a higher value on performance than efficiency. This is best seen in the preceding example story – Joe did not want to spend those extra days, dollars, and effort getting all that work done upfront for a highly efficient solution; he went with performance to get his solution up and running at a quicker and lower upfront cost.
Finding efficiency is a significant effort upfront as it usually requires quite a bit of preparation, research, and investigation. It can also have the largest impact on the current process and methodology. Although it may be a better solution in the long run, the substantial costs and risks upfront can be far greater than the long-term value it can offer.
I do want to emphasize that a lot of the time, this determination is not people being lazy or rushing to just get something in. The decision is based on a ton of calculations and risk analysis. Sometimes, there is an obvious solution that is by and large the best one in the long term, but it comes with a lot of requirements.
The following are some examples:

 • It could take up to a year for it to be up and running and begin creating any value:
 Increased risks, due to the length of time to implement compared to the possibility of the environment or need changing.
 Exponentially increased starting costs, due to the high cost of the first year of work with no returned revenue to counter it.

 • It could require you to restructure the current methodologies and processes before they're implemented:
 Increases risks, as this restructuring could cause other issues that can incur more
costs or effort
Potentially high costs for rebuilding and training these new methods
Lower efficiency during the implementation of these new methods due
to unfamiliarity
\f92 SFMC Automation Best Practices"," • There is an increase in cost for other partner systems to implement:"," If it requires other partner technologies, you could be looking at having to buy"," or build solutions for things you already have solutions for, thereby increasing cost"," and risk.","With all these added to the calculation, that gap between solutions can substantially
decrease. I have seen cases where the most optimal and efficient solution would have taken nearly 10 years of running just to pay off its costs and catch up with the simpler alternative. I can tell you that within 10 years, if not sooner, the solution would likely be outdated and need to be re-evaluated, making it the less valuable option.
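To make that kind of calculation concrete, here is a minimal Python sketch of a payback comparison between a quick "performance" build and an expensive "efficiency" build. All of the dollar figures are hypothetical illustrations, not numbers from any real project:

```python
# Rough payback comparison between a quick "performance" solution and a
# highly "efficient" one. All figures are hypothetical illustrations.

def years_to_overtake(quick_upfront, quick_yearly,
                      efficient_upfront, efficient_yearly,
                      horizon=30):
    """Return the first year the efficient option's net value catches up
    with the quick option's, or None if it never does within the horizon."""
    for year in range(1, horizon + 1):
        quick_net = year * quick_yearly - quick_upfront
        efficient_net = year * efficient_yearly - efficient_upfront
        if efficient_net >= quick_net:
            return year
    return None

# Hypothetical numbers: the efficient build costs far more upfront but
# returns somewhat more value per year of running.
print(years_to_overtake(quick_upfront=10_000, quick_yearly=50_000,
                        efficient_upfront=150_000, efficient_yearly=65_000))
# prints 10
```

With these made-up inputs, the "better" solution needs about a decade of running just to break even against the quick one, which is exactly the kind of gap that can make it the less valuable option in practice.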
Now that we have a strong background and have taken a deep dive into best practices in general and into automation best practices in Salesforce Marketing Cloud, I want to start investigating how to use them. There are many considerations around utilizing a best practice beyond just implementing it as-is.
Best practice is not always best
We have a ton of background and strong knowledge in the dos and don'ts of Marketing Cloud automation. Great! Progress! But now, I want you to put all that to the side so that I can make a very important note about all that information.
These best practices are not be-all solutions for every automation process. Sometimes, you need to toss out "best" and use your own solution. You will find that what works for 9 out of 10 people is not the best solution for your needs. There are a ton of things that go into a solution and, due to all these variables, best practices cannot account for them all.
Think of best practice solutions as more of guidelines or boilerplates for you to use. From these, you can get a head start on building your solution and then adjust, add, and subtract as necessary to meld it to fit your unique situation. To that extent, I want to recommend the following things you should keep in mind whenever you build from a best practice solution:
 • The context of your problem and solution is paramount to a good solution.
 • The smartest person in the room is not always right.
 • Always test and research everything yourself – don't trust external results.
By remembering these three key topics, you should be able to strategically
build a solution that best fits your needs. Best practice does not mean it is always the best
solution, just that it is the best general method to get these general results.
Let's look at those three topics in more detail.
Context matters
One of the many things you hear tossed about is that No matter what you do, someone else
has already done it. I always hate this saying. Sure, something similar has happened before,
but it was not right here, right now, in this very specific way. Just because it is very closely
related does not mean it is identical.
Context is one of the most important things to consider in nearly everything. Even
something as simple as sitting a foot away from where you are can completely change the
context and understanding of a situation. Let me explain.
Story 1 – John's point of view
John is sitting down next to Jenny and Sam is sitting on Jenny's other side. John is
watching a fireworks display and Jenny leans over and blocks his view as she digs into her
purse. John missed the big finale, so he was not happy with the fireworks show since the
rest was pretty boring.
From Story 1, we can see that John was not happy with the fireworks show since it was
pretty boring for the most part. He did miss the big finale though because Jenny was
leaning over in front of him to dig into her purse for something. Because he only missed
a little bit of the fireworks show, he assumed that he likely did not miss much and
concluded the show was bad.
Story 2 – Sam's point of view
John is sitting down next to Jenny and Sam is sitting on Jenny's other side. Sam is
watching a fireworks display and he doesn't even notice when Jenny leans over in front
of John to dig into her purse as the big finale begins. Despite the rest of the show being
mediocre, the finale was the best Sam had ever seen and he left the show amazed with how
great it was.
From Story 2, we can see that Sam agreed that the show was not going great for the most
part, but that the finale was amazing. After seeing such an amazing couple of minutes,
Sam left the show feeling very positive about his experience and the show in general.
This is a great example of how the context of a couple of feet can completely alter the
information and understanding you have of a situation. Due to these differing contexts,
John may never come to this show again and never see the amazing finale; he may
potentially even poke fun at Sam for how much he loved it. On the other hand, Sam will
see it every year and not understand how John could think so poorly of it.
Now, we can see how important context is. What may be the best solution in the world
to others can be a John and Jenny moment for you, where the amazing ending cannot
be watched, so the solution is not for you. That does not make you wrong and the others
right. It just means that in your situation, it does not make sense to use that solution.
The story of your needs and your solution is as important – if not more so – than any
recommendation or best practice documentation you can find. Just because someone is
a master in their field or highly respected does not mean that they know the entire
situation and that you should trust them more than yourself.
Subject matter experts don't know everything
One thing that most successful automation campaigns need is subject matter experts. These are people that are intimately knowledgeable about the subject matter at hand. In this case, it would be experts in Salesforce Marketing Cloud and automation within it.
Subject matter experts are a valuable asset and a great resource for platform knowledge, but they are not infallible. Just because an expert tells you to do something in a specific way does not mean that is the best or only way to get it done. A high level of intelligence, skill, and experience in a platform, while impressive, does not mean that everything you say is accurate no matter what.
My main focus in this section is to make sure you keep in mind that when you call in an expert, them being the smartest person in the room does not mean they are always right. That being said, you shouldn't completely ignore any expert either. They got that status for a reason and probably have some valuable insights that you can use.
The focus here is to ensure you fully explore what is being said by the expert and verify that it is something that works for you and your needs, instead of blindly accepting best practices. This extra due diligence can wind up saving you tons of money and time that could otherwise be lost to bad advice from an expert.
Smart does not mean right
There are plenty of times where I have worked with a client, or helped someone in one of the many forums I frequent, recommending one approach, only to find out that the way their data feeds are set up, the way their systems are built, or what their company policies dictate makes my solution impossible or more difficult. This means there's more of an upfront cost than they may be willing to take on, which means my best practice solution sucks.
So, despite being a pretty smart guy with a lot of experience in Marketing Cloud and
similar platforms, I was wrong in what I recommended! Best practice and platform
knowledge can get you most of the way there, but without the full view of the situation and the nuances of the context, you will only get so far. Think of best practice as more of a great starting place whose blanks you fill in with your data, shifting it to best fit your needs.
I am sure everyone here can commiserate with this. That smartest person in the room has
to show off how much more intelligent, talented, and skilled they are than everyone else.
These people are fairly rampant in the technology world and unfortunately, they tend to
be the ones that make a name for themselves. Luckily, in the Marketing Cloud community,
it seems like this is less prominent than in most other technology communities.
This self-proclaimed smart person will fight tooth and nail to prove that whatever they
said is right and best, even if it requires them to make stuff up, bully you, or attack you
personally. As soon as someone starts doggedly fighting you about which solution is better
and is not just discussing or arguing, then you know you can stop the conversation and
look elsewhere. An expert is someone smart enough to know a lot, but also smart enough
to realize that they also don't know everything. There are tons of people out there that
know more about those things than they do.
Do not ignore the actual smart people
I am certainly not telling you that subject matter experts are not to be respected and listened to – far from it! I am just saying to take what they share as the advice and guidance it is, and not as a complete or singular solution. Combine that advice with your knowledge and understanding to find the best solution for your needs.
Subject matter experts are great collaborators and consultants as well! The more you work with them and discuss the context, the more they can get a feel for your needs and help guide you to the correct solution. I know tons of companies that will pay very smart people a bunch of money to sit and give them best practice advice for a few months on big projects or initiatives they want to implement. I can tell you that almost every single company has come back to say the cost was well worth it. Most of those that say it was not either hired the wrong person or refused to partner with the consultant properly.
As I mentioned previously, bringing in an expert as a consultant is something that is going to take a lot of time, as they will need to fully digest the entire context and situation of your organization. There are usually a lot of things that need to be considered, researched, and tested before you can align with a suggested solution.
Do research and testing
So, there is a ton of information out there on nearly any and every topic you can think of. This information is all at our fingertips and can mostly be accessed for free or at a fairly reasonable price thanks to the internet. We live in an information age that helps make research and investigation easier than it ever was before.
If that is true, then why would I tell you to do research and testing? Well, because despite there being a ton of information, not all of it is relevant or related to you and your situation.
Some of it may look related but give information that contradicts your actual situation, causing massive confusion and potentially leading to future issues or a reduction in quality.
In the infamous words of Thanos, Fine. I will do it myself. However, here, we are not aiming to collect Infinity Stones or talk about the failings of our subordinates – we are talking about doing first-person data analysis and research. Plus, I would say that most of us are not purple and gigantic. Other than that, we are very similar.
Benefits of doing research
There is a level of understanding and comprehension gained from doing research and testing to find a solution that just cannot be matched by grabbing data from other resources. Knowing how your company and its existing solutions work, and taking that into account, can take a good solution and make it great!
Not only does your understanding and comprehension grow by doing research and testing, but you also gain different viewpoints and perspectives along the way. By doing some legwork, you get exposure to many things you may not have seen otherwise and can dig up hidden landmines or opportunities to help guide your solution to an even better and smoother outcome.
Test yourself before you wreck yourself
Testing on your own, whether through proof of concepts or actual solution testing, is a great way to ensure the optimal solution is in place. I have found that even if you are going to hire an agency or contractor to build your solution, it is best to build a small proof of concept yourself first to get a feel for what is needed.
When you're building this concept, it will help you gather insights into the systems that will need to be included or integrated, the associated data and company policies, and anything else that would need to be considered in your solution.
The more information and guidance you can provide for the build team, the better the results will be. This, combined with end-to-end testing, will give you a level of scrutiny and familiarity that can lead to not just a strong solution but a much better maintenance and upkeep methodology.
Doing testing once the solution is completed is paramount as well. It not only ensures
that the solution was correctly built and works as expected, but also gives your team
much-needed familiarity and experience with the tool. This is especially important when
you're utilizing contractors or agencies for solutions. If the people that built the solution
will not be around in 6 months or so when something will most likely go wrong, you need
to have the knowledge and familiarity with it to be able to resolve it yourself.
You (your preferences) matter
Two of the most important things to consider when you're planning and building are your own preferences and insights. It is so easy to get distracted or lost in the millions of different references, resources, and investigations. You need to keep your head above water and always remember that, when push comes to shove, this is something that you and your team will need to build, maintain, and use. So, you need to be considered in the solution to make sure it's optimal for you and your team.
This can be a tough thing, especially for someone who may not have the confidence or outgoing personality to speak up, but you need to speak up for yourself and make sure you are included in the solution. The biggest roadblock to this is the dreaded imposter syndrome.
Imposter syndrome
Imposter syndrome is the state in which someone believes that they do not deserve
to be where they are. In this situation, you may feel that those around you are more
accomplished and will realize how out of your league you are and shun you. It is something
that is rampant for many people and it is something that needs to be addressed.
One key to success is knowing yourself and recognizing your weaknesses, but the other
key is to not let those weaknesses hold you back. It is fine to feel insecure at times and
feel scared or nervous, but it is when you let those feelings stop you from stepping up and
doing the thing that you want to do that it becomes an issue.
This requires a certain level of maturity and confidence. The best way I have found to
reach this level is to take a deep look at yourself and learn to trust yourself, let others tell
you no, and not to pre-emptively limit yourself. By freeing yourself from your limits,
you open up more possibilities and solutions for your automation due to a reduction in
self-doubt.
Trust yourself
This one always sounds like it's so easy, but it is a very difficult thing to do. There are all
kinds of self-help books, classes, and philosophies dedicated to just this. To that extent,
we know it is far from easy and requires a lot of work. By trusting yourself, you also allow
yourself to make decisions faster. You allow yourself to follow your gut instinct instead of
potentially overthinking things and slowing down a process. Overthinking can also lead
to over-complicated solutions or talking yourself out of the best solution just because you
felt like your decision could not be right.
You also won't be able to fully digest and understand the current situation you are in
without being able to trust the information and research you have gathered. You need to
have confidence in your abilities to fully assess your situation, which is one of the major
factors required to build an optimized solution.
Your situation matters
An amazing gift in one circumstance can be a complete insult in another. Where you are, what you are giving or doing, and who you are giving it to can greatly alter the value of the gift. The same item in the same place, given to different people, could have a drastically different value, where one is an amazingly heartfelt gift and the other is a very hurtful insult.
For instance, to many people in America or England, getting a new watch can be a wonderful and thoughtful gift, but if you were to give that same gift to someone in China, it could hold an ominous meaning and would be considered insulting. In China, the gift of a watch or a clock can be perceived as telling someone that their time is running out and can be considered a grave insult. This is an example of how vastly important context and situations are.
By putting in the time and effort to fully assess the situation and utilize that knowledge in your solution, you can greatly optimize your results. Through this effort, you can easily discern which options make sense and which ones, such as the watch, are a bad idea.
Although many people mention due diligence as a good measure of the level of investigation you should do before you create a solution, I disagree. I think that is the bare minimum that needs to be done. You need to not only know the situation but also fully understand it to succeed. To do this, you need to go beyond just the surface and dig into what everything is and what it means. With an in-depth understanding, you will be able to find many different things that can greatly affect or alter your initial solutions.
For instance, you could find things such as hidden landmines, roadblocks, and shortcuts that could greatly reduce or increase the level of effort necessary.
The other benefit of understanding the situation is that you learn
places where you can use existing structures or processes to build upon, reduce the
requirements of your solution, and recognize weak points inside your infrastructure that
you can include when you're planning your solution.
Using knowledge of your systems, platforms, policies, and more will give you more ways to solve problems using existing processes. For instance, I once built an
entire email library UI based on using the Marketing Cloud API, cloud pages, and data
extensions, as well as utilizing an in-house data platform and content system to take
on some of the heavy lifting and retain version control beyond just the current version.
Without this knowledge to use the existing data platform and content system, the solution
would not have been nearly as smooth and may not have even been possible due to the
heavy processing required.
Summary
Wow! We made it through the setup and the best practices section! Congratulations!
Through the past few chapters, we have learned so much, starting with talking about
automation and then the best practices in automation for Salesforce Marketing Cloud.
Throughout this chapter, we have learned about many things concerning automation in
Marketing Cloud, including the building-block best practices of automation in Marketing
Cloud; how to create high-quality testing to maximize your solutions; figuring out the
difference between efficiency and performance; learning that best practice does not
always mean the best solution; and that your preference, context, and opinions are
major considerations.
With these topics under our belt, we can confidently close this section of this book.
These topics will guide us in our future endeavors in automation by creating a solid understanding and the building blocks that will form a strong base for our capabilities within Marketing Cloud. From here, we will start investigating automation capabilities inside
Salesforce Marketing Cloud and how best to utilize them.
In the next chapter, we will begin diving deeper into building and utilizing automation
inside Salesforce Marketing Cloud. This includes use cases and examples, as well as
in-depth explanations and analyses. First, we will dive deep into automating email sends
inside Salesforce Marketing Cloud. This topic is a staple in marketing automation tools
and is a good starting point.
Section 2: Optimizing Automation inside of SFMC
Now that we know the basics of automation and how to use automation in SFMC,
we will explore more details on best practices, implementation options, optimizations,
and advanced usages, all within the platform.
This section contains the following chapters:
 • Chapter 4, Automating Email Sends
 • Chapter 5, Automating Your ETL and Data
 • Chapter 6, The Magic of Script Activities
 • Chapter 7, The Power of In-Step APIs
 • Chapter 8, Creating Mini Web Apps
4
Automating Email Sends
Automation best practices are always a great place to start from, and now we have a lot of understanding of these general best practices and how to use them. Even though best practices are not infallible or ultimate solutions, they are great for pointing you in the right direction or for providing a boilerplate solution to build off of.
To build off that strong base, we are now going to start exploring more in-depth and detailed information on specific topics of automation in Marketing Cloud. Our first stop in exploring automation in Marketing Cloud is email messaging automation. In this chapter, we will dive deep into what email messaging automation is and how to use it to your advantage.
Email messaging automation is almost universally the most popular automated activity across all messaging platforms. I mean, who wants to go through the weekly process of setting up a message, validating it is set up correctly, and then sending it over and over again when you can just do it once and then set it and forget it? Certainly not the majority of people, that is for sure.
Our focus is going to be on email in this chapter as it is a staple for messaging software
and the base on which Salesforce Marketing Cloud and ExactTarget were built, but most
of the following principles and actions can be applied to other messaging mediums. In this
chapter, we will explore the following:
 • Email marketing automation overview
 • Real-time versus scheduled
 • 1:1 messaging versus batch sends
 • Analytics-powered insights
 • Considerations regarding email automation
Now, as mentioned previously, there are a couple of places where we dive deep into
theories on messaging automation strategy as well as some general considerations you
should keep in mind while automating your messages. This is because a lot of automation
is contextual, so that can mean subjective opinions and outside influences will affect each
automation and need to be included in your solution.
At the other end of the scale, however, there is also a section talking about the utilization
of analytics and tracking information to provide more scientific insights. Through
a combination of science and art, we usually find effective marketing automation.
As a note, this chapter starts with the assumption that you have some basic background
in email marketing and the capabilities and functionality of Salesforce Marketing
Cloud in relation to email messaging. If you feel you are missing some aspects,
I would recommend checking out Trailhead, by Salesforce, or one of the many blogs
or communities out there to get some background concerning Marketing Cloud.
First, let's dive into the basics of email marketing automation and the benefits it provides.
Email marketing automation overview
Email marketing is one of the oldest and strongest relationship marketing strategies available. With the right cadence, elements, and targets, your campaigns can provide amazing returns on your efforts.
That brings us to the question: what is email marketing automation? It is the act of fully preparing and sending out email campaigns programmatically, based on triggers, categories, and/or schedules that are set by you.
Automating your promotional campaigns is useful as it saves marketers from needing to create and send email jobs each time a need arises. By creating an automated process, you greatly reduce the need for manual action every time an email is ready to be sent, thereby reducing the draw on your staffing resources and the time it takes to get each email campaign created.
Next, we will go into detail about how email automation is very different from email blasts or batch-and-blast methods. These are more one-off email broadcasts that are sent manually to all of your subscribers.
Email marketing automations are more aligned with a smarter, permission-based marketing strategy and have vastly increased performance and return.
The main benefits of email automation are as follows:
 • Reduced manual effort/wasted time: By automating aspects of your process, you reduce the number of tasks that your team needs to accomplish or validate manually, thereby reducing the draw on your team and freeing them up for other tasks.
 • Personalized content: Because of the automation capabilities, you are able to personalize and dynamically fill in each individual email through scripting while retaining the single build-and-send approach, making each email personal without the added effort.
 • Improved scalability of your marketing strategy: As your campaigns grow, so too do the cost and level of effort required to execute them. By automating, you can spend the same time sending an email to 10 people as you would to 10 million. It can also allow you to loop messaging together in linear or journey pathing options.
 • Reduced costs and time to deployment of your campaigns: Through automation, you reduce the level of effort and the number of resources needed to deploy your campaigns. This allows you to save costs on resourcing and reduce the time required to execute the campaign.
Through these benefits, you can greatly increase the ROI of your email campaigns and be
able to take them to the next level.
Now that we have the basics of email marketing automation covered, let's explore the
different opportunities we have for email message automation inside Marketing Cloud.
One of the first places we see multiple opportunities is around the execution type and
timeline of messages. This then begs the question: What is the difference between real-time
and scheduled messaging strategies?
Real-time versus scheduled
As you will likely consider as you read on, this could also be phrased, similar to a section in Chapter 2, SFMC Automation Tools, as Comparing Journey Builder and Automation Studio. But even that is not fully accurate, as there are ways to do both strategies in each tool. For instance, there are ways to do scheduled email sends inside Journey Builder through different Wait Activities and Entry Events, and you can use File Drop automations in Automation Studio, or even Triggered Sends and Transactional Sends (from Email Studio), to perform real-time messaging.
Although both strategies are available in both tools, each tool is explicitly designed to specialize in one or the other. Journey Builder can most definitely do scheduled sends through different entry event or wait options, but it is specifically designed to be a real-time marketing strategy tool. Similarly, Automation Studio has options for real-time delivery/action, but it is most definitely optimized to run with a scheduled or recurring marketing strategy.
Before we dive too much deeper, let's explore what exactly real time is.
Real time
Merriam-Webster's dictionary defines real time as the actual time during which
something takes place. Now, this is great as a general definition, but this is not what
I would say is a real-world definition of real time in relation to email marketing or even
marketing in general. Usually, there is a certain gray area that exists around this definition
as real time is not always immediate. Even if a couple of hours have passed since the
trigger, that can still be considered real time in some contexts. Odd, right?
Well, the difference revolves around the path the person takes to get there inside the
automation. For instance, it could be considered real time because prior to that, it had
to go through a bunch of different data systems and warehouses where it was validated,
cleansed, transformed, and filtered prior to reaching the send activity. So, although the
actual send was delayed, the process was initiated and was run in real time.
Another option is to include a purposeful delay prior to the messaging being released.
I know that for things such as abandoned carts or similar reminder-type messages, there is
usually a well-curated waiting period within the path used so that although they enter the
path in real time, the message is not sent until a pre-determined wait period afterward.
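As a trivial sketch of that idea, the release time of such a delayed message is just the real-time entry point plus the curated wait period. The four-hour abandoned-cart wait below is a hypothetical value, not a recommendation:

```python
from datetime import datetime, timedelta

# Hypothetical sketch: a contact enters the path in real time, but the
# message is held for a pre-determined wait period (for example, a
# four-hour abandoned-cart delay) before it is released.
ABANDONED_CART_WAIT = timedelta(hours=4)

def release_time(trigger_time, wait=ABANDONED_CART_WAIT):
    """Entry is immediate; the send is scheduled for trigger + wait."""
    return trigger_time + wait

entered = datetime(2022, 3, 1, 9, 30)
print(release_time(entered))  # 2022-03-01 13:30:00
```

The entry into the path is still real time; only the send is deliberately deferred.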
Now that we have a good idea of what real time means in relation to email marketing,
let's take a look at learning what options we have in utilizing this strategy.
Real-time messaging options
As I stated earlier, this capability is not limited to just one tool or the other, but there is definitely one that is optimized for this specific type of execution – Journey Builder. Journey Builder is a tool designed to optimize the handling of real-time, 1:1 messaging needs.
Outside of this, there are other options that sit outside of the two tools, such as API-triggered sends and transactional sends. The options that are capable of being real time are the following:
 • Journey email activity
 • Triggered sends
 • Transactional sends
 • Journey Builder transactional journeys
 • Automation Studio Send Email activity
Now that we have the list, which I must admit is not exhaustive, let's dig more into each
option and get a better feel for it. We will start at the top and explore the Journey
email activity.
Journey email activity
Journey Builder, by default, utilizes a form of triggered send as its email activity, meaning it is triggered immediately per individual and not stored and sent later. This setup allows it to be more agile in terms of the speed at which a message is executed, creating fewer hurdles to getting the email out the door.
These messages are triggered by an action that then makes the system send the message out to the corresponding data source. In Journey Builder, the trigger is made via the journey path as it moves through – pushing each record individually onto this activity at the time of activation. This is a slightly different and less technical approach than what you would find in regular triggered sends.
Triggered sends
A triggered send is something that has been around in Marketing Cloud since when it was
still ExactTarget. This was the original way in which the platform could offer real-time
messaging. Triggered sends require an API call to be made that contains all the relevant
information in its payload. This allows it to align the sendable data with the correct email
definition, and so on, so that it can immediately send the email out.
These are very developer-heavy options as there is no UI attached to execute these (as
we are separating out triggered sends from Journey Email activities). So, it requires you
to develop an outside service to make an API call to a specific endpoint with a specific
payload in order to have it execute. This can make it a bit unwieldy for some.
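To make the "specific endpoint with a specific payload" idea concrete, here is a rough sketch in Python of how such a call is often shaped. Treat it as illustrative only: the endpoint path follows Marketing Cloud's REST triggered-send route, but the subdomain, external key, and attribute names below are hypothetical placeholders you would replace with your own.

```python
# Sketch of the kind of API call a triggered send requires. The route is
# SFMC's REST triggered-send endpoint; subdomain, external key, and
# attribute names are hypothetical placeholders.

def build_triggered_send(subdomain, external_key, email, subscriber_key, attributes):
    """Build the URL and payload for one triggered send request."""
    url = (f"https://{subdomain}.rest.marketingcloudapis.com"
           f"/messaging/v1/messageDefinitionSends/key:{external_key}/send")
    payload = {
        "To": {
            "Address": email,
            "SubscriberKey": subscriber_key,
            # These must line up with the triggered send's data extension
            "ContactAttributes": {"SubscriberAttributes": attributes},
        }
    }
    return url, payload

url, payload = build_triggered_send(
    "mc123", "order_confirmation", "jane@example.com", "0012345",
    {"FirstName": "Jane", "OrderID": "A-1001"},
)
# The outside service would then POST this with an OAuth bearer token, e.g.:
# requests.post(url, json=payload, headers={"Authorization": f"Bearer {token}"})
```

The point is simply that every send is its own request: the external key identifies the email definition, and the payload carries the record being sent to.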
Next, there is the transactional send, which is basically the triggered send's
younger brother.
Transactional sends
The transactional send is a newer version of the triggered send. It is basically Marketing
Cloud's response to SMTP (Simple Mail Transfer Protocol)-based email services that are
streamlined for speed of execution and delivery. According to Salesforce, these offer much
faster processing, higher priority, and reduced queueing services compared to
triggered sends.
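As a rough illustration of how a transactional call is shaped, here is a comparable Python sketch. The definition key, contact values, and subdomain are hypothetical, and you should verify the route and fields against your own account before relying on it.

```python
import uuid

# Sketch of a transactional messaging API call (definition key and
# recipient values are hypothetical placeholders).

def build_transactional_send(subdomain, definition_key, contact_key, email, attributes):
    """Build the URL and payload for one transactional message.

    The message key in the URL is caller-supplied, which makes the request
    idempotent: retrying with the same key should not double-send.
    """
    message_key = str(uuid.uuid4())
    url = (f"https://{subdomain}.rest.marketingcloudapis.com"
           f"/messaging/v1/email/messages/{message_key}")
    payload = {
        "definitionKey": definition_key,  # the send definition created via API
        "recipient": {
            "contactKey": contact_key,
            "to": email,
            "attributes": attributes,  # merged into the email personalization
        },
    }
    return url, payload
```

Note that the send definition referenced by `definitionKey` is itself created and managed via the API, which is exactly the setup burden discussed next.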
Despite all this, much of the usage and perceived execution is the same. You still need to
trigger this via an API call and it needs to contain specific information to connect it
to the sendable data and the email definition. After this API call, the process still appears very similar to triggered sends, as it creates a new record in the corresponding
data extension.
 Figure 4.1 – Example of workflow for the transactional email API
The main difference comes in through setup and execution. The email definition and other setup needs have to be done via API, not the UI. This means that you cannot use the UI to identify which existing transactional emails you have created, or which are running or paused. You will need to review all of this, as well as interact with them, via API only. This can be very burdensome and may require building an external tool with its own custom UI so that less technical resources can do this – a potentially significant effort to implement. To make life easier for non-technical resources, Salesforce created the transactional journey.
Journey Builder transactional journeys
Journey Builder transactional journeys basically just use Journey Builder as a UI for the transactional send APIs. They provide all the same results and benefits of transactional sends, but with a UI so non-technical resources can use them.
That does not mean that it is a 1:1 comparison though. There are a lot of capabilities and customizations available via the API that are just not possible inside of the journey. The API also has stronger throughput due to its streamlined approach. The advantage of the journey, outside of the UI, is that tracking and analytics are available on the journey, and these are not easily available via the API version. Next, we move to a more batch-oriented, but still potentially real-time, messaging option.
Automation Studio Send Email activity
Admittedly, nine times out of ten, this activity would be used in a scheduled environment, not real time, but it is possible to use it that way.
For instance, if you use a File Drop automation to drop an audience, this activity will send the email upon dropping/importing, instead of waiting for a scheduled time/date.
You would set this up as a fully defined object prior to creating the automation; for instance, selecting the sendable data extension, the email creative, the subject line, and the sender profile. After this, you just slide this into a File Drop automation after an import to grab the most recent information that was dropped, and then your email will be sent within minutes of the file landing on the SFTP.
 Figure 4.2 – Diagram of each of the pieces of a File Drop Automation email send
Now that we have a pretty good handle on real time and the messaging options associated
with it, let's explore what exactly scheduled means.
Scheduled
Merriam Webster defines a schedule as a procedural plan that indicates the time and sequence of each operation – essentially saying that something is set aside until it hits a pre-planned time and date, at which point it is executed and sent. Now, this might sound similar to what
we mentioned earlier about real time, but it is actually quite different.
For instance, the idea of real-time pathing with a waiting period can sound like scheduled
messaging, but it is not. It is different because each message is entered in real time, and
therefore, the time the message is actually sent out is different; only the waiting period on
each is the same. In a scheduled send, the audience is usually created prior to the send and
the email is usually sent in batches.
You may recall that I mentioned in the beginning that scheduled messaging was possible
with a Wait activity, but now I am saying that it isn't possible? Huh? How can it be both?
Well, if you set a specific date for everything to be released, then it is scheduled, but just
putting a waiting period of 2 days would just be a delay in a real-time message, not
a scheduled send.
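The distinction above can be sketched in a few lines of Python (all the dates here are made up):

```python
from datetime import datetime, timedelta

# A Wait activity delays each record relative to when it entered the
# journey, so send times differ per record. A scheduled send fires
# everyone at one pre-planned absolute time. Dates are hypothetical.
entries = [datetime(2022, 10, 1, 9, 0), datetime(2022, 10, 1, 14, 30)]

wait_period = timedelta(days=2)
delayed_sends = [entered + wait_period for entered in entries]  # per-record times

scheduled_send = datetime(2022, 10, 5, 8, 0)
scheduled_sends = [scheduled_send for _ in entries]  # one time for everyone
```

Same two-day gap in the first case, yet every subscriber gets their own send time; in the second, entry time is irrelevant and everyone releases together.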
Something scheduled is also usually a more repeatable task, similar to that weekly email
send example at the beginning of the chapter. Sure, there are times that the email might
have changes to the content and sure, the audience might have some subscription changes,
but in general, it is nearly identical from run to run. This consistency and repetition are
the major key features and defining factors in scheduled messaging.
Scheduled messaging options
Similar to real time, this is not limited to just one tool or the other, but there is definitely one that is optimized for this specific type of execution – Automation Studio. Automation Studio was built to provide automated solutions for repeatable, procedural tasks, which, as you can see, is very much the definition of scheduled.
Each option capable of being scheduled is as follows:
 • Journey Builder email activity
 • Triggered sends/Transactional sends (kind of)
 • Email Studio Send Wizard
 • User-initiated send definitions
This list, much like the real-time one, is not exhaustive, so you may find other possible
options that can be used as scheduled. This is just a list of the majority of the options
and the ones that are easiest to implement.
Journey Builder email activity
There are three major ways in which you can accomplish scheduled messaging with the Journey Builder email activity. Well, to be fair, there are other ways, but they are far outside of best practice and mainstream needs. The first one we will be going over is utilizing an automation from Automation Studio as the entry point.
This entry source is actually part of the Data Extension Entry Source. Inside this entry source, you would have three choices, one of which is Automation. Upon selecting this, contacts will be admitted based on the schedule of the automation from Automation Studio. This automation would require an import, filter, or query activity so that it can correctly adjust the data prior to execution. You would also need to establish whether you want this to evaluate new records only or all records each time.
The data extension entry source using automation is very similar to utilizing a Send Email activity in Automation Studio, which we will go over later in the chapter. Next, I am going to focus on the same entry source, but in a different way.
Sticking with the Data Extension Entry Source, instead of Automation, we will now explore the Recurring option. This can be used to set a scheduled send by including a column with a send date in it. From there, you would just set the entry filter to be something like SendDate equals Today. As the entry is recurring, you can later update this data and the emails can send again at a later date. This is especially useful for dynamic weekly emails. Next up, we will be exploring an option that is a tad more out of the box.
Basically, you would have everything ingested into the journey through the normal real-time entry point (API, Data Extension, and others) but, prior to any email messaging send, you have a wait event.
This wait event is a specific rolling range date/time so that it will only send all those queued in the journey during that specifically scheduled time period.
Now there are two ways you can do this. You can have a wait period where you manually set the specific date and time on each send via a Wait until Date activity. Or, with a bit of fancy footwork, you can make this date dynamic by using a data extension to hold the date of sending. You would map that data extension to an attribute group used by that journey and then update the date in that data extension to the next time you want the email sent, after each send.
The cool thing about this approach is that if you use contact data for this Wait By
Attribute activity, whatever date is the value when the subscriber hits that activity is the
date used. So, if you need to change it a bunch of times throughout the journey, up until
they hit the Wait By Attribute activity, those other values won't matter. That being said
though, once it hits that activity and begins the queue/waiting period, it can no longer be
changed or adjusted even if you alter the value in the data extension. Basically, what this
means is that when a subscriber enters the journey, they might have October 11th as the
date value in the contact data that is used by the Wait By Attribute step, but they are still
around 3 days away from hitting that step. Inside those 3 days, the date they want to send
this out has been pushed back a couple of times and now, as it enters the step, it is the date
of October 24. As it has now entered the queue for the Wait By Attribute step though,
no changes to the contact data in the data extension will affect the date in this step.
 Wait By Attribute and Contact Data Notes
 I know that the previous statement may be hard to follow abstractly, so I wanted to give a bit more of an explanation of what I mean.
 Contact data: Data that is pulled live from the contact object in Marketing Cloud, unlike Journey Data, which is a snapshot at entry.
 Wait By Attribute: An activity inside Journey Builder that will wait until the criteria you set around a specific attribute in the associated data are met.
 So, when using a Wait By Attribute activity where the attribute is associated with contact data, the value can change multiple times prior to when the subscriber hits that activity. This will adjust the schedule of the email send, right up until the Wait By Attribute activity. Once it enters the activity, the data value is then locked in and cannot be adjusted.
This is a roundabout way to turn a real-time tool into a scheduled automation tool.
Not exactly best practice, nor does it make a strong case to use this instead of a more
optimized tool. The great part about this though is that Marketing Cloud gives you
options to follow defined best practices or to customize the solution to meet your specific
business needs.
Next, we will explore another unconventional way of turning something real time into
a scheduled process. This one is a bit more straightforward though, but only by a little bit.
Triggered sends/transactional sends (kind of)
I will be honest; this one is not innovative, or elegant, or in any way creative. It is simply taking one thing and putting it into another and, through that, controlling how the first thing behaves. Not exactly rocket science, and it can be a bit clunky.
Basically, you take a Script activity from Automation Studio and, in that activity, you write a script that iterates through your source sendable data and makes an API call to either the trigger or transactional endpoint once for each record.
Let's break it down:
 • You would create a Triggered Send Definition email inside of Marketing Cloud.
This would include an email, sending profiles, and an attached data extension.
• You would create a Script activity in Automation Studio.
• Inside this Script activity, you would use SSJS to take the data in the source of your
choice and push that, via the Core Library, an API, or WSProxy, to the Triggered
Send Definition, sending the email.
 • You then take that Script activity and push it to a scheduled automation.
This then means that although the trigger or transactional email will be running in real
time, it will only be running in real time at specific times. Now, after you run it, it likely
would make sense to clear or otherwise flag those you have already sent to and ensure you
are not spamming every time the automation runs.
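Sketched in Python for readability (inside the platform this would be SSJS using the Core Library, an API call, or WSProxy, as described in the steps above), the pattern, including that sent flag, looks roughly like this; `fire_triggered_send` is a stand-in for whichever call mechanism you choose:

```python
# The Script activity pattern from the steps above, sketched in Python
# rather than the SSJS you would actually write in Automation Studio.
# fire_triggered_send stands in for the Core Library / API / WSProxy call.

def run_scheduled_trigger_batch(rows, fire_triggered_send):
    """Make one triggered-send call per unsent record, then flag it.

    Flagging rows already sent to is what stops a recurring automation
    from spamming the same records on every run.
    """
    sent_count = 0
    for row in rows:
        if row.get("Sent"):              # already handled on a previous run
            continue
        fire_triggered_send(row)         # one API call per record
        row["Sent"] = True               # flag so the next run skips it
        sent_count += 1
    return sent_count

audience = [
    {"Email": "a@example.com", "Sent": False},
    {"Email": "b@example.com", "Sent": True},  # sent on a previous run
]
calls = []
run_scheduled_trigger_batch(audience, calls.append)  # only the unsent record fires
```

The email itself is still real time once the call fires; the schedule only controls when the loop runs.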
OK, I promise most of the workaround or custom solutions are done now and we can
focus more on best practice usage. Now the focus is mostly on Automation Studio and
Email Studio scheduled email capabilities.
Automation Studio Send Email activity
As far as scheduled automated sends go, this is pretty much what you would see if you looked it up in a dictionary. Before you go grab a dictionary, I can promise it is not actually there; I was just speaking figuratively to make a point. Although, how cool would that be? Open up a dictionary and then bam, right there on the page is an image of the Send Email activity from Salesforce Marketing Cloud Automation Studio!
Alright, back to the activity. Basically, this is a defined object that holds each of
the related objects required for an email to send. You can reference Figure 4.2 or see the
following list for this information:
 • Email creative
• Audience
• Subject line/Preheader
• Sender profile
• Delivery profile
• Sender policy
 • Exclusion scripts
Each of these objects would be combined into a single object that would be the Email
Send Definition. This definition would then be pulled in via the Send Email activity,
which is activated when the automation runs on its set schedule.
As I said, this is the epitome of automated scheduled email messaging sends. Now, if you need it all together as an object, but don't want to create an automation and would rather schedule it out yourself ad hoc, then you need to check the next part out.
User-initiated send definitions
So, if you go into the Interactions tab in Email Studio, you will notice something called User-Initiated Sends. This is basically the exact thing listed earlier, but prior to the creation of the Send Email activity.
It is the combination of each of the required objects into a single definition that you can then interact with to edit or execute the send. This definition can then be pushed into Automation Studio via a Send Email activity, or it can be manually scheduled. This manual scheduling is very similar to scheduling an email via the send wizard in Email Studio, in that you set a time and day and then it creates a send job.
To note though, if you schedule it this way instead of via Automation Studio, Marketing Cloud will take a snapshot of the email creative at that point in time and use that for sending. The email creative will not update (outside data-dependent personalization) on any changes made to the creative.
 What is a Snapshot?
Inside of Salesforce Marketing Cloud, there is an occurrence that happens
when an email is published (Triggered Send Definition or Journey) or an
email job is scheduled (Email Wizard, Simple Send, or Schedule an Email Send
Definition), or otherwise enters the send queue.
Basically, what happens is that the email code and content are compiled at
a general level (excluding anything related to data) and are then stored inside
the job object. This means that after you have scheduled or published the
email, any content changes you do inside of Content Builder or Email Studio
will be ignored. You will need to either cancel the send and reschedule, pause,
republish, and restart the triggers, or edit and save inside Journey Builder to get
the new content.
For scheduled sends, you can get around this by utilizing a scheduled
automation with an Email Send Definition activity inside of Automation
Studio, as scheduling the automation does not snapshot the email.
The data (sendable and relational) will not be locked in, however, until it hits the send
queue, also known as at time of send. This means you can adjust your data easily to
where you want it prior to it sending out. This comes with the caveat that if you are using
throttling and mess with the data after the initial send queue, you can cause the rest of the
send to error out.
I usually recommend pushing all scheduled sends into Automation Studio as a best
practice, as it gives you the most options and flexibility and removes most of the issues
with this snapshot. Next, we move on to another platform sending solution – the
send wizard.
Email Studio Send Wizard
This one basically combines the creation of the Email Send Definition with scheduling into one wizard-led setup. The wizard guides you through each aspect, attaching the correct objects you need for the email definition, and then, at the end, asks you to define the date/time you want it sent at.
It is a pretty simple capability and one that is regularly used by marketers, but I still recommend utilizing Automation Studio's Send Email activity for the same reason as with user-initiated send definitions – the snapshot of email content. So many options and so many possibilities...what is the best choice?
Real time or scheduled – which is the right choice?
This question depends on what the top priority is for you. Do you need the message to be out as soon as possible, right up in the subscriber's face, or is it better for this to be a more planned and repeatable process?
For instance, if you have a monthly newsletter that you send out, it would make much more sense to have this be a scheduled send, rather than real time. Setting a recurring schedule where there are minimal updates to content and data lets you hit that optimal send time and the burst capabilities that real time cannot offer.
Whereas, if it's a receipt or shipping notification email, the amount of time before the email is sent is hugely important. In that case, you need to have it be real time. Scheduling it to be sent out every 30 minutes is not a viable option, as most people will expect these emails within seconds of finalizing the action/transaction.
There are also a ton of places where this choice honestly doesn't even matter – where whether it's scheduled or real time will have little to no significant impact on the success or failure of the campaign.
So, although this can be a majorly important choice, it can also be fairly insignificant in relation to other options.
Realizing there are other questions leads us into a very similar, but also very different, option. Should you do custom messages in a 1:1 context, or are dynamic, personalized batch sends a better choice?
1:1 messaging versus batch sends
Very similar to real time versus scheduled, 1:1 messaging versus batch sends is a very contextual choice. Usually, both of these options correlate together, so if real time is needed, it usually means that it is best sent as a 1:1 message, while scheduled sends tend to be better as batch sends.
Before digging too much further into this, we should dig into exactly what each of these sends is and how they are used. Let's start by taking a look at 1:1 messaging.
1:1 messaging
As defined by Merriam Webster's dictionary, 1:1 means pairing each element of a set
uniquely with an element of another set, and messaging is defined as a communication in
writing, in speech, or by signals. Combining these two, you get a definition of 1:1 messaging
as a communication that pairs each element of itself to an element in a dataset to ensure
a fully personalized experience.
elements that make up Journey Builder. To that extent, most of your needs relating to 1:1
messaging will be handled inside of Journey Builder. As stated previously, Journey Builder
is built upon the triggered send definitions of Email Studio, so there are most definitely
other options out there to do 1:1 messaging outside of Journey Builder.","1:1 messaging options","Now, the following list is going to be strikingly familiar to the one you just read through","from the discussion regarding real-time sends. This is because 1:1 messaging options","are nearly identical to the scheduled messaging options, just with a twist.. There is one","major exclusion from this though in that the Automation Studio Send Email activity is","not included:","Although this list seems smaller, the multiple facets of each of these options are
significant. As we have already gone over the basics of these earlier, next we will just dive
into how each of these relate to 1:1 messaging.","This is the easiest, least technical, option you have for 1:1 messaging. Here you can set","up the entire message inside the UI, including the email, the data, and the different","segmentation and splits associated with it. This can be utilized inside a multi-step journey","or a single send journey depending on your needs, but the behavior of the activity is the","same in each context.","\f118 Automating Email Sends","Triggered sends
Previously, I would have said that triggered sends were the most popular way to do 1:1
messaging in real time. Journey Builder's upgrades have taken it to a level where the
ease of use and simplicity have led to it taking the top spot. The following is a quick
workflow example:
 Figure 4.3 – Representation of the triggered send workflow
They require a bit more technical knowledge, but outside of that are pretty easy to
implement and, depending on your needs, they may not require the level of setup that
a journey does by default.
Transactional sends
This is basically a more efficient version of the triggered send. There are a lot more
technical hurdles to it as it requires most of the setup and tracking to be done via the API,
hence it can be more restrictive and require much more work outside of the platform.
To this extent though, it offers things such as webhooks to check the status and tracking
information on individual sends in real time as well as a ton of other very customizable
options to help make your transactional emails as efficient and effective as possible. Here
is a good visual taken directly from the Marketing Cloud website displaying the steps and
capabilities of the transactional messaging API:
 Figure 4.4 – Representation of the Transactional API capabilities
To help make life easier for those who want to utilize transactional messaging, but do not have the technical skills or resources to set up the APIs and automation necessary for it, Marketing Cloud created a new Journey type that combines the two.
Journey Builder transactional journeys
This still requires some technical skills to work, as you need to have an API event to initiate it, but it allows you to build out almost every other aspect within the UI, making the process much easier for the average marketer. The following is a brief sample of what a transactional journey looks like in Marketing Cloud:
 Figure 4.5 – Sample of transactional journey in Journey Builder
Next up, we will dig into the unexpected 1:1 capability of Automation Studio and Email Studio. These two studios are usually associated more with batch-type email sends. Now that we know about 1:1 messaging, let's explore the alternative – batch messaging.
Batch messaging
This type of messaging has been used in email marketing for years and is usually what people think of when they think about marketing emails. Batch, as defined by Merriam
Webster, is the quantity produced at one operation. Basically, this means that through the
one send, we are producing a larger volume rather than a single message.
A common misconception with batch messaging is the old adage of Batch and Blast. This
phrase came about years ago to explain the shotgun approach email marketers used to
take where they would create a single message and then, quite literally, blast it to each and
every person they knew the email address of. It was invasive, annoying, and caused major
disgruntlement with everyone.
This type of behavior is what brought about the legal movements to protect email address
owners, such as the CAN-SPAM Act. Through this, email marketers had to realize that
email was not just another form of junk mail and that they had to actually spend the time
to build the medium out as a unique marketing channel.
Now, batch messaging done wrong can still be ineffective and invasive, but if it is done
right, it can be invaluable and a highly efficient and faster way to get your message out
there. Batch messaging is more aligned with a repetitive or scheduled messaging strategy,
along with things such as reminder emails, newsletters, weekly promotions, and others.
It can also help get an urgent mass message or deal out to your audience.
Batch messaging options
So, due to the recent concentration on 1:1 messaging as the preferred method and the simplicity of what is needed for it, batch messaging options are fairly limited. In the following, you will see my list of the three current options for sending batch messaging:
 • Automation Studio Send Email activity
 • User-initiated send definitions
 • Email Studio Send Wizard
One major thing to note is that I left off Journey Builder email activity from this list even
though batch-esque options are now available in Journey Builder. Next, I will dig into why
I left it off.
Why not Journey Builder email activity?
With the creation of the single send journey and the capability of utilizing automations and data extensions for entry sources, Journey Builder allows you to do batch entry into a journey. The issue is that, as the system was not designed for this purpose, what it is actually doing is just processing each record in a 1:1 environment, one after the other. This is not actually a batch send, so I did not include it as an option.
That being said, if you are not as into semantics as I am, it is still a very viable option to send your message out in a batch context; it just isn't technically a batch send, only a batch entry. The only other thing to consider around this is the throughput we explored in the previous chapters. As a refresher, Journey Builder has a more limited batching capability, limited to its throughput per hour. As it is built to do 1:1 messaging, the speed at which it can send is affected. This has been a major focus of Marketing Cloud and they have made great strides forward in bridging this gap, but it is still worth noting if you are looking to send a high volume of messages.
Automation Studio and Email Studio
So, the three messaging options are actually nearly identical. As most of what is involved and required was stated in the Scheduled section, I am going to combine these three into a single section to discuss. The following is a sample of what the send screen looks like inside the Send Wizard in Email Studio:
 Figure 4.6 – Example view of the Email Studio Send Wizard
much over the years; it has just been optimized and streamlined. You still need to take
an email and the related elements/objects and attach it to data that is then sent in batches
to a Mail Transfer Agent (MTA), which then pushes it to the send queue where it is then
sent out to the corresponding email address.","How does batch now differ from the 'bad batch' of yesteryear?
So, the previous batch philosophy was to get the same idea out to as many people as
possible as quickly as possible. With email marketing being so cheap, there was no need to
put significant effort into strategy or optimization since, if you got even just one person to
buy at times, it paid for the whole campaign right there. This cheap cost is how the return on investment (ROI) of email marketing usually sits at something crazy, like 4200%, which means you get $42 back per $1 you spend, as per the DMA in 2019: https://dma.org.uk/uploads/misc/marketers-email-tracker-2019.pdf.
Although this worked for a while, like I said, new restrictions and laws were implemented
to help save the medium from what would potentially be the end of its use due to the
frustration and overwhelming amount of spam that would come from utilizing email.
These laws actually had an amazing effect on marketing in the medium since, instead
of people being lazy and just copying their direct mail pieces into a digital format and
blasting it out, they had to actually strategize and build campaigns that were specialized
for the medium and the audience in that medium.
This is what saved email marketing from dying, although at first, the extra effort made
people believe that the ROI would decrease and that the channel would die out. But with
the increased relevancy and targeting, the ROI gathered actually stayed the same and,
in some cases, actually increased due to the more positive customer perception of the
channel. This is because when email marketing is done right, it is highly effective and has
many easy lead-ins to a call to action that most other mediums do not have.
So, what made that difference? Marketers realized that sending to people who actually wanted to hear their message, instead of sending messages out for people to delete or to associate a bad connotation with their name, greatly increased efficiency. And so, from this, permission-based marketing was born.
Although batch sends are usually not personalized to the level of 1:1 messaging, it is best
practice to include at least some custom aspects in them and show the subscriber that you
know who they are and respect their preferences.
Batch or 1:1? Why not both?
As noted earlier, there are good rationales for each option, and even the opportunity to combine both. Simply put, the following are good general rules of thumb to follow in deciding what to use:
 • 1:1 messaging: Use this when you want an individual to do a specific action that
is unique to them or send a unique message based on data to push them to
a specific action.
• Batch messaging: Use this when you have a very specific message you want to send
out to a large audience. It is very intuitively aligned with scheduled automation.
• 1:1 batch messaging: Use this when you want to give unique messaging to multiple
subscribers based on data and preferences but need them all out at the same time.
It is more aligned to being used with scheduled automation, but is also possible via
real-time sends.
Next, I want to jump into how to utilize analytics and tracking data to level up your email
messaging and strategies.
Analytics-powered insights
Any marketing campaign worth its salt is based on a ton of data and analytics. Analytics
is based on the action of analysis, which, as defined by Merriam-Webster, means a detailed
examination of anything complex in order to understand its nature or to determine
its essential features.
Essentially, that means the marketer takes all the data and insights and applies those to
the different pieces of the campaign, and works to maximize and optimize it for the best
effect. Although this adds a lot more work and effort to each campaign you send, the
results and added benefits far outweigh that cost.
To that extent, I wanted to bring your attention to where we can grab analytics from in
Marketing Cloud.

Analytics in Marketing Cloud
Most analytics in Marketing Cloud come from external sources or via built-in reporting.
Due to some recent acquisitions at the time of writing, these built-in capabilities now
include some highly valuable analytics platforms. Some of the platforms I list here are
not exclusively analytics tools but can offer insights and relevant data on the subscriber
to help or support analytics.
\f Analytics-powered insights 125
Here are some examples of Salesforce-owned and highly popular options for
Marketing Cloud:
 • Tableau
• Datorama
• Analytics Builder
• Interaction Studio
• Salesforce CDP
• Customer 360
 • Google Analytics Connector
Rather than go into each of these options, I am going to explore how best to use the
built-in data and analytics capabilities to power up your email marketing.

Tracking data
Ever wonder which email version has the most conversions or is the most popular? When
do people usually open or interact with your emails? Inside Marketing Cloud,
you can use the data views inside Email Studio, Tracking Extract options, and even some
native reports to gather all this information.

Data views
Data views are available for the user to interact with via SQL in Automation Studio
or via Measures in Email Studio Filters. I would never recommend Measures since,
in my experience, they are unreliable and very confusing. Avoid these if possible.
Data views offer information on many different aspects. Much of the focus of this is
centered on email and subscriber interaction with email sends, but there are other options,
such as SMS and push, that are unrelated to email. Here are a few examples of data view
tables available to query:
 • Sent (_Sent)
• Open (_Open)
• Bounce (_Bounce)
• Click (_Click)
• Unsubscribe (_Unsubscribe)
• Complaint (_Complaint)
 • Business Unit Unsubscribe (_BusinessUnitUnsubscribe)
 • Job (_Job)
 • Subscribers (_Subscribers)
 • List Subscribers (_ListSubscribers)
 • Enterprise Attribute (_EnterpriseAttribute)
 • Journey (_Journey)
 • Journey Activity (_JourneyActivity)
As you can see from these options, there are a lot of different relational points of
information you can use to build your own custom reporting and analytical discoveries.
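For instance, here is a hedged sketch of a simple data view query you could run in a SQL Query activity. The data view and field names shown are the standard ones, but the target data extension (with SubscriberKey, EmailName, and OpenDate fields) is an assumption you would need to create yourself:

```sql
/* Unique opens from the past 7 days, joined to the send job for the email name.
   Assumes a target data extension with SubscriberKey, EmailName, and OpenDate
   fields; adjust to your own schema before use. */
SELECT o.SubscriberKey,
       j.EmailName,
       o.EventDate AS OpenDate
FROM _Open o
INNER JOIN _Job j
    ON o.JobID = j.JobID
WHERE o.IsUnique = 1
  AND o.EventDate > DATEADD(DAY, -7, GETUTCDATE())
```

You would save this as a SQL Query activity with your data extension as the target and an appropriate data action, such as Overwrite.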
This option requires a lot more technical knowledge and outside interaction to be able
to find the required information you need. A less customized and detailed option is the
built-in reporting and tracking capabilities of Marketing Cloud.

Built-in reporting and tracking
Inside Marketing Cloud, there is a studio dedicated to this exact thing – Analytics
Builder. This suite of tools offers insights into both Datorama Reports and Web
and Mobile Analytics. Previously, there were some other built-in reports called Discover
reports, but these have been discontinued and are either no longer in use or only available
for a limited time.
As we are focusing on email, the part we will concentrate on is Datorama
Reports. Although Datorama reporting is built in, there are tiers to the level
of capability this reporting can offer. The basic reports provide pretty strong insights
similar to what was available in Discover reports, but the higher the tier you pay for, the
more information and insights you can get. The final tier of this offer is basically a full
implementation of Datorama integration with Marketing Cloud. The following is a sample
of the Datorama Reports dashboard:
 Figure 4.7 – Example view of the Datorama Reports overview screen
Datorama Reports will give you access to analyze your data and receive intuitive
dashboard views to visualize your aggregated data. These dashboards and reports
can be interacted with and manipulated via elements such as filters, sorting, summation,
and organization.
The next built-in part is the Tracking tab inside Email Studio. This tab offers a few
different options, but honestly, the only one to pay attention to is the My Tracking section,
as the rest are more legacy options and will likely disappear in the near future. The
tracking section holds a list of all the send jobs you have pushed out and all the tracking
data (opens, clicks, and bounces) that are associated with that job.
Refer to the following
screenshot for an example of this in the UI:
 Figure 4.8 – Example of the Tracking tab inside Salesforce Marketing Cloud
Then, when you open up a specific send job on this screen, you will be taken
to a dashboard like this:
 Figure 4.9 – Tracking information for a send job inside the Tracking tab
This offers a ton of information aggregated together for a simple review of a single send
job. It is great for a specific review, but when looking beyond a single send context, this is
very limiting and could require a ton of manual effort to utilize.
Next up are the Journey Analytics capabilities. There are some limitations to these,
including a limited date range (around 90 days) and no way to drill down beyond the
high-level engagement metrics. This includes total sent, delivery rate, unique opens,
unique clicks, and unsubscribes only. This is good for a quick overview, but is only
designed for the overall journey and will not display per email.
To view an email, you can open the journey and click on the email you want to view and
there will be a more focused version called Email Analytics that gives essentially the same
info as you saw in Journey Analytics, but only in relation to that single email.
If you happen to already own or have an existing analytics platform and do not need to
use anything built in or purchase anything new, you will need to find a way to push all this
data into the platform. This is where Tracking Extracts come into play.

Tracking Extracts
To get the information from Marketing Cloud to your external analytics tool, you will
need to extract it from the platform and then push it to a location that can allow
that tool to ingest it. The most popular way to do this is by using Marketing Cloud
Tracking Extracts.
\f Considerations regarding email automation 129
This is a built-in export process that will output a ZIP file containing CSV files. In turn,
these CSV files hold all the relational data you requested when you
built the extract. The ZIP file can be pushed to any file location you have listed in your
file locations in Marketing Cloud, allowing you to push it to an external FTP for your
analytics tool to ingest.
The available information includes everything you see inside of the data views, as well
as a few other options such as NotSent and Impression region tracking. There is a lot of
information here that you can only get through the SOAP API objects or that you might
not even be able to find anywhere else.
Now that we have all the information and background on automating your email sends,
I wanted to share some of my personal experiences and considerations on it. I hope that
this information will give some context to allow you to relate this information to your
situations and allow you to better utilize it.

Considerations regarding email automation
In this section, I wanted to share some considerations when using email automation that
may help give some insights and context to help you in the future:
 • When creating an automation, you should make the email content as dynamic as
possible (pulling in via AMPscript or lookups or referencing other blocks) to allow
for easy changes, updates, and implementation of new content. Do note that with
Journey Emails, Triggered Emails, and Transactional Emails, you will need to pause
and republish the trigger in order for it to work.
• Along with the email automation, you should utilize either an import process
or SQL queries to update and prepare the sendable data as part of the same
automation or prior to entry at the least. Relational data that is used should be
updated prior to sending and made sure to be completed beforehand as well to keep
it as up to date as possible.
• Creating a journey for drip campaigns or other series of emails can be an amazing
asset and help create great customer experience, engagement, and retention, but,
along with that, you cannot just set it and forget it; you need to constantly monitor
it and update or maintain it as necessary. You need to collect as much tracking from
each of your automations as you can to help fuel future strategies and ensure your
campaigns are optimized.
 • Including coupons and discounts inside of automated emails will require careful
monitoring to ensure that you do not run out of coupons and that it is giving the
correct information to ensure customer satisfaction. This is especially true if you are
using the ClaimRow() function in AMPscript.
• For sends that are required at a specific time, doing a batch send is optimal. But,
when the amount of time that is required between a trigger and message execution
is minimal, then you need to utilize a more real-time method.
• The more emails you automate, the more you need to also consider how many times
you are contacting your audience. You will want to insert a process to ensure you
are not accidentally sending too many emails to your audience, causing them to
unsubscribe or send spam complaints about you.
• Ensure the data you collect and use in your automation is clean, accurate, and
correctly prepared. If you were to collect data based on user input that was not
cleaned or properly validated and use that for your sends, you could be accidentally
sending out malformed or incorrect emails with the possibility of even having
the whole automation error out. This is aligned with the garbage in, garbage
out philosophy.
• You will want to have a strong preference and subscription center for your
customers to work with to ensure that they can control which publications they feel
are appropriate and also to help ensure you engage with them in a pleasing manner
and not force them to unsubscribe, removing them from your marketing.
• Automated emails are not just handy for upselling, but also for cross-selling,
opt-downs, opt-across, and similar actions. The more flexible and interactive you
make your strategy and automations, the more likely you are to find the product
or service that the customer wants, thereby enabling better ROI.
• Marketing Cloud is not designed to do heavy tasks. So, when using a higher
processing draw and heavier automation, it can lead to slowdowns or time-outs
in your instance. You will need to ensure that you keep this in mind as you build
this and ensure that you utilize your automated processes efficiently in order
to be effective.","There is a lot more that could be written here, but the preceding should help get you to
a great place. Many of the best considerations are gathered through personal trial and
error. I would highly recommend that you take the previous points as a basis and use that
to test and explore the limits of your own automations and strategies.
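As one illustration of the contact-frequency point above, here is a hedged sketch of a suppression query built on the _Sent data view. The threshold of five sends and the idea of writing the result to an exclusion data extension are examples of mine, not a platform standard:

```sql
/* Subscribers who received 5 or more sends in the past 7 days.
   Target this at a suppression/exclusion data extension so downstream
   sends can exclude over-contacted subscribers. */
SELECT s.SubscriberKey,
       COUNT(s.JobID) AS SendsLast7Days
FROM _Sent s
WHERE s.EventDate > DATEADD(DAY, -7, GETUTCDATE())
GROUP BY s.SubscriberKey
HAVING COUNT(s.JobID) >= 5
```

Running a query like this on a schedule, ahead of your sends, gives you a simple automated frequency check.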
\f Summary 131
 Garbage in, garbage out
 Basically, this phrase means that you receive something of the same quality
 as the time, effort, and materials you put into it. So, if you throw garbage into
 a machine, no matter how elegant and awesome that machine is, what comes
 out will still be low quality.

Summary
Email automation is one of the core aspects of all Marketing Cloud automation and one of
its strongest capabilities. After reading this chapter, you should have a strong knowledge
of how email marketing automation works in Salesforce Marketing Cloud, as well as some
in-depth information and insights into it.
You should now be able to tell when to use real time instead of a schedule
and vice versa, as well as when to use 1:1 versus batch. Now, an aspect that you have also learned,
although not explicitly set out as a section, is the difference between real-time messaging
and 1:1 messaging, as well as the difference between scheduled messages and batch sends.
At times, these can seem synonymous, but as you now know, they are different.
Now, knowing the messaging types and best uses is a huge part of email automation, but
the other major aspect is data and analytics. As you have now learned, a campaign without
analytics and tracking is like going on a blind date while actually wearing a blindfold,
meaning that you might luck out and do well, but since you cannot see anything, it could
go terribly wrong very quickly and easily. On top of all that, we also went over some of my
personal tips and tricks to keep in mind regarding email automation. These considerations
are meant to help to contribute to a strong understanding of email automation in
Salesforce Marketing Cloud and the best utilization of the available tools within it.
With the knowledge gained here, you should be able to increase your marketing
capabilities with regard to email marketing by quite a bit inside Marketing Cloud. Email
marketing automation can be easily applied in many different situations and prove
extremely useful, especially in the forward-facing thoughts of building scalable solutions.
In the next chapter, we will be digging into the meat and potatoes of automation: the
automation of your ETL and data processes. These processes are key to providing a strong
and solid base for all your marketing needs and ensuring that your strategies are possible.
Data automation is essentially the bread and butter of what automation is made for.
We will be discussing ETL processes via SQL Query Activities, Import Activities, Export
Activities, Filter Activities, and more for segmentation, exclusion, and data staging.
\f 5
Automating Your
ETL and Data
Emails and other messaging options are cool and all, but without data, it's really nothing
more than fancy words and pretty pictures. Data is integral when it comes to marketing.
Without data, we do not have an audience to send to nor any idea who or what we should
send. Although messaging and email automation may be the more popular aspects, this is
mostly because no one thinks about the data part as it is not as fancy or shiny as the cool
subject lines, emojis, images, or designs.
To that extent, in this chapter, we will be exploring all things related to automating data
in Salesforce Marketing Cloud, with a focus on ETL and Automation Studio. We will dive
into topics surrounding data, including introductions into what ETL means as well as
specific activities inside Marketing Cloud that can allow us to automate our ETL and data
processes. The following is a quick list of the topics that we will discuss:
 • What is ETL?: An exploration into what ETL is and how it's used in conjunction
with automation in Marketing Cloud.
• Activities for data in Automation Studio: An exploration into the automation
activities available for data manipulation, importing, and suchlike inside Marketing
Cloud Automation Studio.
• SFMC SQL Query activities: Here, we dive into the activities inside Marketing Cloud
that utilize SQL and some best practices regarding their use.
\f134 Automating Your ETL and Data
 • Filter activities and data segmentation tools: We move forward into the UI drag and
drop tools and capabilities centered around segmentation that require little to no
technical knowledge.
• Importing data options: A dive into the options available to bring data inside the
Salesforce Marketing Cloud platform.
• Exporting data options: An exploration of all the different methods and possibilities
in terms of extracting or exporting data from Marketing Cloud.
You will remember that ETL (Extract, Transform, Load) was brought up before in
Chapter 2, SFMC Automation Tools, in reference to Automation Studio. This provided
a great basic overview of what ETL is and how it's used in Automation Studio, but
this chapter will focus on ETL and data in general. I wanted to give a deeper dive into
exactly what ETL is with the aim of helping to provide a strong basis in terms of data
management, manipulation, and storage inside the Marketing Cloud.

What is ETL?
ETL has its roots in the rise of central data repositories. With the dawn of data warehouses
around the 1990s, tools began being made specifically focused on extracting data from
siloed systems, transforming it into the destination format, and then loading it into
the new destination (or ETL). Over the years, ETL has grown to become stronger and
stronger with the increase in demands during the data age of marketing.
ETL tools typically do all three of the steps and are a critical part of ensuring that data is
prepped completely and accurately for things such as reporting, analytics, and other
data-driven actions, including machine learning. The following is a basic definition of
each of the three steps in ETL:
 • Extract: Retrieving and gathering data from siloed or individual data sources
• Transform: Manipulating and altering data to fit inside the proper format/structure
• Load: Taking the final data and pushing it into the target system, database,
or data mart
Through ETL, you can take data from one environment and transform it to better
match the existing formatting and structure of your other data for easier use, reporting,
or analysis. ETL also makes it possible to migrate data between a variety of sources,
destinations, and analysis tools. This makes it a very critical aspect in producing business
intelligence and in the execution of broader data management strategies. Let's go into
a little more detail on each of these three steps, starting with extract.
\f What is ETL? 135","Extract
The first step is to extract or retrieve the data from the source. Although this is certainly
not the most glorious or flashy part, it is most definitely the most important aspect of
ETL. Why? Well, because without any data, you have nothing to transform or load.
So, an improper extract will ruin everything.
Now, a misconception is that the extract will only come from a single source. This is not
always the case; usually, you will find that the majority of ETL processes will pull from
multiple sources with the idea of combining or otherwise manipulating the data into an
easier-to-read dataset. This can be from relational data sources or non-relational sources,
including many different structures and formats. This leads to the next step, transform,
which takes this big mess of raw data and makes it meaningful.

Transform
Now that we have all the data, we need to manipulate and restructure/reformat the raw
data we have. The number of possibilities here is really awe-inspiring. Depending on
the capabilities of your tool and the data you extracted, the sky is the limit. Refer to the
following list of some of the major potential actions in the transform step:
 • Aggregation of data (sum, average, and so on)
 • Transposing or pivoting of data (columns into rows or rows into columns)
 • Splitting or creating delimited lists
 • Joining of data
 • Filtering, cleansing, and deduplication
 • Sorting or ordering
 • Calculations of new values
 • Translating coded values for easier human reading
 • Limiting the columns to be used and removing those that are not necessary for
   the target
 • Data validation and auditing
 • Removing, encrypting, or protecting sensitive information
 • Adjusting formatting and the structure to match the target's schema
This step is where the magic happens and can take a load of raw material and turn it into
an amazing work of art. Depending on the required actions though, this can also be the
largest process and time draw, so you need to make sure to keep efficiency and simplicity
in mind while planning your transforms. Once you have your beautiful data, you now just
need a place to put it.

Load
In this last step, the data that we have manipulated and transformed is now moved from
the staging area where we made our changes and pushed into the end target. This can be
any data store, including a flat file or system or a data warehouse. The load can be ingested
in different ways, but in general, it is usually one of the following three options:
 • Overwrite: Completely removes all existing data in the target first, and then loads in
new data.
• Append: Adds all new data to the target, creating a new record for each row. If there
are duplicates with existing records, this can cause an error. It does not require
a primary key(s), but without a primary key, this can lead to duplication.
• Add and update: Adds all new data to the target, creating a new record for each
row, but if there is an existing record, it will update that row with the new data. This
requires a primary key(s) to be defined.
As simple as this sounds, depending on the requirements of the organization and the
system, this process can vary dramatically. This can include frequency, size limitations,
timeout limits, logging, and historical audit trails. During the load, the transformed data
will be compared against the target's schema, and if they do not match,
it could throw an error either for the whole process or for that individual record.

How is ETL used?
Traditionally, ETL tools are utilized to combine structured and unstructured data
from different sources with the goal of loading them into an end target. Through this
combination of data into new structures and formats, insights that might have otherwise
been hidden can surface through your reporting or analytics. This is also useful for
migration from a local system to the cloud, or can even be used to connect partners
or vendors in a single system.
ETL can also be used to improve data quality by means of the standardization and
automation of processes that move data from a source to a targeted data warehouse.
This reduces the likelihood of dirty data being passed to the warehouse.
\f Activities for data in Automation Studio 137
One aspect that is important as regards Salesforce Marketing Cloud is the ability to draw
out data, filter and segment it, and reduce columns to create for yourself
a sendable audience for each send from a large master list. This capability allows for
much better efficiency, customization, and segmentation of your email campaigns.
That being said, SFMC is not designed to be an ETL system and is not to be utilized
for major or frequent ETL needs. It is optimized for quick segmentation requirements
and more tactical-type manipulations, but once you get to transforming or otherwise
interacting with large volumes of data or high-frequency interactions, it tends to fall flat.
To this end, there are limitations that need to be considered for these types of activities
within the platform and you should consider doing these actions prior to bringing them
into SFMC or inside an external tool that is designed for these actions. This will make
your process much more stable and efficient. Now, let's dig into how we can accomplish
these ETL actions inside Automation Studio and Salesforce Marketing Cloud.

Activities for data in Automation Studio
As you may notice, I am exclusively mentioning Automation Studio for ETL capabilities
in Salesforce Marketing Cloud. There are some other capabilities in other studios
or apps related to it, but they are much more simplified versions with very little power
or customization to them.
For instance, some would argue that the filters, measures, and groups inside Email Studio
are great examples of ETL and data manipulation, and they would be right … to an extent.
Although these are great functionalities, they are better viewed as more support-type
functionalities instead of being defined directly as ETL, as they are not easily automated.
The same is true for some actions and possibilities in Journey Builder, and although those
are closer to automation and being powerful, a lot of it is just using a tool in a way it was
not intended to be used instead of just using Automation Studio in the first place. So,
rather than potentially sharing bad practices and habits, I am going to stick to Automation
Studio, which is designed for this type of thing.
Inside Automation Studio, there are very specific activities that are utilized for powerful
data manipulation and ETL processes. These activities are unique to Automation Studio
and are not easily replicated anywhere else in Marketing Cloud. The following is
a quick list of the Automation Studio activities that are related to ETL:
 • SQL Query activity
 • Import File activity
 • File Transfer activity
 • Filter activity
• Data Extract activity
 • Tracking Extract activity
You may notice that I do not have the Script activity listed there. This is deliberate
as although the Script activity can do some awesome things in the same way as ETL
processes, it is done indirectly via a script and not directly via an ETL process. So, think
of the Script activity as ETL-related, but not a direct ETL activity.
As we progress in this chapter, I will be diving deeper into each one of these activities
and how they relate to data and ETL automation in the Salesforce Marketing Cloud.
This section is a quick introduction to the activities and to ETL in Automation Studio in
general. Some of the other activities may also be related to ETL, just not as strongly
or directly as these. For instance, a couple of the refresh activities that rerun filters on
lists are related to ETL, but are very limited in usage and capability.
For the majority, each of these activities only handles some aspects of ETL, whether
it's just the extract part or the load part. As I go through each in the sections that follow,
I will be sure to mention what steps each one is capable of doing. To start things off, we are
going to go with the strongest data manipulation activity in all of Automation Studio and
all of Salesforce Marketing Cloud – the SQL Query activity.

SQL Query activities
Now, you may well be thinking, "Hey, SQL is not ETL, it's a query language … that is
structured." Well, you would be 100% correct and would also have awkwardly explained
what the acronym SQL stands for (Structured Query Language). However, it seems odd
to start things off after all the ETL talk with something that is not really directly ETL,
right? Maybe a little, but I promise that there is a method to my madness here.
With the background we have in basic ETL, we can now jump right into the possibilities
and power of SQL Query activities, which are, in my opinion, the most powerful and
capable data manipulation option in all of Marketing Cloud. It can be a love/hate
relationship, but there is no denying the power these activities bring to you as the
end user.
So, why do we want to jump in right away? Why not structure it with all the ETL topics
first and then end on SQL? Well, in all honesty, it's because I am excited and wanted
to write this section as soon as possible! Oh, and because understanding the different
manipulation and querying capabilities of SQL will make for easier understanding of
some of the following activities and options.
\f SQL Query activities 139
Also, at least in my opinion, SQL Query activities in Marketing Cloud are essentially
doing all three steps of ETL each time they are run. Let's take a look at the steps:
 1. Write out SQL that has data sources listed (FROM), or Extract.
2. Write out the SQL that defines relationships, formats, data types, manipulations,
combinations, or Transform.
 3. It then pushes this new data into a custom object (a data extension) inside the
platform, or Load.
Well, look at that: all three steps. Sounds like an ETL process to me! Now, is it the most
efficient or powerful way to accomplish this? Nope, not at all. However, it does offer great
agile ETL capabilities in a highly customizable way.
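To make that mapping concrete, here is a minimal sketch of the SQL a Query activity might hold, with each ETL step annotated as a comment. The [MyMasterDE] data extension and its fields are hypothetical examples, not platform objects:

```sql
SELECT m.CustID AS SubscriberKey,      /* Transform: rename fields via aliases */
       LOWER(m.Email) AS EmailAddress  /* Transform: normalize formatting */
FROM [MyMasterDE] m                    /* Extract: the source data extension */
WHERE m.Email IS NOT NULL              /* Transform: filter out unusable rows */
/* Load: the activity pushes this result set into the target data extension
   chosen in its configuration, using the selected data action. */
```

The load step is not written in the SQL itself; it is handled by the activity's target data extension and data action settings.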
Although Marketing Cloud SQL Query activities are limited to just utilizing SELECT
statements (queries), which takes away a good portion of the language, those query
statements have a ton of different functions, utilities, and capabilities that can handle
any need you would have that would reasonably be expected from a non-database and
non-analytic tool.

What is a SQL Query activity?
As defined in the official docs, a SQL Query is an activity that retrieves data extension
or data view information matching your criteria, and then includes that information in
a data extension (https://help.salesforce.com/s/articleView?id=mc_as_using_the_query_activity.htm&type=5&language=en_US). Basically,
the Marketing Cloud SQL Query activity is a way to execute SQL statements inside the
platform, pulling from data extensions and data views to fill a target data extension.
As a note, data views are not targetable or able to be edited by the user.
Marketing Cloud SQL Query activities run T-SQL on Microsoft SQL Server 2016,
or at least more or less. There are some aspects of SQL Server 2016 that do not work
in Marketing Cloud, but it is a good general guideline to establish what functions and
capabilities are possible in a SQL Query activity.
The following screenshot shows us what this activity looks like inside Marketing Cloud:
 Figure 5.1 – Example of the UI for a Query activity
There are a ton of different considerations, syntax requirements, and known issues related
to SQL Query activities. Listing them all would probably constitute a book in itself, and
it is already pretty well documented in the official docs or on community blogs
and developer forums such as Salesforce Stack Exchange, so I will not be including those
here. Instead, I will be concentrating on uses for the SQL activity and its relation to data
automation in Marketing Cloud.

Uses of the SQL Query activity in automation
If you go up to the previous Transform section and view that list there, that is pretty much
everything you can do with the query activity.
In my opinion, this can be shortened into
three different categories:
 • Data segmentation and preparation for sending
 • Custom reporting
 • Transforming of data in preparation to export
These three categories have many different subcategories that fit within them, but I feel
that is a pretty good broad definition of the capabilities of the SQL Query activity in
relation to data automation.
Data segmentation and preparation for sending
This category is fairly broad as it encompasses all manipulation of data with the intent of
preparation and segmentation for sending a message. This could be something as simple
as breaking apart a master data extension into specific sendable audiences for your email
campaigns, all the way to creating relational data that is used in your emails to properly
customize your emails with the most up-to-date information.
Even something as simple as splitting an audience into sendable segments can become
astoundingly complex depending on your needs and wants. With the 30-minute timeout,
your query has the potential to need to be split into multiple queries with intermediary,
or staging, tables due to complexity and volume. This is on top of the potential need
for multiple joins and stringent WHERE conditionals, as well as aggregate calculations
of information.
Here is a great example of segmentation from a master data extension:

Email newsletter audience
First, you would need to create a target data extension with the following information:
 • Data Extension Name: MonthlyNewsletter
 • Data Extension External Key: MonthlyNewsletter
 • Data Extension Fields and Properties:
 Table 5.1 – Fields and properties of the MonthlyNewsletter data extension
Then, you would create a SQL Query activity in Automation Studio:
 • Query Name: MonthlyNewsletter
 • Query External Key: MonthlyNewsletter
 • Target Data Extension: MonthlyNewsletter
 • Data Action: Overwrite
 • SQL:

    SELECT CustID as SubscriberKey,
           Email as EmailAddress,
           FName as Firstname,
           LName as LastName,
           FavStoreNum,
           Newsletter_Subscribe as NewsSub
    FROM [MyMasterDE]
    WHERE FavStoreNum IS NOT NULL
          AND Newsletter_Subscribe = 1
          AND (Unsub = 0 OR Unsub IS NULL)

This will output a new data extension with only the audience that matches the
criteria
we listed. This criterion is looking to ensure the following:"," • That they have a favorite store (FavStoreNum) listed"," • That they are subscribed to the newsletter (NewsSub)"," • And that they are not unsubscribed (Unsub)","This will reduce our master data down to the specific audience we want to send to,
allowing us to prepare and transform the sendable audience easily in the platform.
Speaking of transform, you will also note that I am adding aliases to the fields from
MyMasterDE for the target data extension to allow for an easier-to-understand and
more consistent naming convention for our sendable audience. Next up, we will explore
preparation for exporting through transformation.
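Before moving on, it is worth sketching the staging-table pattern mentioned earlier for segmentations that risk the 30-minute timeout. This is a minimal sketch rather than the book's example: the staging data extension name (MonthlyNewsletter_Stage) is hypothetical, and each SELECT would live in its own Query activity, chained in the same automation.

```sql
/* Query activity 1 (Data Action: Overwrite into the hypothetical
   MonthlyNewsletter_Stage DE): do the cheap filtering first so any
   heavier work later runs against a much smaller table. */
SELECT CustID,
       Email,
       FName,
       LName,
       FavStoreNum
FROM [MyMasterDE]
WHERE Newsletter_Subscribe = 1
  AND (Unsub = 0 OR Unsub IS NULL)

/* Query activity 2 (Data Action: Overwrite into the sendable DE):
   apply the remaining conditions and aliases against the staging DE. */
SELECT CustID as SubscriberKey,
       Email as EmailAddress,
       FName as FirstName,
       LName as LastName,
       FavStoreNum
FROM [MonthlyNewsletter_Stage]
WHERE FavStoreNum IS NOT NULL
```

Because each Query activity gets its own 30-minute window, splitting the work this way keeps each individual step comfortably inside the limit.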
Following are a couple of reference screenshots to help give context inside Marketing Cloud:
 Figure 5.2 – Screenshot of the MonthlyNewsletter data extension
As you can see in the preceding screenshot, all the fields, the external key, and the name
are filled in the correct locations inside the Marketing Cloud MonthlyNewsletter data
extension. Next, let's take a look at the Query activity:
 Figure 5.3 – Screenshot of the MonthlyNewsletter Query activity overview
The Query activity sample shows the final SQL stored inside the activity along with the
correct target, data action, name, and external key. Next up is a look into the options for
custom reporting in Marketing Cloud SQL activities.
Custom reporting
My definition of custom reporting is a grouping of data for tactical analysis and oversight,
usually created and stored in a data extension and run periodically. This can include
something like a report on Recent Click Activity in the past 7 days or Overview of all
email campaigns sent, or any of a hundred other things, including custom KPI reports.
Usually, custom reporting is utilized directly in the platform, exported out as a file,
or iterated through and displayed in an email or cloud page. As mentioned earlier, these
tend to be more focused on tactical analysis and oversight and less on the broader strategy
analytics. The higher-level analysis is usually done in a separate analytics tool that views
things from an enterprise level instead of just focusing on data available in the Marketing
Cloud only.
Following this, I am going to share an example of custom reporting SQL activities to help
give further context on what this means.
Recent click activity in the past 7 days
First, you will need to create a target data extension with the following information:
 • Data Extension Name: ClickPast7Days
 • Data Extension External Key: ClickPast7Days
 • Data Extension Fields and Properties:
 Table 5.2 – Fields and properties in the ClickPast7Days data extension
Then you would create a SQL Query activity in Automation Studio with the
following details:
 • Query Name: ClickPast7Days
 • Query External Key: ClickPast7Days
 • Target Data Extension: ClickPast7Days
 • Data Action: Overwrite
 • SQL:
 SELECT j.JobID,
j.EmailName,
s.EmailAddress,
c.SubscriberKey,
c.LinkName,
c.LinkContent,
c.IsUnique,
j.DeliveredTime as SendTime,
c.EventDate as ClickTime
FROM [_Job] j
LEFT JOIN [_Click] c
ON j.JobID = c.JobID
LEFT JOIN [_Subscribers] s
ON c.SubscriberKey = s.SubscriberKey
WHERE c.EventDate > DATEADD(day, -7, GETDATE())
This will then output a dataset of people who have clicked on an email in the last 7 days.
This list will show click information at the Link level, meaning if the person clicked
multiple times on the email on multiple links, each one of those clicks will be in
this report. Next up, let's take a look at this example inside Marketing Cloud.
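As a quick aside, if you only need one row per subscriber per job rather than every link-level click, a variant of the query above can lean on the IsUnique flag in the _Click data view. This is a sketch only; the exact IsUnique behavior is something to verify in your own account before relying on it.

```sql
SELECT j.JobID,
       j.EmailName,
       s.EmailAddress,
       c.SubscriberKey,
       c.EventDate as FirstClickTime
FROM [_Job] j
INNER JOIN [_Click] c
    ON j.JobID = c.JobID
INNER JOIN [_Subscribers] s
    ON c.SubscriberKey = s.SubscriberKey
WHERE c.EventDate > DATEADD(day, -7, GETDATE())
  /* IsUnique flags a subscriber's first click for a send, which
     collapses the report to roughly one row per subscriber per job */
  AND c.IsUnique = 1
```

Note that the LEFT JOINs become INNER JOINs here, since we only care about rows that actually have a click attached.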
In the following screenshot, you can see what a ClickPast7Days data extension screen
looks like:
 Figure 5.4 – Screenshot of the ClickPast7Days data extension
As you can see, this pulls in the correct fields, names, and external keys, as we listed
previously. Now that we have seen an example of the target data extension, we should
validate the Query activity. This next screenshot shows the completed SQL Query activity
in Marketing Cloud:
 Figure 5.5 – Screenshot of the ClickPast7Days Query activity overview
The Query activity sample shows the final SQL stored inside the activity along with the
correct target, data action, name, and external key. Once you click Finish here, it will
finalize this Query activity and let it be used in automations or run once as a singular
action. Next, we will be diving into transforming data in preparation to export.
Transforming data in preparation to export
This final category is focused on exporting data. Now, you may ask, why would there be
a category for export? Well, that is because most places will export data to an analytics tool
or data lake outside Marketing Cloud where they will take a more enterprise view and not
the tactical view we get inside Salesforce Marketing Cloud.
However, the data we have in Marketing Cloud is not always in the same format
or structure as the target, so there will need to be a lot of transforming and massaging
of data. This is probably one of the most important categories as it is usually vital to
your KPI measurements and analytics beyond just email marketing and messaging from
Marketing Cloud and, if it is set up incorrectly, can corrupt all of this, leading to massive
failures in strategy due to inaccurate data.
A good portion of this is likely to be focused on the tracking data (Opens, Clicks, and
Bounces), which can easily be grabbed via data views or through tracking extracts. So,
why would we need a category on SQL queries for something fairly simple? Because,
although gathering the info is easy, preparing it for export is not. You may have to
do some fancy footwork through joins or the APPLY operator, as well as change data
types and calculations, to get the information into a shape that fits your target.
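As a small illustration of that kind of massaging, the sketch below converts a date into the fixed text format a downstream system might require and collapses a nullable flag into a hard 0/1. The LastPurchaseDate field is hypothetical and purely for illustration.

```sql
SELECT CustID,
       /* target system expects yyyy-mm-dd text rather than a datetime */
       CONVERT(varchar(10), LastPurchaseDate, 23) as LastPurchaseDate,
       /* collapse NULLs so the value fits a NOT NULL numeric column */
       ISNULL(Unsub, 0) as Unsub
FROM [MyMasterDE]
```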
For instance, the following section takes tracking information that we have already
gathered and combines this into one giant bulk file with all the most recent tracking dates
that can be exported to your data lake.
Tracking data exports to the data lake
First, you need to create a target data extension with the following information:
 • Data Extension Name: DataLake_Extract
 • Data Extension External Key: DataLake_Extract
 • Data Extension Fields and Properties:
 Table 5.3 – Fields and properties of the DataLake_Extract data extension
Then you create a SQL Query activity in Automation Studio:
 • Query Name: DataLake_Extract
 • Query External Key: DataLake_Extract
 • Target Data Extension: DataLake_Extract
 • Data Action: Overwrite
 • SQL:
 SELECT s.AccountID,
s.JobID,
s.ListID,
s.BatchID,
s.SubscriberID,
j.EmailName,
j.DeliveredTime as SendDate,
sub.EmailAddress,
sub.Status,
s.EventDate as SentDate,
o.OpenDate,
c.ClickDate
FROM [SentDV] s
LEFT JOIN [JobDV] j
ON s.JobID = j.JobID
LEFT JOIN [SubscribersDV] sub
ON s.SubscriberID = sub.SubscriberID
CROSS APPLY(
SELECT TOP 1 op.JobID,
op.ListID,
op.BatchID,
op.SubscriberID,
MAX(op.EventDate) as OpenDate
FROM [OpenDV] op
WHERE op.JobID = s.JobID
AND op.ListID = s.ListID
AND op.BatchID = s.BatchID
AND op.SubscriberID = s.SubscriberID
GROUP BY op.JobID, op.ListID, op.BatchID,
op.SubscriberID
) o
CROSS APPLY(
SELECT TOP 1 cl.JobID,
cl.ListID,
cl.BatchID,
cl.SubscriberID,
MAX(cl.EventDate) as ClickDate
FROM [ClickDV] cl
WHERE cl.JobID = s.JobID
AND cl.ListID = s.ListID
AND cl.BatchID = s.BatchID
AND cl.SubscriberID = s.SubscriberID
GROUP BY cl.JobID, cl.ListID, cl.BatchID,
cl.SubscriberID
) c
WHERE s.EventDate > DATEADD(day, -7, GETDATE())
This one is a bit of a doozy and can get complicated. There are quite a few different
approaches and thoughts on making this type of extract as well as finding the most
performant way. As a note, even though this example uses a 7-day period, if you have high
volumes of data, then this can time out at the 30-minute runtime max.
 Did You Know?
 Using online resources or even a quick Google query on a SQL function can get
 you detailed information, not just on what the function is, but examples and
 best practices on how to use it.
After completing this query, you would then do a data extract to push a flat file to the
SFTP location of your choice or utilize an API to grab it directly from Marketing Cloud
and push it directly into your data lake.
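If the 7-day window does push this query past the 30-minute limit, one common workaround is to shrink the window per run and let a daily automation build the export up incrementally. The sketch below shows only the adjusted date filter; it assumes the same SELECT and APPLY logic as above, with the Query activity's Data Action switched to Update (which upserts on the target data extension's primary key, so overlapping runs do not create duplicates).

```sql
/* Same SELECT and APPLY logic as above; only the WHERE changes.
   Run daily with Data Action: Update so each run only has to
   process roughly one day of send activity instead of seven. */
WHERE s.EventDate > DATEADD(day, -1, GETDATE())
```

For this to work, the target DataLake_Extract DE would need a primary key that uniquely identifies a send row (for example, JobID plus SubscriberID).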
Following are a couple of reference screenshots to help provide context inside
Marketing Cloud:
 Figure 5.6 – Screenshot of the DataLake_Extract data extension
As you can see in the preceding screenshot, all the fields, external keys, and names are
filled in the correct locations inside the Marketing Cloud DataLake_Extract data
extension. Let's take a look at the Query activity next.
 Figure 5.7 – Screenshot of the DataLake_Extract Query activity overview
The Query activity sample shows the final SQL stored inside the activity, along with
the correct target, data action, name, and external key. Now that we have a pretty solid
understanding of these queries, we should take a brief moment to review SQL.
SQL summary
As you can see from the previous section, there are a lot of uses for SQL and, in all
honesty, if you are doing any transforms to data, then 9 times out of 10, you will wind up
using a SQL Query activity to do it. It is the most efficient way of transforming data,
with the lowest processing requirements of all the available script and UI tools.
and can be used in combination with many of the following activities to make an
invaluable automated process. During our discussion on SQL so far, we went over the
three major categories of what SQL can accomplish:
 • Custom reporting
• Data segmentation and preparation for sending
• Transforming data in preparation to export","These three categories are a good umbrella of the possibilities afforded by SQL, but inside
these categories are hundreds of subcategories that can range from simple to astoundingly
complex. So, although only three are listed, the usefulness and capabilities of SQL are
far beyond this explanation. Feel free to review sites such as W3Schools
(https://www.w3schools.com/sql/default.asp) or the official Microsoft docs on T-SQL
(https://docs.microsoft.com/en-us/sql/sql-server/?view=sql-server-ver15)
to get more details and an in-depth understanding of SQL capabilities.
Now, although SQL is pretty darn great in Marketing Cloud, especially regarding
automation, there are other options available for us to use. Most of these focus on
a specific aspect of ETL, or are not able to be automated, or at least automated easily.
For instance, in our next section, we will go over Filter activities and other data
segmentation, including groups and mobile lists.
Filter activities and data segmentation
So, not all of us have the development and technical knowledge to write SQL and that
is more than OK! Marketing Cloud helps account for that with things such as Filters,
Groups, and filtered mobile lists. There are also things called Measures, which can be
used inside filters and are intended to be drag and drop SQL, but, in my opinion, they
are too unstable and potentially inaccurate to be useful. To that extent, I will not be
mentioning much about them here.
As you may or may not know, each of the three options is specific to a data source;
for example:
 • Filters: Data extensions
 • Groups: Lists (in Email Studio)
 • Filtered mobile lists: Mobile lists (in Mobile Studio)
To that extent, you will notice that the UI tools and options are all specialized and, in
some cases, such as Groups and mobile lists, are not possible in SQL queries. The closest
option to SQL queries would be Filters, which is where we will start our discussion.
Filters
This option actually has a few different pieces that make it up. Our focus is going to be
on the Filter activity, but this will also include things such as one-time use filters in UI,
Filter refreshing options, and the differences between each.
Before we get into the Filter activity, I am going to explore the other options first.
Let's start with the one-time use filters in the UI.
One-time use filters
These exist inside Email Studio and can in no way be automated natively. They can still be
useful for automations thanks to their capability of being refreshed through an API call in
scripts. But let's not get ahead of ourselves; more on that later.
So, what the heck is this? A one-time use filter is created when you go to Email Studio
and use the Filter option, which looks like a funnel on the right side of the row, on the
data extension of your choice. From there, you drag and drop the filter criteria you want
and then click Save & Build to create your new filtered data extension. The following
screenshot shows the drag and drop interface for creating filtered data extensions:
 Figure 5.8 – Drag and drop segmentation interface for filtered data extensions
In order to create the filtered data extension, the platform will automatically create
a new data extension. So, after clicking Save & Build, you will need to provide a name,
external key, description, and folder location where you want the data extension to reside.
After all that is completed, you will find your data extension in that folder. It will be very
similar to the normal DE, but you will notice on the right-hand side that it now has this
weird arrow circle symbol instead of the normal import and filter symbols.
 Figure 5.9 – Example of a filtered data extension in the UI
So, this arrow circle symbol is actually the symbol to refresh. This is how you would
be able to rerun the filter criteria on the DE. Now, this is literally the only way to update
these data extensions unless you are comfortable with undocumented REST API
endpoints (again, more on that later). This makes these great for quickly debugging
or troubleshooting, but pretty bad for any automation or repetitive tasks.
So, some of you may have noticed that in Figure 5.8, there is a button named Save As
Filter under the Save & Build button. This will actually take your criteria and create
something called a Filter Definition. This does not execute the filter, but instead saves all
the aspects of it inside an object that can be used later. This will be very important when
we talk about Filter activities later. You can also make Filter Definitions under the Data
Filters folder in the Subscribers tab of Email Studio.
Filter refreshing options
As noted in the previous section, this is only related to filtered data extensions and not all
filter possibilities in Marketing Cloud. I mention this because there is no activity inside
Automation Studio to automatically refresh filtered data extensions like there is for lists
and mobile lists. The only way, outside of Filter activities, which we will go over next,
to automatically refresh these data extensions is through an undocumented REST
API endpoint.
 Undocumented Endpoints
One important thing to note about undocumented endpoints is that they are
not officially supported or prepared for use by the public, meaning that
there is a risk that things will break or slow down and Marketing Cloud will
do nothing to help you resolve this issue as you should not have been touching
it in the first place. This means, if you are doing something that is significant
or highly important in production, you likely should not use this method
as there is the risk of failure due to Marketing Cloud deciding to shut it down
completely or otherwise change it, thereby breaking your entire process.
Now that we know the risks and understand that this should not really be used in
a production environment, let's do a quick exploration of this endpoint:
 Method: POST
 Endpoint: /email/v1/filteredCustomObjects/{{filterDEid}}/refresh
Host: {{tenantSubDomain}}.rest.marketingcloudapis.com
Authorization: Bearer {{auth_token}}
 Content-Type: application/json
You will notice here that there are a few pieces that have curly brackets before and after.
These are my placeholders that you need to enter your values in. Following is a description
of what each means:
 • FilterDEid: This is the object ID of the filtered data extension you are targeting.
You can gather this via an API or through some investigation in the UI.
• TenantSubDomain: This is your tenant-specific endpoint that you can find in
your API package or similar places in the UI.
• Auth_Token: This is the OAuth token that is generated via a separate API call to
authenticate and provide context for all other API calls.
As noted, this is the only way to refresh it, and the only way inside Salesforce Marketing
Cloud that you can do this is via an SSJS Script activity or a CloudPage using SSJS. Well,
technically, you can use AMPscript as it is a POST call, but I would highly recommend
SSJS for all API calls where possible. Now that we have seen that the refresh and
automation options for one-time use filters are severely limited, let's move on to
Filter activities.
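To make the mechanics concrete, the following is a rough SSJS sketch of calling this undocumented endpoint from a Script activity or CloudPage. Everything below is unsupported and could break without notice, and the subdomain, API credentials, and filtered DE object ID are placeholders you must supply yourself:

```javascript
<script runat="server">
Platform.Load("core", "1");

// Placeholders: substitute your own installed package credentials,
// tenant subdomain, and the filtered data extension's object ID.
var subdomain = "YOUR_TENANT_SUBDOMAIN";
var filterDEid = "YOUR_FILTERED_DE_OBJECT_ID";
var payload = Stringify({
    client_id: "YOUR_CLIENT_ID",
    client_secret: "YOUR_CLIENT_SECRET",
    grant_type: "client_credentials"
});

// Step 1: get an OAuth token from the documented v2 auth endpoint
var authReq = new Script.Util.HttpRequest("https://" + subdomain + ".auth.marketingcloudapis.com/v2/token");
authReq.method = "POST";
authReq.contentType = "application/json";
authReq.postData = payload;
var token = Platform.Function.ParseJSON(String(authReq.send().content)).access_token;

// Step 2: POST to the undocumented refresh endpoint (no body required)
var refreshReq = new Script.Util.HttpRequest("https://" + subdomain + ".rest.marketingcloudapis.com/email/v1/filteredCustomObjects/" + filterDEid + "/refresh");
refreshReq.method = "POST";
refreshReq.contentType = "application/json";
refreshReq.setHeader("Authorization", "Bearer " + token);
var result = refreshReq.send();
</script>
```

Again, because this endpoint is undocumented, treat this purely as an experiment; a Filter activity is the supported way to refresh a filtered DE inside an automation.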
Filter activities
Filter activities can be created inside Automation Studio, but they are reliant on Filter
Definitions being created inside Email Studio. The best part about Filter activities is that
they are native to Automation Studio, so can easily be inserted into an automation. This is
important because the Filter activity will not only create a new filtered data extension, but
will also refresh the filtered data extension if it already exists! This is the only way to utilize
filtered data extensions inside an automation outside of the API, which can be a bit much.
So, what exactly makes up a Filter activity? At its core, a Filter activity is built upon the
filter definition that you create in Email Studio. This defines the data extension you want
to use as a source as well as the filter criteria you want to use. From there, you just assign
the name, external key, and the other details of the new filtered data extension that is
created to house all your data.
 Note
I am focusing on Filter activities in relation to data extensions for this section,
but Filter activities can also be used to create new groups by targeting lists
instead. This will be discussed more in the Groups section following this one.
The following screenshot shows the UI setup wizard around creating a new Filter activity
and selecting the correct filter definition:
 Figure 5.10 – Example of the setup for creating a new Filter activity
After this, you then just complete the Name, External Key, and Description fields of the
new target data extension you want to create, and then the Filter activity is done! Now,
one important thing to keep in mind regarding this is that Filter activities will only create
new data extensions and will not target one-time-use filtered data extensions you already
created. After the initial build, each run will refresh that Filter activity-built DE; that is
the only filtered DE it will refresh. Next up, let's move on to Groups, as they are fairly similar to data
extension filters.
Groups
Groups are the list equivalent of filters for data extensions. These can only be created
inside Email Studio or via a Filter activity (described previously). The difference is that
for one-time use groupings, there is an Automation Studio activity to refresh them
automatically. This can be super helpful as you can create it via the UI and still automate
the refresh without needing to recreate it in a Filter activity and refresh via that activity.
The following screenshot shows the UI for creating a filtered group. You will notice that
this is very similar to the Filter Definition UI.
 Figure 5.11 – Example of the setup for creating a new filtered group
The issue here, however, is that lists are no longer the data storage of choice as data
extensions tend to be much more customizable, powerful, and performant than lists – so
lists are not used beyond subscription status. And things such as publication lists do not
allow you to filter them and create groups, so this severely limits the usefulness of groups.
You can use Groups on All Subscribers, but it does not let you have access to things such
as Subscription Status or anything like that, which once again limits the capabilities.
Long story short, you are not likely to find many people who utilize groups, but if there
is a need, this can be pretty well automated. I tend to find that for whatever I would have
wanted to try and grab through groups, I can find more, if not better, info from the filtered
mobile lists.
Filtered mobile lists
Although this is taking place in MobileConnect (or Contact Builder), the capabilities
of these lists are not limited to just mobile contacts. The other great thing about this is
that mobile lists are technically data extensions, so they are able to be queried in a limited
capacity. This can be huge for some use cases. For instance, if you need to delete contacts
that have no channels, you can use a filtered mobile list to get this done. There are other
ways in which this can be accomplished, such as via a channel address data extract type,
but to highlight the power of filtered mobile lists, we will use this method.
The best part about this is that there is an activity in Automation Studio that allows you
to refresh these lists at whatever schedule or triggered time period you want. This can
be a very powerful tool for keeping your contacts clean and trimmed up. The following
screenshot illustrates the UI for filtered mobile lists:
 Figure 5.12 – The UI for creating a filtered mobile list
As you can see from the screenshot, the other great thing about filtered mobile lists is
that they work based on attribute groups, allowing you to filter based on relational data
and not just what is inside that specific object. If you utilize contacts and attribute groups
for most of your messaging, then this may be a strong way for you to gather segmented
data. The only issue is that the creation of these lists is not able to be automated, just the
refreshing part. So, you would have to manually build each and every one of the mobile
lists you want to use.
Next, we are going to dive into the Import Data options. These options show how to
ingest data that has already been extracted and transformed in Marketing Cloud.
Import data options
For this section, we will be concentrating on just the import options that can be
automated and not discussing the UI import options. The main option for this is the
Import activity, which requires the use of a Marketing Cloud File Location to ingest data,
so there will be some considerations when utilizing it. You can create this File Location
inside the Administrator tab in Email Studio or inside the Setup section.
Essentially, this activity will take a flat file (CSV, TXT, and so on) from the SFTP and then
load it inside a data extension or list, separated by a specified delimiter in the file. This
process can be automated inside the Import activity in Automation Studio and can be
completed via a schedule or via a file drop trigger.
There are other options that can be used to ingest data, such as the manual import wizard
in the user interface, but these are not capable of being automated. Due to that fact, I am
not going to be covering them.
The only other option for ingesting data is by utilizing the Salesforce Marketing Cloud
API. There are options to accomplish this in both the SOAP and REST APIs, but the APIs
are something we will go into further in the book. For now, we will be moving on to the
export data options.
Export data options
Although things were pretty straightforward and simple as regards importing data,
exporting data from Marketing Cloud has tons of options.
Even limiting it to just those
export options that you can automate, you still have a couple of options:
 • Data Extract: Exporting data to a flat file from a data extension.
 • Tracking Extract: Exporting tracking data (open, click, and so on) into a flat file.
 • Other: There are many other options, including Convert XML, GlobalUnsubImport, and Zip Data Extract.
As a note though, I am not going to be diving too deep into the Other section as this is
not really relevant to what we are discussing and I have found most of these options to be
more niche case uses and not something you see utilized very often. Let's now dive into
data extract and take a look at how this can be helpful.
Data extract
This Automation Studio activity extracts information from a data extension and places
it in a delimited file. Honestly, it is pretty simple to use with the options you would
imagine it would have. There is not much to it, but it can be automated, which is greatly
helpful for ingestion by outside systems.
That being said, there are a ton of different data extract options available to help with
your export or file manipulation needs. For the sake of this section, I will limit it to a data
extract from a data extension as well as to tracking extracts.
The part that needs to be considered regarding an extract of a data extension is that after
the extract is run, the flat file is not created in an accessible place; it is first created inside
the Safehouse. You then need to use a File Transfer activity to move the file from the
Safehouse into the file location you want it to go to. Let's dive a bit deeper into the File
Transfer activity before moving forward.","File Transfer activity","File Transfer activity has a ton of uses in Marketing Cloud, but for this, we will be","focusing on the move from extracting from the Safehouse to the target file location. File","Transfer activity has two options that show the type of action you are looking to perform","with the file transfer. The following screenshot shows the two options in the UI:"," Figure 5.13 – Two action options available for File Transfer","\f162 Automating Your ETL and Data","For our needs (data extraction), we will be selecting Move a File From Safehouse.
From there, we need to enter the filename, or file naming pattern, associated with the
exported data, and then select a destination from the file locations stored in your
admin settings.
Next, you will see a few different transfer settings. This is essentially used if you want to
encrypt the file when you move it from the Safehouse to the target file location. The two
options available here are PGP and GPG, and you can select the desired public key from
Key Storage in Admin settings.
After you run the data extract followed by the File Transfer, your file will be sent to the
target location and will be available for ingestion by whatever outside service or data
warehouse you want. Next, let's look at Tracking Extract.
Tracking extract
Tracking extracts pull data directly from the backend of Salesforce Marketing Cloud and
get data that users cannot easily access anywhere else other than via the API. This is
a great way to get information on interaction and tracking data inside Marketing Cloud
and pass it in a raw format to your analytics or reporting system for ingestion.
There are a ton of different options that can be performed in a tracking extract, but the
good news is that you can pick and choose which option(s) you want to include. The
following is a list of the available outputs from a tracking extract:
• Attributes
• Bounces
• ClickImpressions
• Clicks
• Conversions
• ListMembershipChanges
• Lists
• NotSent
• Opens
• SendJobImpressions
• SendJobs
• Sent
• SentImpression
 • StatusChanges
• Subscribers
• SurveyResponses
 • Unsubs
Now, picking and choosing only what you need is very important as tracking extracts are
able to time out. This means that if you try to extract too much data at once, the whole
thing will fail. Now, another part of this that needs to be considered in relation to the
timeout limit is the date range associated with the extract.
Not only can you select the objects to output, but you can select a range of dates that you
want to extract from. The maximum rolling range is 90 days, which means it is a 90-day
look back from the current date, or you can set a specific range up to 30 days in length,
and that range can go back to account inception and is not limited to 6 months, like Data
Views. For accounts that have significant data, this can mean that you need to keep the
date range low for each extract, or it could time out and fail.","Summary
With that, you now have a strong understanding of ETL and data manipulation inside
Marketing Cloud Automation Studio. Whether you are looking to use SQL queries
or import or extract files, you should now have all the information you need to maximize
the use of these activities in your automation.
You have dived deep into what exactly ETL is and now have a strong enough
understanding of it in general, as well as how it relates in Marketing Cloud, that you can
likely impress your friends at the next party with how smart you are. Well, if they are also
into nerdy technical things like us, otherwise they might just give you a blank stare as you
try to explain it.
We also learned that for Transform, the two activities we want to concentrate on are the
SQL Query activity and Filter activities. These offer the best segmentation options,
with SQL Query activities offering a more versatile capability of transforming outside
of just segmentation.
Next, we are going to be moving on to one of my favorite topics in Salesforce Marketing
Cloud – Script activities in Automation Studio. Script activities offer you so many
options and capabilities through the ability to use in-step scripting capabilities inside of
an automation. This includes making API calls and interacting with system objects in
Marketing Cloud, such as emails and data extensions.
\f 6
The Magic of
Script Activities
Data and extract, transform, load (ETL) are a highly impressive part of the capabilities of
Marketing Cloud, but what I feel are the most powerful capabilities all start with the magic
of Script activities. Script activities are part of Automation Studio activities and allow you
to execute server-side JavaScript (SSJS) within automation. Within this chapter, we will
dig into the following topics:
 • Script activities (SSJS and AMPscript): What Script activities are in relation to
Marketing Cloud and automation, as well as how SSJS and AMPscript are used
inside Script activities.
• Overview of SSJS in Marketing Cloud: SSJS is the proprietary language of Marketing
Cloud and is required to be used in a Script activity.
• Oh, the possibilities?! (What you can do in SSJS): This takes that base knowledge of
SSJS and shows you how you can use it and the possibilities it unleashes.
• A real-life example implementation: Now that we know what is possible, let's dig into
a real-life example usage of SSJS inside of a Script activity.
I must admit that Script activities are one of my favorite capabilities of Salesforce
Marketing Cloud (SFMC), so I am very excited to write this and the next couple of
chapters. Script activities are a great introduction to the proprietary language of SSJS
inside of Marketing Cloud, and SSJS and Script activities open up so many doors for
custom actions, implementations, executions, reporting, analytics, and more!
Before we get too far ahead of ourselves, let's start out by defining exactly what a Script
activity is and how to use it.","Technical requirements
The full code for the chapter can be found in the GitHub repository located here:
https://github.com/PacktPublishing/Automating-Salesforce-Marketing-Cloud/tree/main/Chapter06.

Script activities (SSJS and AMPscript)
Inside of Automation Studio, under Activities, you will find the most wonderful activity named Script activity. Now, what makes this so wonderful? Well, it opens the door to doing things beyond what you can do in the user interface (UI) and allows you to build a scheduled script that can run with a 30-minute time-out. OK—cool, but what exactly is it?
A Script activity in Marketing Cloud is an activity that lets you write a script utilizing SSJS inside of a stored object, the Script activity itself, that you can then execute within an automation. You can run it as part of an automation or as the whole automation itself, and you can utilize it with both automation types: Scheduled or File Drop.

Note
If you do not see this activity inside of Automation Studio, contact Salesforce
support or your relationship manager to activate it.

A Script activity is one of the most powerful activities in Automation Studio due to
its ability to include scripting using SSJS. This capability allows you to automate many
wonderful things, including preparing content for sends, creating micro-automations
inside automation, getting external content, storing/utilizing it in automation, and more.
Can I give an example? Sure!
Let's say you have a huge amount of personalized newsletter content that you need to pull
from an external resource. For the sake of simplicity, let's say it's all hosted on an endpoint
that you can target with a REpresentational State Transfer (REST) call. Now, you could
have all this done inside the email in real time with proper scripting in the email, but that
could really bloat the send time to the point that it might make the email unusable,
or even time out or error. This is where a Script activity comes in.
The Script activity would run in the step before you send an email (assuming it is a bulk
send through Automation Studio), whereby it will run through each of the application
programming interface (API) calls to collect the content from the external source and
then take that returned data and store it inside of a data extension (DE). You can then just
reference the DE in the live email, greatly reducing the time it takes to send each email
due to the reduction in processing required.
Now, this is just one of the million uses of a Script activity, but I feel it is a great way to
show how Script activities can make what might be impossible elsewhere in the platform
suddenly become possible. Utilizing SSJS inside of a Script activity opens many doors
that were previously closed.
As you may have noticed in my explanation, I mention it only uses SSJS, and do not
mention AMPscript. Yep! That's right—this will only accept SSJS inside of it. But if you
do want to use AMPscript inside of a Script activity, you have a couple of options
available to you.

AMPscript inside of a Script activity
First is the option of calling in a content block via SSJS that has all your AMPscript in it. The strange thing is that the validator will only allow SSJS, but if you trick it, it will run AMPscript perfectly. Now, this may, at face value, seem like: What is going on? Why would they not just open it to both if it can process both?
Well, I do not have a definitive answer, but I do have my theory. I think this works because the AMPscript in the content block is rendered before it enters the Script activity environment. This means that the Script activity is not processing the AMPscript—it is just receiving the results.
Here is an example of using a content block call to pull in AMPscript:

Script activity
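The original listing was lost in this copy, but a minimal sketch of what such a Script activity could contain, assuming the standard Platform core library and our ampscriptContentBlock external key, is as follows (this only runs inside Marketing Cloud):

```javascript
<script runat="server">
    Platform.Load("core", "1.1.1");
    // Pull in a code snippet content block whose body is pure AMPscript.
    // Per the theory discussed here, the AMPscript is rendered before it
    // reaches the Script activity, so the validator only ever sees SSJS.
    var rendered = Platform.Function.ContentBlockByKey("ampscriptContentBlock");
</script>
```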
In this example, ampscriptContentBlock represents a content block from Content Builder holding all your AMPscript. I would highly recommend using a code snippet content block only for this, as the added HyperText Markup Language (HTML) that comes from most other content blocks might cause issues.
So, say—for example—we had a content block with the ampscriptContentBlock external key and it had the following AMPscript in it:

AMPscript content block
%%[
  SET @sde = "mySourceDE"
  SET @tde = "myTargetDE"
  SET @rs = LookupRows(@sde, "HasBeenRun", "False")
  SET @rc = RowCount(@rs)
  FOR @i = 1 TO @rc DO
    SET @row = ROW(@rs, @i)
    SET @pkey = FIELD(@row, "pkey")
    SET @val2 = FIELD(@row, "val2")
    SET @val3 = FIELD(@row, "val3")
    UPSERTDATA(@tde, 1, "pkey", @pkey, "val2", @val2, "val3", @val3)
  NEXT @i
]%%

This would then take data from mySourceDE and push the specific fields called out
(pkey, val2, and val3) into the target (myTargetDE) via the UPSERTDATA function.
This would be run every time the automation runs, allowing for you to automate some
scripted updates.
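For comparison, the same transform can also be written natively in SSJS rather than AMPscript. This is only a sketch (it assumes the same DE and field names and only runs inside Marketing Cloud):

```javascript
<script runat="server">
    Platform.Load("core", "1.1.1");
    // Copy the not-yet-run rows from the source DE into the target DE,
    // mirroring the LookupRows/UPSERTDATA loop in the AMPscript block.
    var rows = Platform.Function.LookupRows("mySourceDE", "HasBeenRun", "False");
    for (var i = 0; i < rows.length; i++) {
        Platform.Function.UpsertData("myTargetDE",
            ["pkey"], [rows[i]["pkey"]],
            ["val2", "val3"], [rows[i]["val2"], rows[i]["val3"]]);
    }
</script>
```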
The second option is to do it inline inside of the SSJS. This sounds odd, but it actually
works really well, especially when you want to use AMPscript functions that are not
replicated in SSJS without creating them yourself. Here is a basic example showing the use
of the ProperCase function, which does not exist inside of SSJS:","SSJS inline AMPscript"," ","This code will try to do a HyperText Transfer Protocol (HTTP) GET request to the
https://www.example.com address, and if it works, it will then store the results
in the resp variable. If it fails, though, it will instead write a stringified version of the
returned exception object to the page (this does not work for Script activities). This is
a great way to help figure out what is erroring out, as most times on a CloudPage, it will
only return a 500 error and give no details as to why it failed.
Another note around try...catch statements is that any variables declared and
defined inside of them are considered local variables and cannot be used outside of that
statement. In order to save the changes made in a try...catch statement, you need to
set those variables globally prior to the statement, much like you would in a function. Now
that we have covered the major native capabilities, let's dig into some of the proprietary
libraries inside of Marketing Cloud SSJS.
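The scoping point can be demonstrated with plain JavaScript. In this sketch, the platform HTTP call is replaced by a stand-in function so the pattern is visible outside of Marketing Cloud:

```javascript
// Stand-in for the platform HTTP GET call; in a real Script activity
// this would be the call that might throw an exception.
function fakeHttpGet(url) {
    return { StatusCode: 200, Content: "ok" };
}

// Declare the variables BEFORE the try...catch so the results survive
// outside of the statement.
var resp = null;
var errorMsg = null;

try {
    resp = fakeHttpGet("https://www.example.com");
} catch (e) {
    // Stringify the exception so there is more detail than a bare 500.
    errorMsg = JSON.stringify(e);
}
```

Because resp and errorMsg are set before the statement, whatever is assigned inside the try or catch block remains usable afterward.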
More capabilities than we can mention
There are a ton of different capabilities and features native to JavaScript that are part of Marketing Cloud's SSJS that we could discuss, such as switch statements, eval, error handling, and more. For now, I want to give a small overview of a platform-specific capability that will come up in more detail in the next chapter. For all the other capabilities, I would look at the official JavaScript documentation or any of the tons of great blogs and forums out there. For now, let's learn more about WSProxy.

WSProxy
WSProxy is essentially a JavaScript version of making SOAP API calls within Marketing
Cloud that receive and return using JSON objects and arrays. This helps to simplify and
optimize API calls to platform objects to get and interact with the data and content stored
there. I can tell you that this is one of the best features that have ever been released for
Marketing Cloud developers.
Now that we have a strong overview of what SSJS can do, let's dig into an example Script
activity utilizing SSJS.

A real-life example implementation
So, now that we have a good understanding of SSJS in Marketing Cloud, let's take a real-life example to run through. Next is a script that you can run inside an automation to create a log of the queues inside your triggered sends. Now, to help ensure it only captures those that are relevant, we will be placing a minimum number queued in order for it to be considered a failure and be logged.
I am sharing the full file inside of a GitHub repository for easy access here: https://github.com/PacktPublishing/Automating-Salesforce-Marketing-Cloud/tree/main/Chapter06. You will notice that this block utilizes arrays, objects, functions, and WSProxy. Because of the length and complexity of the script, I will be breaking it out into sections to explain it.

Setup
So, basically, you would put the GitHub script inside of a Script activity in Automation
Studio, set it in automation, and have it run hourly. It will then grab the triggered sends
that have a queued value that is above 500 and store it in a DE. Now, as a note, it will
not upsert to the DE if there were no failures, so it will not bog down the log DE with
unnecessary empty entries.
In order to use this inside your SFMC instance, you will need to create a DE to hold the
log data. For the sake of the easiest storage possible, we will be just using one field to hold
a stringified array listing the failed triggered send definitions and their queues. Here's
what you need in order to set up this DE:"," • Data Extension Name: myQueryQueue_LogDE"," • Data Extension External Key: myQueryQueue_LogDE"," • Data Extension Fields and Properties: These are listed in the following table:"," Table 6.1 – Fields and properties in myQueryQueue_LogDE DE","Then you can go back and reference this DE periodically to verify time periods where","your triggered sends may be queueing up more than they should be or if there are any","issues where the trigger starts to fail and queues up instead of erroring.","Now, let's dig into each part, starting with the functions.","Functions
The functions are a great place to start as they are the workhorse of the script. Now, in
this case, we did not need to have these built out as functions as they are only being called
once. As I stated earlier, though, I like to create functions whenever and wherever I can
because of the ability to reuse them easily. The first function we are going to look at is the
getTSDKeys function.","getTSDKeys function
As you can see in the following code snippet taken from the full block of code,
the function is actually pretty simple. The function utilizes an input (the MID, or member ID,
of the business unit (BU) you want to look into) to run a WSProxy call to the
TriggeredSendDefinition SOAP object to gather CustomerKey (also known as
the external key) of each triggered send definition in that account. This being said, in the
current state, it will only return up to 2,500 triggered sends, so it is recommended to pass
in a filter or create pagination if you have more than 2,500 triggers.
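That pagination recommendation deserves a quick illustration. WSProxy itself only exists inside Marketing Cloud, so the proxy below is a stub that fakes two batches; the HasMoreRows, RequestID, and getNextBatch names follow the documented WSProxy retrieve interface:

```javascript
// Illustrative stub that mimics WSProxy returning results in batches
// (up to 2,500 records per batch) via HasMoreRows/RequestID.
var stubProx = {
    retrieve: function (type, cols, filter) {
        return { Results: [{ CustomerKey: "TriggerA" }, { CustomerKey: "TriggerB" }],
                 HasMoreRows: true, RequestID: "req-1" };
    },
    getNextBatch: function (type, requestId) {
        return { Results: [{ CustomerKey: "TriggerC" }],
                 HasMoreRows: false, RequestID: "req-1" };
    }
};

// Collect every CustomerKey across all batches.
function getAllTSDKeys(prox) {
    var keys = [];
    var res = prox.retrieve("TriggeredSendDefinition", ["CustomerKey"], null);
    while (res) {
        for (var i = 0; i < res.Results.length; i++) {
            keys.push(res.Results[i].CustomerKey);
        }
        res = res.HasMoreRows
            ? prox.getNextBatch("TriggeredSendDefinition", res.RequestID)
            : null;
    }
    return keys;
}

var allKeys = getAllTSDKeys(stubProx);
// allKeys → ["TriggerA", "TriggerB", "TriggerC"]
```

Inside Marketing Cloud, you would pass the real prox from the global variables instead of the stub; the loop shape stays the same.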
Let's take a look at the code again to remind ourselves:

function getTSDKeys(mid) {
    /* Set ClientID */
    if (mid) {
        prox.setClientId({ "ID": mid }); // Impersonates the BU
    }
    var cols = ["CustomerKey", "TriggeredSendStatus"];
    var res = prox.retrieve("TriggeredSendDefinition", cols, filter);
    return res;
}

As you may have noticed, this function does not set a new WSProxy initiation on each
run. This is because it is initiated in the global variables, and to limit processing power
and increase efficiency and performance, we keep reusing the same initiation. The rest is
pretty straightforward, with prox.setClientId setting the correct BU environment
(this only works from the parent BU to the child BU) and then the execution of the call.
Next up, we will look into the getTSDQueue function, which is a bit more complex than
the getTSDKeys function.","getTSDQueue function
The following code snippet will show the getTSDQueue function taken from the full
code block we shared previously. This code is used to gather the number of queued
individuals for that specific triggered send definition. This is handled through WSProxy
as well, hitting the TriggeredSendSummary SOAP object.
Let's look at the code for this function outside the context of the rest of the Script activity,
as follows:
function getTSDQueue(customerKey) {
    var cols = ["CustomerKey", "Queued"];
    var filter = {
        Property: "CustomerKey",
        SimpleOperator: "Equals",
        Value: customerKey
    };
    var res = prox.retrieve("TriggeredSendSummary", cols, filter);
    var queue = res.Results[0].Queued;
    return queue;
}

Similar to the getTSDKeys function, this function does not set a new WSProxy initiation
on each run to limit processing power and increase efficiency and performance, and also
uses prox.setClientId to set the correct BU environment.
The major difference, outside targeting different SOAP objects, is that this one uses a filter.
This filter limits the results to a single data object—the one that matches the customerKey value
we gathered from the previous function. This means that to get the queue number for each
of the keys returned, you will need to loop and iterate through this function. It is also the
reason why the returned variable queue returns a specific part of the results and not the
whole thing, unlike the getTSDKeys function.
Now that we have finished with functions, let's take a look at the global variable setting.
Global variables
This section of the code may be small, but it is highly important as it sets the stage for you
to be able to execute the statements you set later inside the code block. Let's take a look at
the variables first to refresh our memory:

// Global variables
var prox = new Script.Util.WSProxy();
var mid = 123456;
var failures = 0;
var allTriggers = 1; // All triggers in BU, or false (0) if you
                     // want specific filters only
var alertArray = [];
we set the default for our two collection variables, alertArray and failures. This
also sets whether we want all triggers from this account or if we want just specific ones.
This logic is further down and will be addressed shortly. Each one of these variables is vital
to the entire script, and if one were missing or set incorrectly, the whole script would fail.
Let's next take a look at the allTriggers or specific triggers' logic.

allTriggers logic
This section of code is where we determine whether we will use an array of all the
triggered send definitions or whether we only want to use specific ones we listed. This
logic is based on the allTriggers global variable. If this is set to 1, or true, then
we will use the getTSDKeys function and gather all the keys into an array. If not, then
we will use the prepared array we have set in the else statement. By using specific
customerkeys only, we also can set custom queue lengths to be included and used to
test for failure.
Let's dive into the code taken from the full block shared previously to take a look,
as follows:

if (allTriggers) {
    var tsdArray = getTSDKeys(mid);
    var length = tsdArray.Results.length;
} else {
    var tsdArray = ["TriggerA","TriggerB","TriggerC","TriggerD","TriggerE"];
    // External Keys of the Triggers you want to check.
    var length = tsdArray.length;
    var maxQueueArray = [500,500,500,500,500];
    // Enter max queue here. Can make an array as well, if
    // different maxes per TSD
}

if (!maxQueueArray || maxQueueArray.length == 0) {
    var maxQueueDefault = 500; // Default for if not using
                               // maxQueueArray
}

As you may have noticed, there is a section after the allTriggers logic that focuses on
the maxQueue variables. This is because we want to have a default amount set so that if
there is no custom array of maximum number before being considered a failure, we still
want to have a value to use in our comparison.
Let's move forward and examine the for loop that is iterating through and gathering the
specific queue information.
for loop
This is the part where the baker takes all the ingredients and actually makes the pie. Very
exciting! As you can see here, the loop iterates through the tsdArray array set in the
allTriggers logic we explored previously, starting at the 0 index and moving up to
the total size of the array, all while incrementing by one for each run. There is a ton of
information here, so let's review the code before diving in deeper, as follows:

for (i = 0; i < length; i++) {
    if (allTriggers) {
        var customerKey = tsdArray.Results[i].CustomerKey;
    } else {
        var customerKey = tsdArray[i];
    }
    var queued = getTSDQueue(customerKey);
    var queueArrLength = maxQueueArray.length;
    // changes maxQueue to array value if one exists for index i
    if (maxQueueArray.length > 0 && i < maxQueueArray.length) {
        var maxQueue = maxQueueArray[i];
    } else {
        var maxQueue = maxQueueDefault;
    }
    if (queued > maxQueue) {
        //creates the failure object
        var obj = {};
        obj.customerkey = customerKey;
        obj.queue = queued;
        //pushes the failure obj to array
        alertArray.push(obj);
        failures += 1; //increases failure count
    }
}
customerKey value. This is because the two arrays are formed differently, so to account
for that, we need to specify them differently. After that, we grab the queued number from
the function and then examine it against the maxQueue value. Now, to get the maxQueue
value, we need to check whether it is a custom array or the default value; so, with a bit of
logic, we determine that and set the correct number.
We will then check whether the queued number we received is higher than the maxQueue
value, and if so, we then create a failure object that contains the customerKey value
as well as the queue count and then push this into our alertArray collection array. This
is then followed by increasing the count of failures by 1.
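Stripped of the platform calls, the decision logic of this loop can be exercised as plain JavaScript. The queue counts below are stubbed stand-ins for the WSProxy-backed getTSDQueue function:

```javascript
// Stubbed queue counts per triggered send, standing in for the
// WSProxy-backed getTSDQueue function.
var stubQueues = { TriggerA: 120, TriggerB: 900, TriggerC: 501 };
function getTSDQueue(customerKey) { return stubQueues[customerKey]; }

var tsdArray = ["TriggerA", "TriggerB", "TriggerC"];
var maxQueueArray = [500, 500, 500];
var maxQueueDefault = 500;
var alertArray = [];
var failures = 0;

for (var i = 0; i < tsdArray.length; i++) {
    var customerKey = tsdArray[i];
    var queued = getTSDQueue(customerKey);
    // Use the per-trigger max if one exists for this index,
    // otherwise fall back to the default.
    var maxQueue = (i < maxQueueArray.length) ? maxQueueArray[i] : maxQueueDefault;
    if (queued > maxQueue) {
        alertArray.push({ customerkey: customerKey, queue: queued });
        failures += 1;
    }
}
// failures → 2 (TriggerB and TriggerC are both above 500)
```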
Finally, we will explore the last piece of the code—the upsert.

The upsert
This one, as with the global variable sections, is small, but vital and powerful. Without this
section of code, none of the rest would mean anything. This is where we actually take the
information we have gathered and push it into a log DE that allows us to view it.
Let's take a look at the code, as follows:

if (failures > 0) {
    // Upserts into log DE
    var rows = Platform.Function.UpsertData("myQueryQueue_LogDE",
        ["MID", "TimeStamp"], [mid, getDate()],
        ["QueueAlertArrayStr"], [Stringify(alertArray)]);
}
the information into a DE. As to why logic is needed, well—if none of the triggers failed,
there really is no reason to log this and potentially bog down your DE; instead, it's better
to just assume that if there is no entry, then that means it passed. This is why it looks to
see whether failures is above 0, as that value only increases if something is added to the
alertArray array, as we saw previously.
We then use the UpsertData platform function to take the information we gathered and
push a stringified version of the alertArray array into the DE. This allows you to view
the values as well as to store the array in a JSON format so that you can grab it, parse it,
and then use it in another context if needed.","Summary
Well, that's it for the general overview. That is not to say our journey into Script activities
and SSJS is done. Far from it. Like, super far from it!
In this chapter, we went over what a Script activity is and how it interacts with SSJS and
AMPscript, which helped us to understand the basics of syntax on a Script activity in
the Marketing Cloud. We followed this with an in-depth overview of SSJS in Marketing
Cloud, which helped to give us the background inside of SSJS that we needed in order
to learn about the capabilities of a Script activity. From there, we began to dive into the
amazing capabilities that you have when using SSJS in a Script activity, such as utilizing
Core functions to automate some repeatable tasks via scripts. Then, we ended our chapter
with a real-life example of what a Script activity can be and how it was built.
In the next chapter, we will be diving into API capabilities in SSJS as well as Script
activities in general. This is where we will dive into WSProxy, what the SOAP API is, what
the REST API is, and how to use each of them. I know I said SSJS was my favorite, but
I think SSJS combined with APIs is my real favorite. Are you as excited as me? Let's move
on to the next chapter and see!
\f 7
The Power of
In-Step APIs
Now that you've been introduced to the magic of Script activities, we wanted to bring
you to the next level up – in-step API execution. What exactly is in-step application
programming interface (API) execution? For us to define that and how to utilize it to
make your Marketing Cloud capabilities so much more powerful, we need to cover the
following topics:

 • What are APIs? A very quick general overview of APIs.
 • The REST and SOAP APIs in Marketing Cloud: An overview of the two types of API (REST and SOAP) that are available inside Marketing Cloud.
 • How are the native scripting functions and WSProxy best used in the platform? Here, we will look at some of the simpler, native functions in the platform around API calls, as well as at utilizing WSProxy, which is one of the easier ways to handle in-platform API calls.
 • How can we utilize the Script.Util functions to make REST API calls? Here, we will look at ways to go outside the limits of the native functions and utilize all the available methods when you're making your API calls, both internal and external.
 • The capabilities and power of sending API to an external service: Although there is
a ton of internal capabilities that APIs unlock, they also open up the possibility for
integrations and external influence on the platform.
• How does this all come together? In this section, we will take a deep dive into why all
of this is important and powerful.","The APIs in Marketing Cloud open up doors that you may not have even known existed.
I can equate the difference in the capability to that of going from AMPscript to server-side
JavaScript (SSJS). Sure, AMPscript is powerful and performant, but SSJS can do a ton
more. Yet even SSJS is limited compared to what you can retrieve or accomplish
with API calls. Even just using WSProxy opens up all kinds of new possibilities.
This is a good segue into my first introductory topic – what are APIs?","What are APIs?","So, without going into the heavy technical aspects, APIs are a way for applications to","exchange data and functionality. A good example is a telephone call. Let's imagine that","we have Joe and Judith who are best friends and looking to get together to hang out later","today. Now, neither one of them has talked yet to see whether the other is available to play,","so the first thing they need to do is get in contact with each other.","For them to communicate and make a plan, they need to talk. To do that, Joe calls Judith","on her phone. Judith then picks up the phone, recognizing it is Joe, and says hi and waits","to hear Joe respond on the other end. After they say their salutations, they work to catch","up on how each of them has been doing since the last time they talked. Then, they get to","the crux of the call: Joe brings up when Judith will be available to hang out with him at","his place. Judith says she is free to come over in 30 minutes. From there, they both say","goodbye and hang up.","This was just a perfect example of an API call. How? Let's take a look:"," 1. The first thing they do is get on the phone. This opens a line of communication
between them – think of a server initiating a request to send across.
2. From there, they chat and do a handshake to authenticate they are who they say
they are (they recognize their voices, call from a specific number, state their names,
and similar indicators). This is the validation and authentication to ensure it is
a secure process and the correct service.
3. After this, information is transferred (talking about their day, discussing when
to hang out, what they want to play, and so on). This is where the payload of the
request is shared with the context of the receiving service.
4. Next, we have the transaction, where the time and location of the meeting are
decided. Now that all the necessary information and context have been shared
between the services, the action that was requested is returned with a response.
5. Finally, they finish communicating and continue separately. After the response, the
services disconnect from each other as the transaction is complete.","At this point, we have a general idea of what it is and how it works, but why would
we need to have API calls? API calls allow never-before-seen levels of access across related
resources while also retaining things such as security and control. Through this and the
speed they offer, APIs have become a very valuable and almost indispensable part of
modern business. To understand why we need to have API calls, I want to go over some of
the major benefits they offer:"," • Improved connection: With the massive amounts of cloud applications, servers,"," and services out there, creating a bridge to connect these allows for better"," integration so that each disconnected application can seamlessly communicate with"," each other."," • Robust data capabilities: With the strength, speed, and flexibility that is offered by"," APIs, you can take significant amounts of data and transfer it into new systems"," or applications at a volume and speed that is not likely to be matched elsewhere."," It also takes the data and reduces the number of contexts it has to shift through,"," including formatting as a data file (CSV, TXT, XLS, and so on) and then transferring"," that file through a couple of FTP environments and import/transfer processes."," • Protection and security: This is probably the biggest selling point of APIs. The level"," of encryption, protection, authentication, and more that you can utilize in APIs to"," protect your data is quite astounding. As with all things, this is not irrefutable, but"," it can provide stronger protection and accurate transfers compared to most of the"," other available options."," • Increases the path to innovations: With the more connections and more paths"," available, the more options and new architecture and processes we can build. This"," allows us to explore and innovate to find new best practices and custom solutions.","There are a ton of things surrounding APIs and the technical processes, procedures, and
methods, but a good portion of that is not relevant to our discussion around API usage in
Salesforce Marketing Cloud. To that extent, we are going to explain the two different types
of API that are available in Marketing Cloud.
The REST and SOAP APIs
Some of the Marketing Cloud developers that have been in the platform for many years may remember that there was a third option for API in Marketing Cloud – XML. Now, rightly so, when I say that, you should shudder with horror as it was not only one of the most antiquated and overly complex API types, but it was, by far, a security nightmare. Thankfully, this option has been retired and is no longer available.
Now, we are left with just two different types of API in Marketing Cloud – SOAP and REST. Let's take a moment to introduce what each is. Note that we will only be going over how each relates to Marketing Cloud via the HTTP protocol – there is a lot beyond what each type can do:

 • SOAP API: SOAP stands for Simple Object Access Protocol, which is a messaging
standard protocol system. SOAP utilizes XML to declare its messages and relies on
XML schemas and other technologies concerning structure.
• REST API: REST stands for REpresentational State Transfer, which is a set
of architectural constraints, not a protocol or standard. This allows for easier
implementation and usage.","Inside Marketing Cloud, these are mostly different in terms of their capabilities, with
the REST API being the one that interacts with the more current and new functions and
capabilities, whereas the SOAP API is more in line with the classic ExactTarget age
stuff. Before we dive too deep into these APIs and how they work inside Marketing Cloud,
let's look at the authentication that's required for interacting with Marketing Cloud.","Authentication
In the past, there were three ways you could authenticate an API call in Marketing Cloud:"," • Username/password
• Legacy package OAuth
• OAuth 2.0 enhanced packages","Now, although more options is usually a good thing, in this case, the options were leaving
open security risks and causing issues with the scope of permissions and capabilities.
Due to these security risks, the first two options have been severely limited or completely
turned off. Let's dig into the first option.
Username and password authentication
The username and password authentication method was only available in the SOAP API. It was not a possibility for the REST API. This was one of the original authentication methods and was based on the security protocols and protections of many years ago.
This option sounds like it should be secure, right? I mean, that is how you log in to most websites, including bank accounts and other highly secure environments. Well, in those cases, this is usually because as the information is being transported over a call to the other secure environments, it is encrypted and obfuscated in a way that makes it hard for anyone to gain access to this information.
However, since the username and password were passed inside the XML body of the SOAP call, a lot of that capability is not possible, which means it is much easier for malicious people to gain access to them. Not only this, but if someone were to take this information, they would not only be able to make API calls but also log in to your Marketing Cloud account and access the UI.
We can see a big risk here, right? As the world has become more and more sophisticated and malicious people have kept getting smarter and more capable, the less secure this has become. This is why the username and password authentication option is no longer allowed to be utilized without special permissions. Next, we will look at the original authentication method for the REST API.

Legacy package OAuth token
To be fair, when this came out, it was not named Legacy – it was just the package OAuth token authentication process.
The Legacy label has been added to differentiate between this process and the new OAuth token process.
This process was much more secure than the username and password process as it utilized a specially made package that contained clientId and clientSecret to act as the username and password for authentication. You would take these values and pass them inside a JSON payload to the authentication endpoint, which then returns an access token that is used for the following calls.
As we mentioned previously, the authentication call is completely separate, so the ID and secret are not passed inside every call. Reducing the number of times they are transmitted reduces the opportunity and risk of a malicious person being able to steal this information. Now, you may be wondering: that protects the ID and secret, but what about the token? Well, to limit its viability, the token is only valid for a limited window (around 60 minutes). After that, the token is expired and is no longer viable.
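The shape of that legacy exchange can be sketched as follows. The endpoint shown is the v1 requestToken route these packages used, and the credential values are placeholders (nothing is actually sent here):

```javascript
// Legacy package token request, sketched without actually sending it.
// The credentials below are illustrative placeholders.
var tokenRequest = {
    url: "https://auth.exacttargetapis.com/v1/requestToken",
    method: "POST",
    contentType: "application/json",
    payload: {
        clientId: "<your client id>",
        clientSecret: "<your client secret>"
    }
};

// A successful response returns a short-lived access token that is then
// attached to the subsequent API calls.
var payloadJson = JSON.stringify(tokenRequest.payload);
```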
considered a legacy for a reason. There are a few different risks and security holes inside
of this that are taken care of with the next option.","OAuth 2.0 enhanced packages","Over the years, security and protection became a higher priority as the malicious few kept","getting more and more sophisticated, and more businesses were moving over to a digital","format. This combination opens up more targets with higher values, which makes it more","tempting for those talented people to turn to malicious means to get rich quickly.","To help limit some of the new threats and better lower risks, Marketing Cloud introduced","a new process for creating OAuth tokens in its packages. This new enhanced package also","introduced the concept of tenant-specific endpoints for utilizing APIs.","Tenant-specific endpoints
Before this, you would use a generic API endpoint that was only different based on the
stack you were on. A stack in Marketing Cloud meant a server stack or the location of
the servers that were hosting your instance of Salesforce Marketing Cloud. Now, this
is dangerous as it means all those who have an instance in that stack come in at the
same place – they just use a different key to open their specific door. This gets someone
malicious most of the way to being able to gain access to someone else's stuff.
For example, let's view this like safety deposit boxes inside a bank, but at a place where
millions, if not billions, of dollars could be stored inside some of the boxes. You have
a box, so you can get through the door and into the room that contains all the boxes. Now,
all you need to do is covertly start testing keys until you find the ones that open the other
boxes. Doing this gives you unfettered access to all kinds of important information, as
well as the capability to completely destroy the entire account and be able to
do so pretty much undetected. Next, we will go over the permission scope capabilities","Integration type and permission scope","Although technically two separate aspects, they were similar enough to combine into","a single section. First, let's look at what an integration type is. The integration options in","Marketing Cloud are as follows:"," • Server-to-Server: This is a direct connection through two secure environments."," • Web App: This can be described as a middleware solution that's sitting in a more"," public setting but still able to store a clientSecret value."," • Public App: This is almost completely publicly facing and cannot store anything"," that can't be viewed by a user.","\f The REST and SOAP APIs 201","As you go through this list, the process to get an authentication token becomes harder and
more obfuscated. This includes an extra endpoint being required for the web and public
app to validate authorization of your application before looking for an access token.
This layer, on top of tenant-specific endpoints, gives many awesome levels of security
that bring Marketing Cloud up to the level of protection it should be at. But that isn't all.
From there, they also included the capability to set permission scope – not just on the
package, but also on the call. So, by doing this, you can have a package with unfettered
access, but have the call you are making to get an access token limit that access to just the
specific need. This will mean that if someone steals that token, they will only have partial
access to do anything until the token times out. Speaking of timeouts, they have shortened
this window down to 20 minutes to lessen the timeframe where someone can utilize any
ill-gotten tokens.
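To make the enhanced flow a bit more concrete, here is a minimal sketch of what a v2 token request might look like. Everything here is a placeholder – the subdomain, client ID, secret, and scope values are invented for illustration, not taken from a real account:

```javascript
// Sketch of a v2 (enhanced package) token request.
// The subdomain, client ID, secret, and scopes are placeholders.
var subdomain = "mc1234567890";
var authUrl = "https://" + subdomain + ".auth.marketingcloudapis.com/v2/token";

var payload = {
  grant_type: "client_credentials",
  client_id: "YOUR_CLIENT_ID",
  client_secret: "YOUR_CLIENT_SECRET",
  // Optionally narrow what the returned token may do,
  // even if the package itself has broader permissions.
  scope: "email_read email_send"
};

console.log(authUrl);
console.log(JSON.stringify(payload));
```

Note how the tenant-specific subdomain is baked into the URL itself and how the scope can be narrower than the package's permissions, which is the behavior described above.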
Now that we have a good understanding of the authentication capabilities, let's explore the
SOAP API inside Marketing Cloud.","The SOAP API in Marketing Cloud","The SOAP API in Marketing Cloud only interacts with things you would find inside","Automation Studio and Email Studio. The other Studio platforms will require you to use","the REST API. Now, although it is limited to just these two, this does not mean that the","SOAP API is not robust or powerful! Email Studio includes things such as Triggered","Emails, Data Extensions, All Subscribers, Email Sends, and Lists, while Automation","Studio includes things such as Automations, SQL Query activities, and Filter activities.","The SOAP API is designed to allow access to not just the capabilities of the UI, but also the","capabilities of administration and tracking, as well as data extraction. By using methods","and objects, the SOAP API can interact with and integrate with Marketing Cloud.","Methods
The first part of exploring what the SOAP API can do in Marketing Cloud is to explore the
available methods. Unlike the REST API, SOAP is only sent across the HTTP method of
POST. To help differentiate this action, methods are passed inside the envelope or header
that can show the specified action request.
To be honest, I have not even used half of these methods to perform my SOAP API calls,
and I have been working with the API for years. To that extent, do not get overwhelmed
by this list of possible methods as only around four or five of them are used regularly:
• Configure
• Create
• Delete
• Describe
• Execute
• Extract
• GetSystemStatus
• Perform
• Query
• Retrieve
• Schedule
• Update
• VersionInfo","Out of this list, the ones to pay the most attention to are Create, Delete, Describe,
Retrieve, and Update. Although some of the others, such as Perform, Configure,
and Execute, do have capabilities tied to them, they are not utilized very often, so you
will not need them as much.
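To make the earlier envelope idea concrete, the following is a rough sketch of how a Retrieve request travels in the body of a POST. The object type and properties are illustrative only, and a production envelope would also carry a security header with your token:

```javascript
// Rough sketch of a SOAP Retrieve request body. The HTTP method is
// always POST; the SOAP method travels inside the envelope itself.
// The object type and properties here are illustrative.
var soapBody =
  '<s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/">' +
  '<s:Body>' +
  '<RetrieveRequestMsg xmlns="http://exacttarget.com/wsdl/partnerAPI">' +
  '<RetrieveRequest>' +
  '<ObjectType>DataExtension</ObjectType>' +
  '<Properties>Name</Properties>' +
  '<Properties>CustomerKey</Properties>' +
  '</RetrieveRequest>' +
  '</RetrieveRequestMsg>' +
  '</s:Body>' +
  '</s:Envelope>';

console.log(soapBody.indexOf("RetrieveRequestMsg") > -1);
```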
Each method has specific parameters and responses that are included with it. For this
API call to be validated, the envelope must meet these required parameters. Next, we will
provide a quick overview of the objects in the Marketing Cloud SOAP API.","Objects
Methods are the action part and the request aspect, while objects are the targets that hold
all the information. These objects contain all the information or functionality to complete
or return the action or data you need.
There are a plethora of SOAP objects available in Marketing Cloud. I will not bore you by
listing every single one as there are legitimately around 262 documented ones and at least
a dozen more undocumented ones as well. Instead, I am going to group these objects into
families and provide a general overview of the possibilities.
Account
This family is related to the actual account level of your Marketing Cloud environment,
meaning you can create users, get business unit names or IDs, and so on.","Automation
This family is related to the automation processes inside Automation Studio. From here,
you can get the overall status, per-run stats, an overview of the automation process, as well
as interact, start, pause, edit, and create automation processes.","Automation activities
This family is related to activities inside Automation Studio (Extracts, File Transfer,
Filter, Imports, Query Activity, and others) and allows you to interact with these
capabilities.","Data extension
This family is related to data extensions, giving you full access to retrieve, create, delete,
and more. This includes at a subscriber or row level, field level, or even overall data
extension object level.","Emails
This family is related to all things regarding email messaging. This includes sending and
creating send definitions, scheduling, and more.","Events
This family is related to the tracking events objects. This includes Click, Open, Sent,
NotSent, Unsubscribe, and more. These objects tend to be similar to the data views
or tracking extract options.","Profiles and classifications","This family is related to the definitions of the sender profiles and classifications that are","used when sending messages. You can create, edit, delete, or otherwise manipulate","these definitions.","Subscribers and lists","This family is related to the subscriber model in Marketing Cloud. This model includes","lists. Through this family, you can interact at a macro or micro (row) level through","creation, deletion, updates, and so on.","\f204 The Power of In-Step APIs","Suppression and unsubscribes","This family is related to the suppression and unsubscribing process. This family will give","you control over ensuring those that you do not want to send to are not sent to.","Triggered sends
This family is related to the triggered send capabilities in Marketing Cloud. I have this
separate from the email family as this is focused on being utilized by APIs for a 1:1
real-time delivery, while the other is more focused on batch and scheduled sends.
Now that we have a good overview of the SOAP API's methods and objects, we can get
a good feeling for what it is capable of. Now, let's move on and explore the REST API's
capabilities in Marketing Cloud.","The REST API in Marketing Cloud","The REST API in Marketing Cloud is very powerful and has been designed to interact","with all of the newer Studio platforms. This includes, but is not limited to, Journey Builder,","Mobile Studio, Transactional Messaging, and Content Builder. The REST API utilizes","a method combined with a specific URL, instead of an object. Although they both use","methods, they use different types of methods. Let's take a look at the methods in the","REST API.","Method of REST (API)","Like the SOAP API, REST utilizes different methods to determine the type of action","that's performed. The difference is that the REST API utilizes HTTP methods, such as","the following:"," • PATCH: This sends an encrypted payload to interact with the target for the partial
update option only.
• POST: This sends an encrypted payload to interact with the target for the new
creation option only.
• GET: You can use this to retrieve data or information. No body is passed in the
request but it can pass parameters in the URL.
• DELETE: This sends an encrypted payload to interact with the target for the
deletion option only.
• PUT: This sends an encrypted payload to interact with the target for overwrite
updates, but if that object does not exist, then PUT may create it instead.
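As a quick hypothetical illustration of the pattern (the subdomain and asset ID here are made up), this is how those methods might map onto a single Content Builder asset resource:

```javascript
// Sketch of how the same REST resource is driven by different HTTP
// methods. The subdomain and asset ID are placeholder values.
var base = "https://mc1234567890.rest.marketingcloudapis.com";
var assetUrl = base + "/asset/v1/content/assets";

var calls = [
  { method: "POST",   url: assetUrl,            note: "create a new asset" },
  { method: "GET",    url: assetUrl + "/12345", note: "retrieve asset 12345" },
  { method: "PATCH",  url: assetUrl + "/12345", note: "partially update it" },
  { method: "PUT",    url: assetUrl + "/12345", note: "overwrite it" },
  { method: "DELETE", url: assetUrl + "/12345", note: "delete it" }
];

calls.forEach(function (c) {
  console.log(c.method + " " + c.url + "  // " + c.note);
});
```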
As you can see, these methods are very different from the SOAP methods. In general,
the actions are the same, just simplified down to the rawest level. Now that we know the
methods, let's explore the endpoints that the methods interact with.","The end(point) of your REST (API)","Similar to how SOAP has a ton of different objects, REST has a ton of different options","for endpoints. To help make things easier, what I am going to do is provide a list of the","different groupings that they have in the official documents and then list each of the","endpoint groups, including some undocumented ones for reference. First, let's explore","the different endpoint families that are listed in the documentation, starting with","Content Builder.","Content Builder API","The great thing about Content Builder is that it allows you to use content across all","different mediums and channels, which means that there is a reduction in duplication","or multiple instances of content or assets. The term asset is what Marketing Cloud uses","to refer to all the different types of content stored inside Content Builder.","The reason all of this is relevant is that through this API, you can interact with each of","these assets and fully automate your content management inside Marketing Cloud. This","API allows you to create, update, delete, retrieve, query, and even publish assets. It is by","far one of the most refined and robust API endpoint families available in Marketing Cloud","and one of my favorites to work with. Next, we will explore the Journey Builder API","endpoint family.","Journey Builder API","Journey Builder is heralded and lauded as the future of Marketing Cloud and with the vast","improvements that have been made over the years, it certainly seems to be true enough.","Journey Builder is a strong one-to-one messaging service with a focus on customizing and","personalizing messaging across multiple channels.","The API family allows you to interact with Journey Builder in quite amazing ways. 
You","can interact with almost every piece of a journey through the API, including building","a journey, firing events, pausing the journey, and accessing analytics and reporting.","Although this is not as refined as what is available in Content Builder, it is still a very","strong and awesome family of endpoints.","\f206 The Power of In-Step APIs","GroupConnect Chat Messaging API","GroupConnect is something that I have not had a lot of experience with. I know that its","capabilities are hugely valuable and I have played with it quite a few times. But since it","provides more service-based marketing, it is not something that comes up very often for","your average client. GroupConnect is Marketing Cloud's integration with your Facebook","Messenger and LINE application. Much like Mobile Studio, there is some setup and","connection that needs to be made to integrate this capability.","The API endpoint family here seems to be very useful, with a few known limitations, and","from my experience in it, it was pretty easy to use. This API family allows you to automate","these messaging channels, similar to email or SMS, by taking data from the system and","placing it in a templated message that is then personalized with that information and sent","to the user. This is sort of like if a shipment has gone out or a payment has been received.","MobileConnect API
Now, this family is defined as just MobileConnect in the documentation, but I like to
include MobilePush inside of it as well. They are both in the same studio in Marketing
Cloud, so it would only make sense to combine them here. MobileConnect is focused
around the SMS message channel, whereas MobilePush is focused on app-based
push notifications.
The APIs here provide nearly full integration with your Mobile Studio capabilities in both
applications. There is even a software development kit (SDK) for a few languages that are
available for these APIs. Although I highly prefer the Content Builder API family, this one
is highly robust and refined. It also offers a plethora of capabilities.","Transaction Messaging API","The Transaction Messaging API family is quite astonishing. It introduces a level of","flexibility, speed, and real-time interaction that is unmatched by any other process in","the application. Transactional Messaging is essentially a transactional, non-commercial","messaging service that strips all the unnecessary processing and procedures, such as","suppression or priority level, to ensure as fast and efficient a messaging process as possible.","The other awesome thing this family opens up is the Event Notification Service options.","This is something that allows an unprecedented level of immediate notifications and alerts","around events in Marketing Cloud.","\f The REST and SOAP APIs 207","Event Notification Service","Right now, this service is only applicable to the Transactional Messaging API family,","both email and SMS, but it is incredibly powerful and useful for those who need to make","real-time decisions to alert and to receive. These alerts are provided via a webhook,","which is a listening service that is hosted on a web service – this cannot be hosted in","a CloudPage."," Einstein Recommendations API"," As you may have noticed, I did not mention this endpoint family. This is"," because these endpoints exist in the igodigital.com domain and not"," the same Marketing Cloud domain as the rest, so I consider it to be an external"," API. Since I am concentrating on just internal API endpoints, I have not"," included it here. The igodigital API is a very niche group of endpoints, so there"," is not much general usage I could share in this book around it.","Endpoint groups
Now that we have explored the API families, I am going to share the different endpoint
groups that are available at the time of writing. For each, I will specify their names,
discovery endpoints, and provide brief descriptions:"," • Address: /address/v1/rest (mostly documented) is pretty much used only for"," validating email and phone numbers in Marketing Cloud."," • Asset: /asset/v1/rest (mostly documented) interacts with Content Builder"," and all of its assets."," • Auth v1: https://{{subdomain}}.auth.marketingcloudapis.com/"," v1/rest (mostly documented) groups legacy authentication endpoints."," • Auth v2: https://{{subdomain}}.auth.marketingcloudapis.com/"," v2/rest (mostly documented) groups new authentication endpoints."," • Automation: /automation/v1/rest (undocumented) groups endpoints related"," to Automation Studio, including activities and automation objects."," • Contacts: /contacts/v1/rest (partially documented) focuses on Contact"," Builder capabilities and interactions."," • Data: /data/v1/rest (partially documented) focuses on Salesforce data, data"," extensions, contacts, attribute groups and more.","\f208 The Power of In-Step APIs"," • Email: /email/v1/rest (partially documented) concentrates on lists, filters,"," and subscribers, and not really on emails."," • Guide: /guide/v1/rest (partially documented) focuses on Email Studio"," capabilities and lets you gain access to some things that normally were only"," accessible via the SOAP API."," • Hub: /hub/v1/rest (partially documented) is a fairly widespread grouping."," It covers areas such as Campaigns, Contacts, Attribute Groups, Data Extensions,"," Email Previews, Tags, and more."," • Interaction: /interaction/v1/rest (mostly documented) is an endpoint that"," is completely focused on Journey Builder capabilities."," • Legacy: /legacy/v1/rest (undocumented) is a treasure trove, but I would"," never look to use any of these in anything related to production. I would list this as"," high-risk, unstable, and confusing. This group also has some endpoints that require"," internal authentication to work – so, some of the endpoints are useless to us. 
That"," being said, there are so many amazing and useful endpoints in this group to explore."," • Messaging: /messaging/v1/rest (partially undocumented) is tied to the"," messaging options in Marketing Cloud. It includes capabilities to create, send,"," and track message definitions, including Email, MobileConnect, MobilePush,"," and GroupConnect, as well as some RMM interaction."," • Platform: /platform/v1/rest (partially undocumented) is mostly focused"," on account settings, token and endpoint settings, subscriptions, apps, setup, and"," Audience Builder."," • Push: /push/v1/rest (partially undocumented) is related to actions and"," capabilities in MobilePush."," • SMS: /sms/v1/rest (partially undocumented) is related to SMS sends and"," MobileConnect.","Now, to be fair, there are a couple of other experimental and internal endpoints that
I could list, but these will be of no real use for you as they either require authentication
beyond what an end user can gather or are so highly unstable that it's a risk to use them
for anything at all. With this background, we can see the huge benefit of utilizing these
API capabilities directly inside Marketing Cloud through CloudPages or Script activities.
Now, let's start looking at how we can utilize these API calls inside Marketing Cloud.
\f SFMC native functions and WSProxy 209","SFMC native functions and WSProxy","Let's look at the functions and capabilities we can use to call these API calls. Now, the","good news is that there is capability in both languages to do this. But I would very","highly recommend doing most, if not all of your API calls, inside SSJS as the capabilities,","functions, error handling, and parsing are so much stronger and more reliable than","AMPscript for this. That being said, let's explore the native functions that are built in with","an explicit HTTP method.","AMPscript
In AMPscript, there is a built-in capability to natively handle SOAP API sends (internal).
For the REST API, you can use the HTTP functions to make those calls. Outside of that,
the API capability is near non-existent as this is not the purpose that AMPscript was built
for. For now, let's dig into the functions and get a feel for how they are best utilized.","SOAP API functions","These functions had to be created because AMPscript has no native way to handle arrays","or objects, which is a big part of how the SOAP API is built. These functions can be","confusing to jump into, but they are very singular in focus, so once you understand how","they work, it is very simple to master them.","This functionality allows you to create objects and arrays, as well as utilize each of the","SOAP methods, but it is limited to Create, Delete, Execute, Perform, Retrieve,","and Update. One thing to note is that these are very verbose, so what may be a quick","couple of lines elsewhere could take 4 or 5 times as many when you're using","these functions.","A great example of this is the following script, which is used to send a Triggered Send","Email using AMPscript and the SOAP API:"," %%[
SET @emailaddr = \"[email protected]\"
SET @subkey = 'Sub123456'
SET @tsObj = CreateObject(\"TriggeredSend\")
SET @tsDefObj = CreateObject(\"TriggeredSendDefinition\")
SET @tsSubObj = CreateObject(\"Subscriber\")
SetObjectProperty(@tsDefObj, \"CustomerKey\", \"MyTriggeredSend\")
SetObjectProperty(@tsObj, \"TriggeredSendDefinition\", @tsDefObj)
SetObjectProperty(@tsSubObj, \"EmailAddress\", @emailaddr)
 SetObjectProperty(@tsSubObj, "SubscriberKey", @subkey)
 AddObjectArrayItem(@tsObj, "Subscribers", @tsSubObj)
 SET @tsCode = InvokeCreate(@tsObj, @tsMsg, @error)
]%%

This script utilizes the InvokeCreate function, along with the Object and Array
functions, to build out a SOAP API call and then execute it inside AMPscript. Next,
we will dive into the HTTP functions that are mostly used for the REST API.","HTTP functions
Now, technically, you could use the POST functions for SOAP if you wanted, but it would
be a lot of work. In general, the HTTP functions in AMPscript are used for REST API
interaction. In general, these are pretty self-explanatory in terms of what they do as the
HTTP method they use is part of the function name. So, I won't go into too much detail
here on each. Here is a list of the options:
• HTTPGet
• HTTPPost
• HTTPPost2","As you can see, this is pretty self-explanatory, except for HTTPPost2. The major
difference is that HTTPPost2 allows you to return a result that is not 200 without
throwing an exception, whereas HTTPPost does not. It also has a parameter value where
you can set a variable to return the status of the HTTP request.
The following is a quick example of using the HTTPPost2 function to POST data to an
outside REST API endpoint:"," %%[
set @payload = '{
\"ID\":111213,
\"FirstName\":\"Gor\",
\"LastName\":\"Tonington\",
\"TotalAmt\":125
}'
 set @postrequest = HTTPPost2("https://myAPIURL.com", "application/json", @payload, true)
]%%

Now, I know this seems short, but that is because the native built-in capabilities of
AMPscript are very utilitarian and basic. This is not a bad thing – again, this kind of
scripting is not what the language was built to do. Concerning most needs around
messaging and content, these should allow you to make the necessary calls. Now,
let's explore SSJS and see whether that is any different in terms of native capabilities in
Marketing Cloud.","Server-side JavaScript
Note that I am not including special functions, such as WSProxy, as native capabilities and
instead concentrating on the Platform and Core capabilities. As we mentioned previously,
in SSJS, there are two different libraries with functions inside them. Each has different API
capabilities.","Platform
Platform is the default library for SSJS and, in that way, its native functions are very
similar to AMPscript's. The API functions are essentially identical to those in AMPscript.
There are some nuances in how each functions, but in general, they
are the same. The major difference is that SSJS does not have an HTTPPost2 function
natively. There are some awesome capabilities related to API in Platform, but as I stated
earlier, they are not what I would define as native. To that extent, I am going to move on
to Core.","Core
Core has a couple of capabilities, but it is mostly filled with specially made functions to
interact with and gather information from SOAP API objects without you having to make
the API call. This is the main function of the Core library. Outside that, it does not have
any way to interact with the SOAP API outside a completely manually built call that's
pushed inside of POST. Core only has two options when it comes to the REST API:
• HTTP.Get
• HTTP.Post","Each is a fairly simple version of the Platform or AMPscript versions. So, rather than go
into detail on these, I want to move on to what is, at least in my opinion, the best way to
interact with the SOAP API in-platform ... WSProxy!
WSProxy
WSProxy is a new functionality that was built into SSJS and is much more aligned with
the platform and simpler to use than the other SSJS methods. Note that WSProxy is native
to SSJS, but I did not include it as a native function because my intended meaning was
concerning the libraries, not just the language.
The difference in speed of this function in comparison to most other SOAP-related inline
capabilities is astounding. Not only that, but it natively uses JavaScript objects and JSON
for request and response, allowing for easy parsing and creation.
WSProxy allows you to create the arrays and objects that are needed by the SOAP XML
through JSON instead, which means that almost any SOAP API can be made using an
external service within the platform. The available SOAP actions are Create, Update,
Delete, Retrieve, Perform, Configure, Execute, and Describe.
Let's look at an example WSProxy call:

 // Creates new object(s) inside the identified SOAP object
 // example soapObjName: "DataExtension"
 // example contentJSON object: { "CustomerKey": custkey,
 //   "Name": name, "Fields": fields }
 // example contentJSON array: [{ "CustomerKey": custkey,
 //   "Name": name, "Fields": fields }, { "CustomerKey": custkey2,
 //   "Name": name2, "Fields": fields }]
 // to default a Date field (in a Data Extension) to the current
 //   date: { FieldType: "Date", Name: "Field2",
 //   DefaultValue: "getdate()" }
 function createGeneric(soapObjName, contentJSON, mid) {
     var prox = new Script.Util.WSProxy();

     if (mid) {
         prox.resetClientIds(); // reset previous settings
         prox.setClientId({ "ID": mid }); // set the business unit MID
     }

     var batch = isArray(contentJSON);

     if (batch) {
         var res = prox.createBatch(soapObjName, contentJSON);
     } else {
         var res = prox.createItem(soapObjName, contentJSON);
     }

     function isArray(arg) {
         return Object.prototype.toString.call(arg) ===
             '[object Array]';
     }

     return res;
 }

This code shows an example execution of WSProxy. Now, this code can be simplified to
fewer lines, but what I provided here was a general-use SSJS function to perform
a WSProxy create method on any object or input you want to push, including batch
creation or impersonation.
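To show how such a generalized function might be fed, here is a hypothetical contentJSON input for creating a data extension. The name, customer key, and field definitions are made up for illustration:

```javascript
// Hypothetical input for a call such as:
//   createGeneric("DataExtension", contentJSON);
// The name, customer key, and fields are illustrative only.
var contentJSON = {
  "Name": "My_Automation_Log",
  "CustomerKey": "My_Automation_Log",
  "Fields": [
    { "Name": "SubscriberKey", "FieldType": "Text", "MaxLength": 50,
      "IsPrimaryKey": true, "IsRequired": true },
    { "Name": "LogDate", "FieldType": "Date",
      "DefaultValue": "getdate()" }
  ]
};

console.log(contentJSON.Fields.length);
```

Passing an array of such objects instead would route the call through the batch branch of the function.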
Another great thing about WSProxy is that it utilizes your native authentication from your
user to make these calls, meaning you do not need to provide any authentication calls
or share any sensitive information when you're utilizing this function. Now, some may
say, But what if I want to interact with another business unit? Well, the good news is that
WSProxy allows you to impersonate different client IDs (requires permissions) and even
different users, but only from a parent to a child, not lateral or upward.
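As a sketch of the shapes involved (the object names and values here are illustrative), a retrieve through WSProxy works with plain JSON for both the filter and the response:

```javascript
// Illustrative filter shape for a WSProxy retrieve, following the
// common SFMC SOAP filter pattern (Property / SimpleOperator / Value).
var filter = {
  Property: "CustomerKey",
  SimpleOperator: "equals",
  Value: "My_Automation_Log"
};

// Inside SFMC, this would be used roughly like:
//   var prox = new Script.Util.WSProxy();
//   var res = prox.retrieve("DataExtension",
//       ["Name", "CustomerKey"], filter);

// A successful response comes back as plain JSON, for example:
var exampleResponse = {
  Status: "OK",
  Results: [{ Name: "My_Automation_Log",
              CustomerKey: "My_Automation_Log" }]
};

console.log(exampleResponse.Status);
```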
A good rule of thumb to use is that if you have to use the SOAP API inside Marketing
Cloud, then you should use WSProxy. As with all things, there are exceptions to that rule,
but most of the time, it is the correct choice to make. Speaking of the best ways to make
API calls, next, we will explore the Script.Util object.","Script.Util for the REST API","Now, as some of you may be saying to yourselves (or yelling it at this book), Hey! WSProxy","is based on the Script.Util object!, and you would be 100% right. That being said, though,","as we already went over WSProxy before this section, we will explore the rest of the object","as it relates to the REST API.","\f214 The Power of In-Step APIs","Even those who may be familiar with SSJS and the Platform library may be scratching
their heads and wondering what this crazy person is talking about. But I promise you it is
there in the documentation; it is just hidden inside the Content Syndication section. Inside
this section, three objects are listed:
• Script.Util.HttpResponse
• Script.Util.HttpRequest
• Script.Util.HttpGet","As these objects and functions are fairly unknown, I am going to be spending","a bit more time explaining them than I did the others. First, I will dig into the","HttpResponse object.","HttpResponse
This object is the return that's gathered from the send() method that's used in the other
two objects. Note that this cannot work independently of either of the other two objects.
As shown in the documentation, the following parameters are available for this object:

 • Content: A string value containing the HTTP response's content
 • ContentType: A string value indicating the content type that was returned
 • Encoding: A string value indicating the encoding that was returned
 • Headers: An object containing the HTTP response header collection that was returned
 • ReturnStatus: An integer value containing the Marketing Cloud response
 • StatusCode: An integer value containing the HTTP response status code that was returned
Content parameter is returned as a common language runtime (CLR) object, which is
not compatible with Marketing Cloud SSJS. So, to get that converted, you need to turn it
into a string data type and then turn it into JSON format for easier parsing.
Another consideration is that certain business rules can affect your ability to use this
object effectively. If you get errors such as Use of Common Language Runtime
(CLR) is not allowed, then you will want to talk to Salesforce Support and your
account representative to turn on any associated business rules related to CLR.
HttpGet
This object is used to perform GET from the specified URL and interact with HTTP
headers. Now, you may be wondering, how is this different from either of the other GET
functions in SSJS? Let me tell you: it is a good one. This function will cache content for
use in email sends. So, if you have the same content being pulled into an email, using this
function will provide the requested content more efficiently because when the second
email is processed at the end of the job, it will pull the content from the cache instead
of running the method again and hitting the URL.
This call is also set up quite a bit differently than the native functions. It is set up similar
to how you would fill any other object in SSJS: you add properties via dot notation.
This differs from the native functions, which take their parameters inside parentheses like
typical JavaScript functions. Due to this type of setup, it does not run when
you create the object – you need to utilize the send() method to execute the call.
Let's take a quick look at the methods that are available for HttpGet:"," • setHeader(): This adds a header to the GET request via name-value pairs.
Adding a header will disable content caching.
 • removeHeader(): This removes the named header from the GET request; it takes
a string value containing the name of the header to remove.
• clearHeader(): This removes all the custom headers from the GET request.
• send(): This performs the GET request.","There are also some additional properties that we can use with this object:"," • retries: This is an integer that defines the number of times we should retry the
call. The default is 1.
• continueOnError: This is a Boolean that's used to indicate whether the call
returns an exception or continues. The default is false (it returns an exception).
 • emptyContentHandling: This is an integer that defines how the return handles
empty content.

Now, let's look at an example of this code:

 var url = "https://{{et_subdomain}}.rest.marketingcloudapis.com/email/v1/rest";
 var req = new Script.Util.HttpGet(url);
 var resp = req.send();

\f216 The Power of In-Step APIs

As you can see, it's a fairly simple call to make. But just because it is simple does not mean
there are not some amazing custom things you can do with this call. The next object
we are going to discuss is one of my favorite capabilities in SSJS as it allows unparalleled
access to API capabilities.

HttpRequest
HttpRequest is my favorite SSJS function, followed closely by WSProxy. Yes, you
read that right – I like this more than the legendary WSProxy. This is because this is
the gateway for connecting my two favorite things in SFMC – SSJS and the REST API.
HttpRequest is a powerful method that allows you to make calls inside Marketing
Cloud with a plethora of options, including setting the appropriate methods beyond just
GET and POST.
This object and HttpGet, to some extent, are similar to the xmlhttprequest
object in client-side JavaScript. Now, although similar, there are a ton of capabilities in
xmlhttprequest that are not available inside HttpRequest. For instance, a lot of
client-side interaction with the call, such as event handlers like onreadystatechange, is not allowed.
That aside, the general layout and parameter requirements are similar, so it's a good
reference to help if you are having trouble getting a method or parameter to
function correctly. As we mentioned previously, though, a lot of capabilities are not
shared, so if you are looking for things that are not documented on HttpRequest,
please expect a lot of failures.
As we mentioned earlier, the major draw of utilizing this method is that it allows you
to access methods outside of GET or POST. This gives you full access to the Marketing
Cloud REST API library of endpoints within SSJS, removing the need to push these calls
to an outside service. Here is a list of methods that are acceptable to Script.Util.
HttpRequest:

 • GET
 • DELETE
 • HEAD
 • OPTIONS
 • PATCH
 • POST
 • PUT

As you may have noticed, these are all the available methods, meaning there is no real
limit to its capability there. Now, outside the HTTP method, the actual methods of this are
identical to those in HttpGet, as described earlier, so rather than duplicate, we can just
reference that previous section.
Another thing you may notice is that both HttpRequest and WSProxy seem to stem
from the same object – Script.Util. To this extent, I like to look at HttpRequest
as one of the precursors to WSProxy that helped pave the way for that amazing capability.
Now, that being said, I do feel these are two very different capabilities and are not
interchangeable, nor do they affect the effectiveness of each other. I view HttpRequest
as a way to accomplish REST API calls; WSProxy does this for SOAP.
Unfortunately, HttpRequest is not as optimized as WSProxy, so it is not always
as efficient. This means that it can run into issues when you're handling large volumes
or frequency due to timeouts or similar errors. But since you should not be doing any
heavy processing like this within Marketing Cloud, this should not come into play most
of the time.
Now that we have covered the history, let's explore a sample of what it looks like
(based on a Marketing Cloud REST call to the Content Builder API):

Now that we've pulled in the JSON data and have it ready for processing, we need to
assign the relevant data points we highlighted in the payload to variables that we can
utilize for further processing.
\f An event-based example 357

Steps 2 and 3 – parsing the added array and retrieving the contents
Now, we'll grab the contents_url parameter from the payload. Notice, in our example,
the value in the payload is appended with the {+path} substring. We'll want to remove
this portion from our variable as it's not relevant for pulling the final path to the files that
we wish to retrieve. Finally, we'll also grab the added array from the commits property so
that we can iterate through each added file and retrieve its contents:

 var baseContentsURL = json.repository.contents_url;
 baseContentsURL = baseContentsURL.slice(0, baseContentsURL.lastIndexOf('/') + 1);
 var addedFilesInCommit = json.commits[0].added;

That's all we need in order to accomplish the aforementioned items, and we now have our
variables assigned for the base content path URL as well as our added array. With that
in hand, we need to write our function to call the GitHub REST API to return the raw
contents of our newly pushed files. Let's take a look at what that script looks like and then
break down its components a little further:

 function getRawGithubData(assetPath, contentURL) {
   var accessToken = "YOUR GITHUB ACCESS TOKEN";
   var auth = 'token ' + accessToken;
   var url = contentURL + assetPath;
   var req = new Script.Util.HttpRequest(url);
   req.emptyContentHandling = 0;
   req.retries = 2;
   req.continueOnError = true;
   req.contentType = "application/json";
   req.setHeader("Authorization", auth);
   req.setHeader("user-agent", "marketing-cloud");
   req.setHeader("Accept", "application/vnd.github.VERSION.raw");
   req.method = "GET";
   var resp = req.send();
   var resultString = String(resp.content);
   return resultString;
 }

\f358 Webhooks and Microservices

As you can see here, we are using Script.Util in order to make a GET API request
to GitHub to retrieve our file content. To make this request, we'll need our function to
accept parameters for contentURL, which we assigned to a variable and formatted in
the previous step, and the path of the file that we'll pull from our added array assigned
previously as well. Before we can complete our API call, we'll need to further define the
following items in our request:

 • Authorization header: This allows us to authenticate our call into the GitHub API
to confirm that only we can retrieve the data relevant to an individual file. For this
header, we'll simply need to concatenate token followed by the GitHub personal
access token that we created and saved in the GitHub configuration portion of
this example.
• User-agent header: A user-agent header is a requirement on GitHub REST API
calls, so we'll have to pass a value for this header in our API call for it to function.
The exact value doesn't matter, but it should reflect the platform and purpose of
the call we are planning to execute. For our purposes here, we'll set this
value to marketing-cloud.
• Accept header: We will specify this header to let GitHub know that we want to
return the raw data of the file in the request-response. This allows us to utilize the
exact contents of the file without any further processing or decoding on our end.

That's all that we need to define to make our request in GitHub in order to retrieve the
file contents of whatever asset path we pass into this function. We'll make our request
and return the content of that request as an output of the function so that we are able to
retrieve the file contents and upload the asset to Marketing Cloud. With our function set
up to retrieve the contents of the files added during the commit, we'll now need to write
our function that writes this content to Marketing Cloud.

Step 4 – creating new content
While we could utilize several methods in order to create this content, such as the
Content Builder REST API, for ease of use (and to save us from setting up packages and
authenticating into Marketing Cloud), we'll use a platform function approach to creating
this content. Before we dive in, it's important to note that the documentation outlining
the possible routes and functionality within Content Builder can be found in the official
documentation located here: https://developer.salesforce.com/docs/
marketing/marketing-cloud/guide/content-api.html.

Let's take a look at what that function looks like before outlining what's going on:

 function createAsset(assetName, assetContent, assetId,
     assetCategoryId) {
   var asset = Platform.Function.CreateObject("Asset");
   var nameIdReference =
     Platform.Function.CreateObject("nameIdReference");
   Platform.Function.SetObjectProperty(nameIdReference,
     "Id", assetId);
   Platform.Function.SetObjectProperty(asset, "AssetType",
     nameIdReference);
   var categoryNameIdReference = Platform.Function
     .CreateObject("categoryNameIdReference");
   Platform.Function.SetObjectProperty(
     categoryNameIdReference, "Id", assetCategoryId);
   Platform.Function.SetObjectProperty(asset, "Category",
     categoryNameIdReference);
   Platform.Function.SetObjectProperty(asset, "Name",
     assetName);
   Platform.Function.SetObjectProperty(asset, "Content",
     assetContent);
   Platform.Function.SetObjectProperty(asset,
     "ContentType", "application/json");
   var statusAndRequest = [0, 0];
   var response = Platform.Function.InvokeCreate(asset,
     statusAndRequest, null);
   return response;
 }

Here, we are outlining a function called createAsset that will take some parameters
and utilize them to actually create our code snippet content block within Marketing
Cloud. Our function should accept parameters for the following properties of our Content
Builder asset:

 • Asset type ID
 • Category/folder ID
 • Name
 • Content

First, we'll need to define the asset type that our content belongs to. While we have written
our function to make this process generic, we could also hardcode it directly if we are only
utilizing this webhook to process data for a given type. Here, we'll let the function take it
as a parameter and assign the type ID according to that. Next, we'll need to retrieve the
categoryId parameter and define that value for the Category ID property of our
asset initialization. This ID will specify exactly what folder we wish to insert this asset
into. Finally, we'll grab both the asset name and content parameters and then assign them
accordingly to our asset object. Then, our function will create the asset with the values
defined previously and insert this content into the specified folder within Content Builder.
Now, all that we need to do is iterate through the added items in the GitHub JSON
payload and invoke the preceding two functions to retrieve the content and create it in
Marketing Cloud:

 for (var i in addedFilesInCommit) {
   var assetPath = addedFilesInCommit[i];
   var categoryId = assetPath.substring(0,
     assetPath.indexOf("/"));
   var contentName =
     assetPath.split("/").pop().replace(".html", "");
   var contentData = getRawGithubData(assetPath,
     baseContentsURL);
   createAsset(contentName, contentData, 220, categoryId);
 }

Notice here, we are iterating through each item in the array and then assigning an
assetPath parameter that will equal the path of the file that has been pushed to
our GitHub repository. Because this path contains both the name of the file and the
category ID, as defined in the naming convention we discussed at the start of this example,
we'll want to parse out each of those values separately from the added array item within
each iteration. Finally, we'll invoke our GitHub REST API call function and assign it to
a variable that will now contain the raw content of the file we've retrieved. After that,
it's as simple as calling our createAsset function, noting that we are passing in a value
of 220 for our asset type ID as this corresponds to code snippet content blocks within
the Content Builder API asset model. For a complete list of asset type IDs, please refer
to the Content Builder Asset Type documentation, located at https://developer.
salesforce.com/docs/marketing/marketing-cloud/guide/base-asset-types.html.
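Pulling the string handling from this example together, here is a quick standalone check of the parsing logic (the repository URL and file path below are made-up values that follow the conventions described previously):

```javascript
// Made-up push-event fragments: contents_url ends in a {+path} template token,
// and each added file follows the "<categoryId>/<name>.html" naming convention.
var contentsUrl = "https://api.github.com/repos/octo-org/demo-repo/contents/{+path}";
var assetPath = "4815/order-confirmation-snippet.html";

// Slicing up to (and including) the last '/' removes the {+path} token.
var baseContentsURL = contentsUrl.slice(0, contentsUrl.lastIndexOf('/') + 1);

// Everything before the first '/' is the Content Builder folder (category) ID.
var categoryId = assetPath.substring(0, assetPath.indexOf("/"));

// The last path segment, minus its .html extension, becomes the asset name.
var contentName = assetPath.split("/").pop().replace(".html", "");

console.log(baseContentsURL); // → https://api.github.com/repos/octo-org/demo-repo/contents/
console.log(categoryId);      // → 4815
console.log(contentName);     // → order-confirmation-snippet
```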
That's it! With the preceding code saved and published to the endpoint that we defined
within the GitHub webhook configurations, we are now all set in order to start syncing
our push event file data with Content Builder. Whenever we've added, committed, and
pushed changes to our repository, this endpoint and logic will automatically be processed
and the new content will be created within Marketing Cloud from the data we've pushed
to the repository.
This was a somewhat simplistic example, but I hope that it helps highlight the different
ways that we can utilize webhooks to create an event-driven set of functionality within
Marketing Cloud that automates our processes or provides some new degree of efficiency.
Utilizing the aforementioned solution, we could easily scale it to more comprehensively
handle our assets or even define our own schema for mass creating any set of Marketing
Cloud objects that are accessible either through platform functions or API routes defined
within the documentation.
In addition to using external services in order to generate event-driven functionality,
there are webhook services within the Marketing Cloud ecosystem that allow the user
to subscribe to certain events and then receive automated requests posted to a defined
endpoint whenever an activity has occurred. One such method is Event Notification
Service, which provides a webhook functionality that allows developers to receive relevant
deliverability and engagement metrics on an email or text deployment automatically.
This allows us to further automate processes in order to provide an immediate response
or insight following some user engagement with our content. So, say we have an order
confirmation email that contains a link for support or further information. We could set
up a webhook that receives a request when the link is clicked and then takes some further
immediate action (such as emailing a more detailed order receipt to the user).
The core concepts for utilizing the GitHub integration and Event Notification Service
remain largely the same. Though the steps to accomplish both will differ in their
configuration or endpoint, the basic premise is that we utilize the following steps to create
our integration:

 1. Configure an endpoint to receive requests.
 2. Register and verify that our endpoint is able to securely authenticate and process
calls from the webhook.
 3. Create the webhook subscription event such that an event will be fired to
our provided endpoint whenever a defined action has occurred.

With Event Notification Service, these steps are largely generated through API requests
to defined routes within the Marketing Cloud REST API. In our GitHub example, these
are done through simple User Interface (UI) configurations made within the repository
settings, but the overall flow necessary for constructing these solutions is essentially
the same.
Understanding the importance of event-driven requests, and how they can be utilized
both within Marketing Cloud and externally in order to generate real-time functionality,
is key. Familiarity with the distinction between webhooks and APIs allows developers
to select the appropriate tool for a given task and ensures that we're keeping
efficiency and maintainability at the
forefront of our application development. Now that we have introduced the concept of
webhooks, let's move on to another concept that can aid us in building solutions that are
both efficient and scalable.

Microservices, assemble!
It's no secret to any of you that business requirements or existing flows can change,
sometimes on a daily or weekly basis. As such, development teams are compelled to
adapt to changing circumstances by extending new functionality into a given service
or by altering its capabilities to meet both existing and new challenges. Unfortunately,
it's not always so simple to extend functionality or revise existing solutions within our
applications. We may have portions of our code base that are generic but intertwined and
dependent on the execution of some other component.
When starting a project, when the focus is narrow, the code base can be very manageable
and somewhat self-contained since it should encapsulate all of the base functionality
outlined in the discovery process. Over time, additional functionality and components
are added such that the code base and build, integration, and test processes can become
cumbersome to manage or decouple. With more and more code being utilized in a central
location, best practices were developed for ways to modularize the functionality of the
application to make the code more maintainable and generalized (that is, they can be used
by other parts of your application). Unfortunately, all of these individual modules must
still be compiled together into a single code base in order to deploy the application. So,
regardless of how the improved modularity of the application has impacted the developers
working on it, at the end of the day, it still needs to come together in a single deployment
for the entire code base to go to production. Enter microservices.
Microservices are an architectural pattern that differs from a monolithic approach
in both the structure of the development process, as well as that of deployments. In
a microservice architecture, we break down individual pieces of functionality into
discrete, loosely coupled entities that have their own code base and can be deployed
and managed independently from the rest of the application. So, when we have a simple
update or new service addition, we can both develop and deploy the individual piece
of functionality as a separate code base rather than worry about app-wide testing and
deployments or integrations. Before we clarify this topic any further, let's take a look at the
monolithic approach for building applications and then compare that with a microservices
architecture so that we can see the pros and cons of each:

 Figure 12.4 – Monolithic architecture diagram

A monolithic architecture is used for traditional server-side applications where the
entire functionality or service is based on a single application. The entire functionality
of the site is coded and deployed as a single entity and all dependencies are intertwined
together. As you can see in Figure 12.4, a monolithic architecture comprises a UI,
a server-side application (Business Logic and Data Access Layer in the preceding
figure), and a database that contains relevant information that we can read and write with
our application. Now that we have a base definition of what a monolithic architecture
comprises, let's take a look at some advantages and disadvantages of this architecture.

Advantages of monolithic architecture
This is the architecture that most developers in Marketing Cloud will be familiar with
concerning application development.
A single suite of tools or technologies is selected
to solve a given range of use cases, and the development process will more or less flow
through a common build and deployment process that will be global in its management of
the code base. Not only is this process intuitive, particularly when coming from a hobbyist
or more isolated developer experience, but it can also allow developers to get started on
a project quickly.

First, let's look at some of its advantages:

 • Simple to develop: Because this is the traditional method of developing
applications, it's quite likely that your development team feels comfortable utilizing
this architectural pattern for your application. In addition, when fleshing out your
workflow and desired functionality for the application in the planning stages,
it is much simpler to structure and build your application from the ground up in
a monolithic architecture that allows for code reuse and shared datasets. Separating
your code logically into components that are still related within a single application
can introduce the concept of modularity without having to build individually
separate services.
 • Simple to deploy: This might be the most obvious benefit of utilizing a monolithic
architecture. Simply put, it's much easier to stage and deploy a single code base than
to manage multiple directories or files. Rather than worrying about varying build
processes or service quirks, it's all rolled into one package that you can put into
production at once.
 • End-to-end testing: It should come as no surprise that end-to-end testing of
your application or service is much easier when the entire suite of functionality is
hosted within a single code base.
There are many tools out there that can automate
our testing procedures much more easily when our application is unified within
a single code base.

Disadvantages of monolithic architecture
As you can see, some of the key advantages of utilizing this architecture are related
to its simplicity to develop, test, and deploy. Most developers will be familiar with
this workflow and easily understand the efficiencies that it can provide, particularly
during the initial build and deployment process. That being said, this is not without
a few disadvantages as well. Let's take a look at a few key costs when implementing
this approach:

 • Changes can be complex: Utilizing a single code base to provide all of the features
of your application or service can become very difficult to manage when the overall
size and complexity of your code are significant. If we have an individual feature
or component to develop or extend, we are unable to isolate that code individually
from the other components in our code base and we must test and deploy the entire
application as a single entity just to accommodate this change.
 • Scalability: Let's say we have developed a suite of services that will automate
business tasks in our organizations in addition to providing some functionality
related to customer experience (an API gateway for email data, for example). Some
of the functionality is used quite rarely, while others are much more in demand and
receive lots of traffic each day. We could implement elastic scaling of our services
so that the servers can process spikes in traffic and allocate more resources when
many requests are being made simultaneously. Unfortunately, with a monolithic
architecture, we can't selectively scale the individual portions that may receive the
most traffic since the entire code base is effectively a single, coupled entity. This
means we have to scale the entire application, even though only a small handful
of components might require it. This can lead to poor user experiences or costly
resource use that could be avoided with a more decoupled architecture.
• New technology barrier: When we use a monolithic architecture, decisions about
the technologies to utilize need to be made as part of the overall discovery process.
As requirements change, and new languages, tools, or services are created to more
efficiently handle common development issues, we may want to implement
or utilize these new technologies to more efficiently deliver our features or to
provide some capability that isn't supported in our current implementation.
Utilizing a monolithic approach, we may have to rewrite large portions of our
application to support this new technology, which might not be feasible from
a time management or financial cost perspective.

As you can see, there are some obvious advantages and costs associated with utilizing
a monolithic architecture when building applications or services. While it may be more
intuitive to use this approach, and even desired when the level of complexity in the
application is known to remain small, these advantages come at the cost of maintainability
and scalability, which may be substantial barriers when considering your implementation.
Let's now take a look at an alternative approach that was created to address some of
these concerns:

 Figure 12.5 – Microservice architecture diagram

As you can see from the preceding figure, the structure of this architecture differs
substantially from a monolithic approach. Instead of the more linear flow that we
outlined previously, here we've decoupled our services and routed them
through an API gateway. This allows us to route the appropriate request to individual
microservices, providing a level of service decoupling that is not possible in the other
architecture. For clarity's sake, let's define what each of these pieces does at a high level:

 • Client: The client can be any type, including a mobile application, single-page app,
or integration services. It's essentially how a user or service interacts with
your application.
 • API gateway: An API gateway is a sort of reverse proxy that sits between the
client and your application microservices. It routes requests from the client to the
appropriate microservice needed to perform some action. This is not a required
entity for a microservices architecture as your client could call the necessary
microservices directly, but it can sometimes be more efficient to utilize one
(or multiple) API gateways that can route requests more efficiently or offer
additional features, such as authentication and caching, that might not be easily
implemented in a direct communication pattern.
• Microservice: As the name implies, microservices are small, independent pieces
of functionality that are maintained in their own separate code base and are
deployed individually from other microservices and features of an application.
They will generally be grouped by the domain that they fall within (such as order
management, shipping information, and cart functionality) and are accessed
utilizing simple requests.
 • Database: This is the actual datastore for the microservice, holding the
information processed by the service.

As you can see, microservices differ from a monolithic approach in some distinct
and important ways. First, it decouples functionality by domain or purpose into
entirely separate code bases, languages, deployments, and build processes. From an
end user perspective, the functionality of a monolithic and microservice application
is essentially the same but the method with which that functionality is built is quite
different. In the monolithic approach, we're taking all our disparate features and services
and rolling them up in a single application that a user interacts with. With microservices,
however, we separate our application into a subset of smaller applications that interact
with our application in such a way that the overall suite of services mirrors what our
single application could do but in a much more efficient and manageable structure for
developers. To illustrate this difference a bit more, let's list some characteristics of
a microservice that define its purpose and how it can be managed:

 • Microservices should be small, loosely coupled, and managed independently of
other services in the application. Each service should use a separate code base.
 • Each service can be created, managed, and deployed independently of any other
service within the application.
 • Services should communicate with other services utilizing well-defined APIs,
though the internal implementation details of a given service are hidden from
other services.
 • For the most part, each microservice will have its own private database that it
maintains separately from other services.

The key takeaway from these points is that each service is its own self-contained entity
that can be managed wholly separate from the other services comprising an application.
This lends itself well to extending the functionality of your application as you can have
disparate teams contributing to the same overall functionality while still retaining
the overall architecture of your implementation. Whether you are using completely
different technologies or languages, hosting platforms, or any other differing item in the
development process, as long as you have a common set of instructions for accessing and
processing data from the service, they can be implemented within the same architecture to
drive the functionality of an application.
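As a toy sketch of those characteristics – communicate only through a well-defined API, keep internals and data private – consider two invented "services" expressed as modules:

```javascript
// Toy sketch: two "services" that share only a well-defined contract.
// Internals (the orders map, the rate table) stay hidden behind each API.

// Order service: exposes getOrder(id), hides its storage details.
var orderService = (function () {
  var orders = { "A-100": { id: "A-100", weightKg: 2 } }; // private datastore
  return {
    getOrder: function (id) { return orders[id] || null; }
  };
})();

// Shipping service: exposes quote(order), hides its rate logic.
var shippingService = (function () {
  var ratePerKg = 4.5; // private detail
  return {
    quote: function (order) { return order.weightKg * ratePerKg; }
  };
})();

// A client composes the two services strictly through their public APIs.
var order = orderService.getOrder("A-100");
console.log(shippingService.quote(order)); // → 9
```

Either module's internals could be rewritten, rehosted, or swapped to another technology without touching the other, as long as the contract holds.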
Now that we have examined some of the advantages and disadvantages of a monolithic
architecture, let's take a look at the microservices model in order to determine the pros
and cons of its implementation.

Advantages of microservices architecture

 • Loose coupling: Since each microservice is essentially its own mini-application, the
risk that a change in one portion of the application will cause unanticipated changes
in another is minimized greatly. Not only does this allow us to more easily maintain,
test, and troubleshoot individual pieces of functionality of our application,
but it also prevents a single errant piece of code in one part of our application
from causing a widespread outage. This also allows us to provide more granular
monitoring for individual components of our application rather than a more
global service.
• Technology flexibility: We've highlighted this previously, but it's an important
advantage when using this architecture. Because each of our services is maintained
and deployed individually, no individual component is tied to the overall
technology stack of any other service. This allows us to more easily upgrade
or implement functionality utilizing the most up-to-date tools and technologies,
which may provide a substantial benefit when compared to the initial
implementation decisions during project discovery. Additionally, it widens the
scope of the teams that can work on functionality for an application since it allows
developers to work on their own piece of functionality in isolation from the
application in whatever technology stack they feel most comfortable with.
 • Encourages refactoring: When you've got an application that has become highly
complex, and where functionality for different items can have interdependent
relationships with the same sets of code, it can be discouraging to rewrite
or refactor your code base. The adage if it isn't broke, don't fix it is commonly
used in this context as the benefits that we might derive from improving our code
can sometimes conflict with the costs of testing, maintenance, and downtime if
the code in question is shared across the application. Microservices allow us to
more granularly improve specific, self-contained sets of functionality without
as much worry that our refactoring will have unintended consequences. This
encourages developers to continuously refine their implementation to make it more
understandable, efficient, and maintainable.

Disadvantages of microservices architecture
These are the disadvantages:

 • Lack of governance: We shared that technology flexibility and loose coupling are
key advantages of utilizing this architecture, but they also can become an issue as
well. While having the freedom to implement new technologies or languages for
each individual service in our application allows us to expand the scope of people
who can contribute and ensures that we can apply more efficient technologies more
easily, it can come at a cost if done too frequently. Since there is no centralized
framework with which each service is developed (though there can be a business
requirement), you may end up with so many different languages and frameworks
being used that some services become unmaintainable. Implementing the flavor-of-
the-month framework for a given service might seem great at the time, but could be
a niche item or unmaintained tool before you know it.
• Complexity: While it's true that we have offloaded a lot of the complexity of each
individual service into its self-contained code base, we've also introduced quite a bit
more complexity into the system as a whole. More complex integration and routing
can introduce complexities that are implicitly handled by the likely more simplified
routing present within traditional monolithic applications.
• Data consistency: With each microservice using a private datastore and responsible
for its own data persistence, it can be difficult to derive data consistency across your
application. While there are services that can help manage this, and even different
application patterns specific to this issue, it's a common concern when utilizing
microservices in data-heavy applications.
\f370 Webhooks and Microservices

 • Latency and congestion: Because our services need to communicate directly
with the application or other services, we can introduce congestion or latency in
our network if the methods within our services are highly dependent or poorly
structured. For instance, if we have a service A, which calls service B, which then
calls service C, and so on, we can incur significant latency that will affect the overall
user experience or even the general functionality of our application.

Each implementation comes with its own set of benefits and challenges, and the type
that you choose to implement will be based on a multitude of factors, such as the
complexity of your application, the expected roadmap for functionality, and the scope of
development resources available. Though the benefits of microservices are clear, it is often
recommended that, unless constructing complicated enterprise applications, you utilize
a monolithic approach to begin with. This is so that you can more quickly get an idea
into production and determine whether the future needs of your application merit the
implementation of a microservices architecture. Segmenting your code into more modular
components, simplifying build and deployment processes, and keeping the data model
more self-contained are all ways that you can build within a monolithic structure while
still keeping your application flexible enough for an eventual pivot to microservices if the
move seems warranted. Finally, microservices are not necessarily better. If making simple
changes to your application requires you to update and deploy 5-10 different services,
then it defeats the purpose of using this architecture. It's simply a method for managing
complex functionality within an application when the logic can be easily decoupled and
managed by multiple teams using their preferred technologies.
Outside of understanding these two approaches concerning application development,
there is a benefit in considering these architectures and their cost/benefit even among
constructing simple functionality within Marketing Cloud. For instance, let's consider
we have a simple automation that contains some simple scripts that will create data
extensions, queries, or journeys. We could write a single script that reads some input,
from a data extension perhaps, and then uses that data to determine which function in
the script to execute (create a data extension or create a query, for example), but that
doesn't feel like an efficient method for implementing this solution. For one, we now have
multiple, unrelated pieces of functionality being housed within the same code base, which
makes it more difficult to maintain and could lead to a small error in one piece effectively
derailing the entire script. A more efficient solution would be to have each script separated
to only handle the domain that is relevant for its functionality. In this instance, we might
have a script for creating data extensions, one for creating queries and another for
creating journeys.
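To make this concrete, here is a sketch of what the first of those single-purpose scripts might look like inside a Script activity. The data extension definition and all of its names are hypothetical; WSProxy is the platform API used to create such objects, and the query-creation and journey-creation logic would live in their own, separate scripts:

```javascript
<script runat="server">
Platform.Load("Core", "1");
// Hypothetical single-purpose Script activity: it only creates data extensions.
var prox = new Script.Util.WSProxy();
var result = prox.createItem("DataExtension", {
    Name: "Automation_Log",
    CustomerKey: "Automation_Log",
    Fields: [
        { Name: "JobId", FieldType: "Text", MaxLength: 50, IsPrimaryKey: true, IsRequired: true },
        { Name: "RunDate", FieldType: "Date" }
    ]
});
Write(Stringify(result.Status));
</script>
```

Because this script owns a single domain, a failure here cannot derail the query or journey logic held elsewhere.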
\f Summary 371

By compartmentalizing the individual pieces into their own, distinct Script activities,
we've created a system where single errors in one script have little to no impact on
our other scripts and allow us to make updates more selectively to individual pieces
of functionality rather than constantly tweaking a single Script activity to manage all
pieces. Now, you might be hard-pressed to consider this implementation a true example
of a microservices architecture as it is traditionally understood within web application
development, but a lot of the same benefits can be derived by utilizing this system as
with microservices. Obviously, an understanding of this pattern in the web application
space is hugely beneficial for us as Marketing Cloud developers, especially when we are
building complex applications that interact across services to automate some functionality within
Marketing Cloud. That being said, I hope that the takeaway from this chapter for you has
been that you can utilize these generic concepts, with regard to both microservices and
the other topics we've discussed so far in the book, in order to start thinking differently
about how your work in the platform itself is done. While you may not always find a quick
correlation with the work you're doing on a daily basis, understanding these architecture
patterns will inform how you operate within Marketing Cloud and can allow you to
approach problems from a more knowledgeable perspective that will drive efficiency and
maintainability in your solutions.

Summary
We've covered several different key ideas within this chapter that I hope you found both
informative and enlightening for how you consider the work that you do as a Marketing
Cloud developer. The differences between webhooks and APIs, and how we can utilize
webhooks to create an event-driven, real-time solution, are so important in taking your
integration with Marketing Cloud to the next level. As we have seen the rise of many
platforms and services that implement webhooks, such as GitHub, Discord, and Slack,
there have arisen numerous opportunities for automation across disparate systems to
allow functionality that would otherwise be either impossible or wildly inefficient.
In addition to discussing webhooks, we also went through an example that creates
content whenever a push event has occurred within GitHub. Obviously, our example
was somewhat simplistic with many assumptions made for ease of demonstration, but it
should provide a strong springboard for you to take this functionality to the next level. As
Git has become an indispensable tool for teams across all development spaces, integrating
this technology with Marketing Cloud through automated services can be a powerful
multiplier that will increase the efficiency and happiness of developers or marketers
working within the platform.
\f372 Webhooks and Microservices

Finally, we reviewed what microservices are and how this architectural pattern differs
from the traditional monolithic approach to application development. We highlighted
some of the advantages and disadvantages of each approach and carefully considered
how different factors, such as application complexity, team capabilities, or modularity,
can affect our decision in regard to the optimal solution for our given use case.
We also took a step back to consider how these ideas could be envisioned in the
context of Marketing Cloud automation, and how automation itself can be thought
of as a microservice architecture.
After reading this chapter, you should feel inspired to create event-driven functionality
that can create value for your organization or developer experience. You should also be
able to more clearly see how to apply the concepts in this book to the work that you do in
Marketing Cloud, even outside of the context of custom application development.
In our next chapter, we're going to tackle custom Journey Builder activities, specifically an
activity that can greatly expand both the utility and capabilities of Journey Builder within
Salesforce Marketing Cloud.
\f 13
Exploring Custom Journey Builder Activities
In the last chapter, we highlighted some of the possible use cases and advantages of using
event-based architecture. In addition to covering the concept of webhooks, including how
they differ from a traditional API, we also outlined a sample use case that allows users to
set up a GitHub webhook in order to automatically sync content between a repository and
Content Builder in Marketing Cloud. Utilizing event-based architectures such as what we
just described is important to allow us to create real-time functionality rather than having
to utilize concepts such as continuous polling to ensure that data or the resource state is
synced between two systems.
In this chapter, we're going to examine another concept around event-based functionality
that is specific to Marketing Cloud, known as Journey Builder custom activities. You
are likely familiar with the standard suite of activities available within Journey Builder for
building functionality that can action on individual user data in real time, and indeed,
we've outlined some of that functionality and activity here in an earlier part of the
book. Similar to this, Journey Builder also offers us a framework for building our own
functionality and integrating our services directly within the Journey Builder framework.
\f374 Exploring Custom Journey Builder Activities

In this chapter, we're going to learn about the following topics:

 • An overview of the required files and structure for Journey Builder: In this
section, we'll dive into the requirements and basic needs for our custom activity.
• Setting up your package and environment: We dive deeper to set up and build our
environment for the custom activity.
• Exploring configuration and activity UI: This is the setup of the structured logic
and interactivity aspects of the custom activity.
• Using the postmonger events and payloads: This is where we deal with payloads,
requests, and responses for our custom activity.
• Implementing handlers and authentication: These are the final touches to secure
our custom activity and to utilize the logic and scripting that is needed for our
intended results.

Through these topics, we will be able to build our very own example custom Journey
Builder activity and will add some very powerful tools to your automation
capabilities in relation to Marketing Cloud. However, before we get onto these topics,
first, let's take a look at the technical requirements that you will need to follow along with
this chapter.

Technical requirements
While we will go through important aspects of the code base in this example, it is
expected that you have some rudimentary knowledge of the Node.js environment in order
to understand the content of this chapter. In addition to this, it is recommended that
you have some knowledge of jQuery, which we will also utilize within our activity, along
with a basic familiarity of how to utilize the GitHub Command-Line Interface (CLI) for
management and deployment. In addition to this, we will be utilizing the hosting service,
Heroku, to host and serve our application that integrates with Journey Builder. Heroku is
a Platform-as-a-Service (PaaS) cloud that allows us to build, host, scale, and deploy our
application, which will interact with Journey Builder to deliver end-to-end functionality.
Before getting started, let's take a high-level view of how the overall flow will work in
our example.

 Figure 13.1 – The custom activity flow

\f An overview of the required files and structures for Journey Builder 375

As you can see from the preceding diagram, Journey Builder will broadcast events to our
custom activity, which will then process the data sent in the event data and make a POST
request to a Code Resource page in order to log the relevant data into a data extension.
These events will fire when any kind of activity occurs in Journey Builder, such as editing
the activity configuration or a contact reaching the activity within a running journey. In
those events that involve the initialization or configuration of our activity, the custom
activity will process the logic we have defined for those events and then send a response
back to Journey Builder containing details of the configuration. However, when an
actual record reaches this activity in a running journey, the custom activity will process
encrypted production data from Journey Builder and then request the code resource page
to execute the logic on that page driven by the data posted from our activity. From there,
the code resource will post a response back to the activity (such as 200 OK), and then our
activity will return a status of the execution back to Journey Builder, and the record will
either proceed in the journey or not. With that high-level overview out of the way, let's
dive into the required files and structures of the activity.

An overview of the required files and structures for Journey Builder
When a contact enters this custom activity at a given juncture in the journey, we're able
to access their data programmatically and take some action, as defined by the structure of
our application integrating with Marketing Cloud. Let's imagine that we have a complex
customer journey that contains several touchpoints within our journey that interacts with
customers utilizing email, text, or push communications.

We've got a great direct mail service that we've been using in our ad hoc campaigns, and
we'd like to utilize that service to automate our mailers whenever someone has reached
a specific point in our journey (say, after a recent purchase). While we could build an
automation or scheduled service to do this, these services do not operate in real-time
and might not be adequate for reaching our customers as quickly as possible in order to
generate conversions.

Here, we could configure an application external to Marketing Cloud that is set to receive
data from our journey entry source and utilize this to automatically create our mailer,
utilizing our direct mailing service API. Then, we can integrate that custom service
directly into Journey Builder. This is so that it becomes a configurable activity on the
canvas, and we can easily add it to any part of the journey that we'd like.

\f376 Exploring Custom Journey Builder Activities

This is only a sample scenario, but it's easy to envision the myriad of ways that you could
implement this kind of system to accomplish real-time activities that reach a customer
in an automated fashion exactly when you want them to. Aside from a mail service,
we could create custom decision splits that action on data external to Marketing Cloud,
coupon lookup and claiming services, query internal data stores for further customer
personalization, and many other possible services and integrations that meet your
organization's needs.
So, we've got a basic idea of what a Journey Builder custom activity is, but how can we get
started building our own? In this chapter, we will create a custom activity that will post
a simple payload to a CloudPage as a proof of concept in order to become more familiar
with how we can extend Journey Builder to provide enhanced functionality with
custom activities.
We'll utilize an existing repository as the base for our application in this example,
but we could construct our functionality in many different ways depending on our
experience, capabilities, and familiarity with various languages or technologies. Although
there is some degree of flexibility in how your application can be constructed, it must
contain the following items in order to function within Marketing Cloud:

 • index.html: This will serve as the frontend of your application and will be the
user interface that users see whenever they configure an instance of your application
within Marketing Cloud. This page will be iframed as the configuration menu
and should contain any relevant user inputs or configuration settings that you
want to expose to Journey Builder users. This file must live in the root directory
of your application.
• config.js: The configuration file will provide Journey Builder with metadata
about your application, such as the size of the configuration window, data that
you want to pass from the journey to your application, the general flow of your
configuration menus, and more.
• postmonger.js: This is a JavaScript utility that allows us to enable
cross-domain messages for our application. This allows our application and
Journey Builder to communicate through a set of predefined events that allows
our app to programmatically access and trigger the functionality that is necessary
for our application to function.
• require.js: This is used to manage dependencies between different
JavaScript files within our application and will help to load items such as our
postmonger file.
\f An overview of the required files and structures for Journey Builder 377

 • customActivity.js: This file will house the Postmonger events that we want
to initiate within our application or Journey Builder, as well as the primary store of
functionality for our frontend.

Now we have some idea of the critical files that are necessary for our application, but how
does it function, on a base level, within Journey Builder? Let's examine the overall flow of
how a custom activity will function within our journey. First, when our application has
been integrated with Journey Builder and is available as a configurable activity on the
canvas, our config.js file will be read by Marketing Cloud to determine the name of
our application, the icon to display in the activities menu, and other metadata about our
application to allow the user to interact with it.
When a user drags our activity onto the canvas and opens the configuration menu,
Journey Builder will iframe in our frontend, which will contain the user input necessary
for any custom configurations that are needed to accomplish our functionality (in our
example, this is the CloudPage URL). When this item is being loaded, the Postmonger
events are triggered to execute certain pieces of functionality, such as retrieving the
existing configuration for an instance of an activity, and any relevant functionality that is
related to our load functions is processed.
Conversely, when the user has completed the configuration for an instance of the activity
and has clicked on the Save button to save the configuration, the data relating to that
instance of an activity is stored within the inArguments part of our payload and is
accessible by our application anytime that activity is executed.
So, with our activity initialized and configured, a user enters the journey and reaches
our custom activity. When this occurs, Marketing Cloud will send contextual data about
the configuration of the activity, along with any relevant user data that we have elected
to capture from our customer, as encoded JSON in a POST request to our application.
The application then decodes the JSON by utilizing a JSON Web Token (JWT) signing
secret, which is provided in your Marketing Cloud app configuration, so that it can parse
the data for further processing. Following this, the application sends
a response back to Journey Builder, letting it know the status of its execution, and then
the contact proceeds along the journey to the remaining activities in the flow.
That is a high-level overview of the required files and overall structure being utilized by
Journey Builder and our application in order to communicate and provide enhanced
functionality as defined by our application. So, with that information in hand, let's dive in
and start setting up our application!
\f378 Exploring Custom Journey Builder Activities

Setting up your package and environment
Before we can begin building our sample application, we'll need to configure some items
upfront to ensure that we have a solid base with which we can build. First, we'll need to
select a method for hosting our application, along with the types of technologies that
we will utilize to construct our application. For this example, we're going to utilize Heroku
to host and scale our application. Heroku is a PaaS cloud that can enable us to host,
deploy, and monitor our application in an intuitive way that ensures we can get started
quickly and easily troubleshoot our application.

Also, while we could utilize any number of programming languages or frameworks to
build our activity with, for this example, we will be creating a Node.js application to serve
as our custom Journey Builder activity.

With those caveats out of the way, first, let's take a look at how to create your
application within Heroku and link it to a GitHub repository that we will use to
manage our code base.

Setting up your application in Heroku
We'll execute the following steps to get this setup going:

 1. First, navigate to the Heroku application menu, and select Create new app to create
our application.

 Figure 13.2 – The Heroku app creation menu

 2. Then, we'll simply give our application a unique name and create our application
in Heroku.
 3. Now, we need to configure our application to pull its source from a central Git
repository and utilize this as the deployment pipeline for our application. Note, in
the preceding screenshot, that we can also utilize the Heroku CLI to manage our
deployments and code base, but for the purpose of this example, we will utilize
GitHub. Whenever we commit and push a new change to the master branch of our
GitHub repository, our application should automatically update and publish the
new changes for Journey Builder to reference.

\f Setting up your package and environment 379
necessary code to run our custom Journey Builder activity application. For ease
of demonstration, let's clone the sample GitHub project repository for custom
activities available within the Chapter13 subfolder of the following repository:
https://github.com/PacktPublishing/Automating-Salesforce-
Marketing-Cloud. This will give us a solid base to work from and ensure that
we can demonstrate the functionality of our custom activity much more quickly.
5. With our sample project structure cloned to local storage, we should now create
a new repository for our application within GitHub, and set our origin for this
application to the new remote URL of our repository by utilizing the following:
git remote set-url origin new.git.url/here

 6. Now, all we need to do is push our local repository to GitHub with git push and
we're in business! We've got our repository configured and a boilerplate template
ready. So, let's go back to Heroku to integrate our new Git repository into our
application. Under the deploy section of the Heroku interface, navigate to the
Deployment method section of the menu and select Connect to GitHub to link
our newly created GitHub repository to our Heroku application.

 Figure 13.3 – The GitHub connection menu

 7. We can utilize a simple search feature to find our repository, after authenticating it
into GitHub from Heroku, and then select connect. Now our application source is
linked to our repository!

\f380 Exploring Custom Journey Builder Activities
a new commit has been pushed to an individual branch. As you can see in the
following screenshot, we'll set our configuration to automatically deploy our code
from GitHub to our application whenever a commit has been pushed to the master
branch of our repository. This ensures that our application is always up to date with
the newest source from our repository.

 Figure 13.4 – The automated deployment menu

 9. Now, with the automated deploys enabled, all that we have to do is push any
changes, and our application is updated. To see our application in action, scroll
down a bit further and select Deploy Branch from the manual deploy menu to start
a manual build of our application for Heroku. Once our build and deployment have
been completed successfully, navigate to the top of the page and select Open app to
see our application. Here is the screenshot of the UI of our application:

 Figure 13.5 – The application UI

Now that we have configured and confirmed that our application UI is loading as
intended, let's configure our Marketing Cloud Journey Builder integration so that we can
ensure our activity is integrated and now loading on the Journey Builder canvas.

\f Setting up your package and environment 381

The Marketing Cloud custom activity configuration
With our Heroku application configured and deployed to production, we now need to
configure the integration on the Marketing Cloud side to ensure that our custom activity
is available as a configurable item on the canvas. To do this, follow these steps:

 1. We need to navigate to the Setup menu within Marketing Cloud and then to the
Installed Packages submenu underneath the Apps section on the page.
2. From there, we'll select New from the button in the upper-right corner in the
Installed Packages menu.
3. Now, we'll give our package a name and description so that we can more easily
identify it when accessing the list of installed packages from the menu screen.
4. Then, once we're inside the configuration of our newly created package, we'll select
Add Component to add the custom activity integration to this package. From there,
we'll select the Journey Builder Custom Activity option as the type of component
that we'd like to create. Then we will fill out the relevant details for our custom
activity integration.

 Figure 13.6 – The custom activity configuration menu

\f382 Exploring Custom Journey Builder Activities

 5. In addition to Name and Description, there are two other inputs for both Category
and Endpoint URL. Category will define what section of the activities menu our
application will reside within. For this example, we'll just set that value to Custom
since it's simply a demonstration of functionality. If we were developing a messaging
service or decision split, then we would utilize the messages and flow control
categories, respectively.
6. Finally, we need to provide the URL of our application where the index.html file
will be served. Simply copy the URL of the application that we tested in our Heroku
configuration step and click on Save.
7. Now, let's navigate to Journey Builder and create a simple flow to see whether our
application is available and ready to be utilized within Journey Builder.
8. Next, we'll need to create the CloudPage that our activity will post to. Additionally,
let's create a data extension that our CloudPage will upsert a payload to, just so
we can more easily confirm that our application is working end to end. First,
navigate to CloudPages and create a JSON Code Resource page that will house
the following script:
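The book's exact listing may differ slightly, but a minimal SSJS sketch of such a code resource, assuming the CustomActivityTest data extension and activityData field described in the next step, looks like this:

```javascript
<script runat="server">
Platform.Load("Core", "1");
// Read the raw JSON body POSTed by the custom activity and log it into the
// data extension (Init takes the data extension's external key).
var postData = Platform.Request.GetPostData();
var logDE = DataExtension.Init("CustomActivityTest");
logDE.Rows.Add({ activityData: postData });
</script>
```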
"," 9. As you can see, the only functionality that our CloudPage is responsible for is
inserting the payload being sent from our custom activity into a data extension,
called CustomActivityTest. To that end, we'll also need to create a data
extension with this name, along with a field called activityData with the
maximum length of the text field left empty. Now, when our custom activity posts a request
to our CloudPage, we're able to log the contextual data from Journey Builder and
our application into this data extension. The related visualization is shown in the
following screenshot:
\f Exploring configuration and activity UI 383

 Figure 13.7 – A custom activity in Journey Builder

 10. That's it! After navigating to Journey Builder, we can now see that our custom
activity is both available as a draggable activity within the Custom activities section
as well as configurable on the canvas. Now, users can configure and save the activity
within their journey flow. The only necessary input will be for our users to enter the
CloudPage URL, as created in the preceding step, into the input provided within
our custom activity.

So, we've identified how we can host and deploy our custom application and how we can
integrate it with Journey Builder to appear as a configurable activity on the canvas. But
how is it working? Further still, how can we configure this to actually do something when
a user initiates an activity for execution within the journey? Well, we've covered the basic
implementation of getting our custom activity onto the canvas, so now it's time to really
dive in to see what's going on and how our application will function in production.

Exploring configuration and activity UI
Before we do a deep dive into our application and see how we will configure each item to
create our custom activity, first, let's take an overview of the overall code structure and
call out the key files that we will be reviewing as the key components for creating our
custom activity.

\f384 Exploring Custom Journey Builder Activities

Note that this doesn't mean there is no other utility to the other files in our code base,
but rather the ones we will highlight will affect the overall configuration and unique
implementation of our application. So, let's take a look at the overall project structure.

 Figure 13.8 – The custom activity project structure

As you can see, we have quite a lot of files in this directory that all serve a specific
function. Let's highlight the important ones that we'll be reviewing within our
demonstration and describe what each does at a high level:

 • config.json: This file houses the metadata of our application, and it is utilized
by Journey Builder to determine what data we want to accept as inArguments for
our application among other items in our activity configuration.
\f Exploring configuration and activity UI 385

 • index.html: As discussed previously, this will serve as the user interface that our
users see when they configure an instance of our activity.
• customActivity.js: We've already covered this at a high level, but this file
will serve to process the logic from our frontend and fire our Postmonger events,
both to and from Journey Builder, to generate the necessary configuration for
our activity. Additionally, we'll also use this to capture the user input within the
activity and create the payload that we'll use to process on the server-side part
of our application.
• activity.js: This is the file that will contain our logic to execute when a given
route has been called from Journey Builder. Here, we can decode our JWT and
execute logic when our Edit, Save, Validate, Publish, and Execute routes are called
for our application.

Now that we have highlighted the important files that we will be configuring for our
demonstration, with the understanding that these files do not encompass all the important
functionalities and key components of our application, we can move forward and examine
each in more detail. To begin, we'll start with one of the most important pieces of the
custom activity, our config.json file.

The config.json file
For this file, we'll break down our .json file into a few separate sections and analyze each
one. This ensures that we can keep things logically consistent and discuss the purpose for
individual properties without presenting them in an overwhelming context. We'll examine
this file in more depth than other parts of this application because it is so central to how
your application is configured and processed by Marketing Cloud. Let's take a look at the
first part of our file and provide some additional context for our configuration:

    "workflowApiVersion": "1.1",
\"metaData\": {
\"icon\": \"images/iconMedium.png\",
\"iconSmall\": \"images/iconSmall.png\",
\"category\": \"custom\"
},
\"type\": \"REST\",
\"lang\": {
\"en-US\": {
\"name\": \"CloudPage POST\",
\"description\":
"Sample Journey Builder Custom Activity",
"step1Label": "Configure Activity"
}
},

As you can see, in this section, we're going to define some metadata about our application
that Journey Builder will use to place it on the canvas, add a label to our iframed
configuration window, and provide the name of our custom activity as it appears on the
canvas. Additionally, we're configuring our icon files, located in the images folder within
our project directory, to define what icon our activity should utilize when it is placed on
the journey canvas. The properties that deserve further explanation, largely due to the
ambiguity of their values, are type and category. Let's define both and show the possible
values for each here, for further reference:

 • category: This value determines which part of the Journey Builder activity menu
your application will reside within. Although we also can set this value within our
installed package, as you saw earlier in our configuration steps, our setting in this
file will override that value if we choose to set it here. The list of accepted
values here is as follows:

   Message (messaging activities)
   Customer (customer update)
   Flow (decision splits)

 If the value that you provide for this property does not match any of these options,
 then Journey Builder will default the category selection to custom. In this
 scenario, your activity will appear within the custom activities menu of
 Journey Builder.

 • type: This is a property that maps to the type of activity you are creating.
 It is similar to category but more granular, and its value must be representative of
 an approved activity type that is represented within the Journey Builder JSON data
 structure. Valid options for this configuration are as follows:

   MultiCriteriaDecision
DataExtensionUpdate
EMAILV2
EngagementDecision
randomSplit
   Wait
Rest
These values should be somewhat self-explanatory and map to already existing activities
that you will find on the Journey Builder activity menu, such as EMAILV2, which
corresponds to an Email activity, and Wait, which corresponds to a Wait activity within
Journey Builder. For generic custom activities that do not easily conform to an existing
type, utilize the Rest value to accommodate these use cases.
Now, let's take a look at the next part of our config file to see what additional items
we are configuring for our activity:

"arguments": {
\"execute\": {
\"inArguments\":[
{\"subscriberKey\":\"{{Contact.Key}}\"}
],
\"outArguments\": [],
\"url\": \"https://automating-mc-jbca.herokuapp.com/
journeybuilder/execute\",
\"verb\": \"POST\",
\"body\": \"\",
\"header\": \"\",
\"format\": \"json\",
\"useJwt\": true,
\"timeout\": 10000,
\"retryCount\": 5,
\"retryDelay\": 100
}
},

In this part of our config file, we define both the route and information that will be sent
every time an instance of our custom activity is executed within Journey Builder.
First, let's examine the inArguments object. This specifies the data that will be sent
to our application whenever a subscriber enters our activity. While we can set this data
during the configuration event of our activity instance, we can also define values here that
will be passed.
In our preceding example, we want to pass an attribute called subscriberKey that
will equal the value of ContactKey for the subscriber that has reached our activity in
the journey. Notice that we are using a data binding syntax to programmatically populate
that value for our payload depending on the subscriber's data rather than some
global configuration.
There are numerous options for pulling data here, such as from our Journey Builder
default email/phone values or an attribute from our entry source event. But for the
purpose of our example, we'll keep it simple and only pass ContactKey to
our application.

 Tip
For further information on the capabilities of data binding, please refer
to the Salesforce documentation located at https://developer.
salesforce.com/docs/marketing/marketing-cloud/
guide/how-data-binding-works.html.

In addition to this, we could also capture some responses from our application and return
these values for further processing by other activities within Journey Builder by using
the outArguments property in our config file. This is beyond the scope of our book,
but it's an important concept to note for scenarios where the returned data is needed for
further processing by an activity. For the remainder of this section, we're defining further
parameters for Journey Builder to process when executing the activity. Let's briefly look at
a few, as follows:

 • url: This is the route that Journey Builder will call when running the execute
method inside our activity instance. With our project set up, this will equate to the
URL of our application with the journeybuilder/execute route appended to
the primary URL.
• useJwt: We can use this setting in any of our routes to let Marketing Cloud know
that we wish to receive a JWT to validate that the call is coming from Journey
Builder. For our execute method here, we'll set this value to true.
• timeout: How long (in milliseconds) do we want to allow our application to continue
processing until it times out? The default value for this is 60,000 ms.
• retryCount: How many times would we like to retry the execution of our activity
after it has timed out, in accordance with the window we have set in the previously
defined property? The default value for this property is 0.
• retryDelay: How long would we like to wait (in milliseconds) before we retry our
execute method?
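As a side note, the data binding syntax shown in our inArguments is not limited to Contact.Key. A hedged illustration follows; the InteractionDefaults binding is drawn from the data binding documentation referenced above and is not part of our sample activity:

```json
"inArguments": [
    { "subscriberKey": "{{Contact.Key}}" },
    { "defaultEmail": "{{InteractionDefaults.Email}}" }
]
```

Each binding is resolved per contact at execution time, so every POST to our execute route carries that subscriber's own values.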
The remainder of our config file primarily consists of setting the configuration
arguments for the remaining routes, apart from execute, for our application: Edit,
Publish, Stop, and Validate. For our example, we will not further define these
properties as extensively as our execute method. The only route that must be defined
here is Publish. Finally, let's take a look at the final part of our config file that we wish
to highlight:

"wizardSteps": [
{ \"label\": \"Configure Activity\", \"key\": \"step1\" }
],
\"userInterfaces\": {
\"configModal\": {
\"height\": 640,
\"width\": 900,
\"fullscreen\": false
}
},

Here, we are defining some basic properties regarding the configuration frame of our
activity as it appears on the Journey Builder canvas. The wizardSteps property is an
object array that defines the possible steps to navigate through our configuration of
the activity.
Here, we might define the different labels and steps of a multi-step activity on the canvas.
However, for our example, we only need the primary configuration. So, we'll keep our
configuration short and simple. Finally, there is the userInterfaces property. This is
a required element for the UI and defines the overall height and width values of our
configuration menu. As you can see in our preceding example, we've elected to show our
custom activity configuration modal at 640 pixels high by 900 pixels wide, and we don't want our modal to appear
as a fullscreen configuration window in Journey Builder. Now, let's take a very quick
glance at the application UI just to make some quick notes.

The index.html file
As we defined earlier, this file is the UI of our custom activity and must live in the root
of the project directory. For this file, we could load in any number of CSS or .js files in
order to dynamically render our frontend or even use templating engines to modularize
our UI and deliver a more templated content delivery system.
Our use case is simple; we only need to provide input for our users to enter in
a CloudPage URL that our activity can make a POST request to. The only important thing
to note here is that we want to use require.js to load our customActivity.js
file, which is located in the js folder, as a dependency so that we can utilize the
functionality within that file to make our UI interact with Journey Builder and perform
its intended purpose. To do this, we'll simply include the following script within our
index.html file:
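A sketch of what such an include might look like, assuming require.js is served from the same js folder that holds customActivity.js:

```html
<!-- Sketch: load require.js and point it at our customActivity.js module.
     The data-main attribute tells require.js which script to load first. -->
<script type="text/javascript" src="js/require.js" data-main="js/customActivity"></script>
```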
Then, we'll create a simple input that will capture our CloudPage URL, which
our customActivity.js file will pull to pass to our server-side scripts for
further processing:

That's all we need to configure in our config file and frontend in order to make our
application recognizable to Journey Builder and to provide our users with an opportunity
to configure an instance of an activity. Now, let's take a look at the Postmonger events
and payloads that will interact with Marketing Cloud in order to process data to and from
our application.
Using the Postmonger events and payloads
We've highlighted the important items in our config.json and index.html files
for our frontend. Now, we'll turn our attention to the customActivity.js file
and go through it in parts, in the same way as the previous section, to identify
what is happening within our application.
First, we'll initialize a new Postmonger session in order to utilize events defined
by Marketing Cloud that allow us to interact with Journey Builder:

var connection = new Postmonger.Session();
var payload = {};

$(window).ready(onRender);
connection.on('initActivity', initialize);
connection.on('clickedNext', save);

Here, we'll call a function, named onRender, that will execute once the Document
Object Model (DOM) is ready and we can begin processing our JavaScript functions.
By utilizing our onRender function, in the initial configuration of our activity, this event
will call the initActivity Postmonger event, which will initialize our activity and
pass any of our inArguments objects defined within our config file to our application.
Once we have configured our activity, it will broadcast the configured payload that
we will further define in another function implemented in this file. This is so that
we can retain our saved configuration input even after we've closed and reopened
our activity configuration.
Next, we will tell Postmonger that, upon execution of the initActivity event,
we would like to execute a custom function, called initialize. Let's take a look at how
our initialize function is constructed:"," function initialize(data) {
if (data) {
payload = data;
var setcpURL = payload['arguments']
.execute.inArguments[0].cloudpageURL;
$('#cpURL').val(setcpURL);
}
}
In this function, we are passing in a parameter called data that is equal to the
inArguments object array, which we will define within our customActivity.js file.
If our activity has been configured previously, meaning a user has added an input for the
CloudPage URL in our UI, we will pull that value from the payload and pre-populate that
data as the value for our input of the UI for this instance of our activity. This ensures that,
when users return to the configuration of our activity before activating the journey, then
their previous configuration will be retained and they will not have to reinput a value to
process the activity successfully.
Finally, we'll execute a custom function, called save, whenever the clickedNext
Postmonger event has been fired within Journey Builder. This event is triggered
when a user clicks on the Next button within the configuration modal of the activity
configuration modal in Journey Builder. Normally, we could use this event to display
multi-event configuration menus by serving different menus to our users whenever this
event has been triggered.
For the purposes of our example, and since our configuration consists solely of a single
menu, this event is triggered whenever the user clicks on Done. This is because there are
no other configuration steps to complete. Now, let's take a look at our save function:

function save() {
var cpURL = $('#cpURL').val();
payload['arguments'].execute.inArguments = [{
\"subscriberKey\": \"{{Contact.Key}}\",
\"cloudpageURL\": cpURL
}];
payload['metaData'].isConfigured = true;
connection.trigger('updateActivity', payload);
}

As you can see, in our save function, the first thing we want to do is capture the
CloudPage URL input for our activity since this is the primary driver of functionality
and is the only configurable property within our activity. Then, we'll redefine our
inArguments object array to contain both the ContactKey of the subscriber that
is entering our activity in the journey and the input that our Journey Builder user has
specified for the CloudPage URL.
Finally, we'll set the isConfigured property for the metaData object in our payload
to be equal to true. This is a key step because it is necessary for Journey Builder to fully
recognize our activity as having been configured and ready for activation. Without setting
this value to true before attempting to publish a journey containing your activity, the
journey will fail to publish, and you will be required to reconfigure the activity until this
value has been set.
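Put another way, before we trigger updateActivity, the payload we broadcast must carry a fragment like this:

```json
"metaData": {
    "isConfigured": true
}
```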
Finally, we'll execute the updateActivity Postmonger event. This event takes the
configured payload as a parameter, and executing this event will close the configuration
window within Journey Builder for your custom activity and save the payload information
that you have passed to it in the canvas. From there, we can reopen the configuration of
our activity and retain the payload data that we have defined within our save function
in order to pre-populate the CloudPage URL data in the same way that we did for the
initialize function earlier.
That's all the configuration we need within our customActivity.js file
for our application to process events within Journey Builder and create a payload that we
can then utilize on the server side of our application to perform the actual POST request
to our CloudPage that will save our user data to a data extension.
It should be noted that this is a very simple representation of all the possible events and
functionalities that can be utilized within this file in order to accomplish your use cases
within Journey Builder. We could programmatically access the eventDefinitionKey
of the journey to dynamically retrieve data extension attributes at configuration time for
our payload, request access tokens for further API processing, and use many other types of
functionalities that can provide enhanced integrations and event services between our
application and Journey Builder. With our client-side scripting, configuration, and UI
ready to go, let's take a look at how we can authenticate our token from Marketing Cloud
and perform the final set of functionalities required for our application.

Implementing handlers and authentication
So, we're now able to add our application to Journey Builder, load our application UI,
and both save and initialize our activity data within Journey Builder for further processing
and configuration. Now, we need to actually do something on the application side with the
payloads configured from Journey Builder. To do this, we'll examine some of the contents
of our activity.js file in order to determine how to create the final pieces
of functionality.
Before we can begin, we need to complete one final piece of configuration for our
application. Since we only want our execute method to process if we've identified that
the request is coming from Journey Builder, we'll need to decode and authenticate the
JSON Web Token posted to our execute route. To accomplish this, first, we'll navigate back
to our installed package within Marketing Cloud.
In the primary package details, we'll want to copy the value of the JWT Signing Secret.
This will allow us to decode the JWT and confirm that the request origin is Journey
Builder. To do this, we'll use the following steps:"," 1. With the signing secret in hand, navigate back to the Heroku application menu and
select the Settings tab from our primary application configuration page.
2. From there, scroll down to find the Config Vars section. Config Vars allow us to
securely set and store environment variables that can then be called throughout
our application programmatically and will allow us to both safely store our signing
secret and utilize it in our code base in order to decode the JWT.
So, let's create a config var, called jwtSecret, and assign it a value that is equal to
the signing secret that we copied from our Marketing Cloud package.

 Figure 13.9 – Heroku Config Vars

 3. While we won't examine it in depth, in our code base, we have defined a JavaScript
 file in our lib directory that contains a simple function for decoding the JWT by
 taking both the JWT passed from Marketing Cloud and the signing secret in order
 to validate whether the request has been successfully authenticated. We'll just need
 to ensure that our activity.js file has access to this function by requiring it
 within our file. We can do this by utilizing the following:

 const JWT = require(Path.join(__dirname, '..', 'lib',
   'jwtDecoder.js'));

Now, we're all set to code the functionality that will execute whenever Journey Builder
calls a given route within our application. For all routes, other than our execute route,
which is triggered when a contact enters our activity in a running journey, we'll simply log
some request data to the console and then send a 200 response back to Marketing Cloud
to let it know that the POST request to our route was handled successfully. No actual
functionality will be processed when these routes are called other than logging, and the
general structure for each of these handlers will look like the following:

exports.route = function(req, res) {
    logData(req);
    res.send(200, 'someRoute');
};

However, for our execute route, we'll want to authenticate the request and then process
the payload data before setting up our POST request to whatever CloudPage the user has
specified. Let's dive into this route and see how we can implement this functionality:"," 1. First, we'll define our execute handler and then run our JWT decode function
against our signing secret:
exports.execute = function(req, res) {
JWT(req.body, process.env.jwtSecret, (err, decoded)
=> {
if (err) {
return res.status(401).end();
} else {

 Here, notice that we utilize the process.env.jwtSecret syntax as
 the parameter for our signing secret. We can utilize the process.env.
 configVarName syntax to access any environment variable that we have
 configured within our settings menu in the application configuration screen.
 Then, we'll run a check to see whether an error was returned during the decoding
 of our JWT. If one was returned, we'll return a 401 Unauthorized status and end
 the processing of our execute method. If no errors were returned, our logic will
 continue processing and our decoded payload values will now be accessible within
 the decoded JSON object output as a result of our decode function.

 2. Now that we have authenticated our request and have a decoded payload to
work with, we'll further check to ensure that inArguments are available for
us to process:
if (decoded && decoded.inArguments &&
decoded.inArguments.length > 0) {
var decodedArgs = decoded.inArguments[0];
var cpURL = decodedArgs.cloudpageURL;
var subKey = decodedArgs.subscriberKey;
} else {
console.error('inArguments invalid.');
return res.status(400).end();
}

 Since our inArguments data contains all the relevant data that is needed for
our application to function, such as ContactKey and the CloudPage URL,
we'll only process our execute method further if this object array is available
and has data to process. If the data is not available in the decoded payload, we'll
return a 400 Bad Request status from our execute handler and end any further
processing. Assuming that our decoded payload does contain this data, we'll then
grab the root node from our inArguments object array and assign variables
to our CloudPage URL and ContactKey data, which we will then process to
our CloudPage.
3. Finally, we'll construct the last piece of our custom activity that will take our
CloudPage URL and ContactKey data and form a POST request that our
CloudPage will receive and upsert directly into our data extension:
var cpPostBody = JSON.stringify({
\"activityData\": {
\"SubscriberKey\": subKey,
\"CloudPageURL\": cpURL
}
});
request.post({
headers: {
\"Content-Type\": \"application/json\"
},
url: cpURL,
body: cpPostBody
}, function(error, response, body) {
    if (error) {
        console.log(error);
    }
});
logData(req);
res.send(200, 'Execute');

 First, we'll create our request body, which will consist of our ContactKey data
and the CloudPage URL in JSON format for our request. Then, we'll utilize the
request library in order to make a simple POST request to the URL defined in
our activity configuration. If an error occurs during our request, we'll log that data
to the console, but we'll return a 200 status back to Journey Builder regardless of
the processing of our logic at this point of our activity (though we certainly could
provide additional error handling or retries).
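While the CloudPage itself sits outside this application's code base, a minimal SSJS sketch of the receiving page could look like the following; the Activity_Log data extension name and its columns are assumptions for illustration only:

```javascript
<script runat="server">
    Platform.Load("core", "1.1.1");
    // Parse the JSON body that our execute handler POSTs to this page
    var postData = Platform.Request.GetPostData();
    var parsed = Platform.Function.ParseJSON(postData);
    if (parsed && parsed.activityData) {
        // Upsert into a data extension keyed on SubscriberKey
        Platform.Function.UpsertData(
            "Activity_Log",
            ["SubscriberKey"], [parsed.activityData.SubscriberKey],
            ["CloudPageURL"], [parsed.activityData.CloudPageURL]
        );
    }
</script>
```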
That's it! We've now fully configured and deployed our custom activity to Heroku, and it
is ready to implement within Journey Builder across journeys and points within our flow.
Now, whenever a user drags our activity onto the canvas, configures a CloudPage URL,
and activates our journey, any contact who enters our activity will have their contact key
data logged to the data extension that we configured earlier.
Before we wrap up, there's one more topic that we can touch on briefly that you might
find particularly useful during the development, testing, and deployment of your
custom activity.

Utilizing logging
While it's always nice when things go smoothly, you might run into errors or unidentified
issues as you go through the tutorial in this chapter whose underlying causes can
be difficult to determine. To that end, do not hesitate to utilize logging within your
application to determine points of failure or to output sample data from your payload in
order to aid you in your investigations or development.
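As a concrete starting point, the logData helper referenced by our route handlers (whose body is not shown in this chapter) could be as simple as this sketch:

```javascript
// Sketch of a logData helper: dump the route, headers, and body of a request.
function logData(req) {
    console.log('Route hit:', req.originalUrl);
    console.log('Headers:', JSON.stringify(req.headers));
    console.log('Body:', JSON.stringify(req.body));
}
```

Each line written here will surface in the Heroku log stream described next.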
Heroku offers a simple method for accessing the logs of your application. From your
application configuration and settings screen, simply click on the More drop-down menu
on the right-hand side of the page and select View Logs. On this screen, you will see all of
the logged events from your application. This will include deployment statuses, requests
from Journey Builder for applications resources such as the config.json file, whenever
a route is posted from Journey Builder, and any custom logging within your application.
Let's take a look at what a sample log for an application might look like when starting up
following a new deployment with Git:

 Figure 13.10 – The Heroku logs

As you can see here, we are able to retrieve the logs of individual events related to our
deployment, including the status of our application's state. In addition to this information,
which can help find errors related to the application's state or deploy status, we are also
able to view any relevant information logged from our application.
The amount of data returned here is somewhat limited, particularly if you are testing
heavily, but you can also process these directly from the command line by tailing your logs
with the Heroku CLI. Setting this up is outside the scope of this chapter, but you should be
able to find numerous resources on Heroku's website and across the internet for accessing
this data with ease.
If you're finding yourself stuck, or just want an idea of what your configured payload looks
like before writing your execute logic, never hesitate to write to the console in order to
more easily determine where things are going wrong and how you can resolve them.

Summary
We covered a lot in this chapter and, although our sample activity was simplistic and not
entirely indicative of a production use case, I hope that it helped empower you to create
new functionalities within Journey Builder that automate your use cases and extend the
capabilities of your Marketing Cloud account.
After reading this chapter, you should have the ability to build and deploy an application
through Heroku and be able to integrate the application directly within the Journey
Builder UI in order to provide enhanced functionality to both technical and non-technical
resources alike. With even just a few simple tweaks to our CloudPage, we could extend
platform functions as real-time events within Journey Builder.
In the next chapter, we'll take a reflective tour of the journey in terms of what we have
learned within this book and the lessons and tools that we can take away from this
resource in order to become more competent, powerful, and efficient developers within
our organizations and in the Marketing Cloud community as a whole.
 Section 4:
 Conclusion

This section is a wrap-up of the book, giving a few quick recaps and last-minute tips and
tricks. This section contains the following chapter:

 • Chapter 14, Carpe Omnia

 14
 Carpe Omnia

You did it! You read the whole book! We are very proud of you and happy that you
liked our book enough to read it from cover to cover. Well, we hope you did at least.
Automation for Marketing Cloud can be a daunting topic and is certainly one that takes
a bit of effort to learn about, so we want to congratulate you on all your efforts!
You may be asking yourself, What does this fancy chapter title mean? Is it just some attempt
to make the authors feel smart? Well, maybe a little bit, but it also has a great meaning,
albeit a bit greedy. Carpe omnia means seize everything in Latin. This, I feel, is something
marketers, developers, professionals, and everyone in general should take to heart. If you
see an opportunity, whether noon, night, or break of dawn, seize it! I feel like, if there is
one way to end this book, it is with words about moving forward and getting the best out
of life that you can.
Our goal with this last chapter is to go over everything you just learned and make sure
you did not miss anything, or to discover whether you want to go back to something
for a refresher. There were a ton of topics in this book, ranging from theory to practical
examples to development outside of Marketing Cloud. With all these topics, it can
certainly be easy to have missed things or for things to have gotten confused in some way.
Hopefully, with this last chapter, we can help you discover anything you might want to go
back and reread.
As this is goodbye, for now, we are also going to give a few last-minute words of advice
and tips and tricks to help you with automation in Marketing Cloud. Not every bit of
knowledge, best practice, or hack could make it into the book, so there are a ton more for
us to share here.
Then, we will give a final farewell and wish you the best of luck for your adventures in
Marketing Cloud and hope to meet again! For now, let's take a look at what you have
learned over the last 13 chapters.
This chapter is divided into the following sections:

 • Lessons we've learned so far: A brief look back over the last 13 chapters and all that
we have learned to this point
• Final lessons and examples: Our last two quick lessons and examples to share for you
to use in your future endeavors in Marketing Cloud
• Last-minute tips and tricks: A few last-minute words of advice from us to share with
you in the hope it will help guide you to further success

Lessons we've learned so far
The first thing we want to do in our closing chapter is to make sure we covered everything
we wanted to. Throughout this book, we have explored many different things in many
different ways. Some of it was more theoretical than other parts – but all of it was very
useful and very relevant. Our goal in this section is to ensure you absorbed what we were
hoping you did and that when we leave you, we leave you best prepared for the future.
We went from learning the basics of automation and its uses and then went all the way to
building your own custom activities and microservices to interact with Marketing Cloud.
Quite a journey to go on all in a single book!
With that being said, our journey began with a bit of theory upfront where the first
few chapters were completely focused on the setup and practical definitions of key
philosophies, thoughts, and other highly important factors and related items. This setup
was hugely important to help not only to pave the way forward for the rest of the topics
we discussed but also because understanding these aspects allows you greater insight and
innovation in your day-to-day activities, both inside and outside of Marketing Cloud –
helping to lead you toward further growth.
We then concentrated on automation possibilities and capabilities within Salesforce
Marketing Cloud. Through exploring the proprietary languages (such as AMPscript and
SSJS), the native tools (such as Journey Builder and Automation Studio), and other less
obvious possibilities, we were able to get a strong grip on not only what works but to build
a strong base to better figure out what is possible for future automations from within.
After we dove deep inside of Marketing Cloud, we started our adventure into the possibility
of automating Marketing Cloud from outside of Marketing Cloud. By exploring custom
API interactions and integrations, SDKs, webhooks, and microservices, we were able to
find ways to do so many tasks in Marketing Cloud without ever opening the user interface
at all. We also found many ways to use those as well as custom Journey Builder activities
to further supplement and integrate with Marketing Cloud to greatly increase capabilities
and performance. That catches us up on all that we went over. That was a lot, wasn't it?
The main aspect that I hope was gathered from this book is the ability to take what was
written here and grow something of your own, instead of just copying what is shared. The
information here is powerful, but with a growth mindset, someone can take this power and
make it stronger. The possibilities in Marketing Cloud are boundless. This whole book was
just centered around automation inside Marketing Cloud, which is only a single part of the
platform's capabilities. Imagine the series of books that would be needed to cover every
aspect. I think it would be too much for one person to read and fully understand!
We are very glad to have shared this with you, but we aren't finished just yet. Next, we are
going to give a couple of extra insights into automation for Marketing Cloud that we could
not fit into the other chapters.","Final lessons and examples","Now that we have gone over everything you should have picked up from our book,","we wanted to give a few words of advice and some further places to investigate Marketing","Cloud automation. Unfortunately, trying to document every single aspect of automation","for Marketing Cloud is a task that would require an encyclopedic-type book series, which","is a bit much for this book.","For that reason, we wanted to promote two more important things to consider when","working with automation in Marketing Cloud. First, we will investigate another way for an","automation to be triggered – from another automation!","Calling an automation from another automation","The cool thing about this method is that it can string automations together without having","to rely on syncing schedules or dropping files. This will allow you to utilize a Script","activity in Automation Studio to call the next automation you want to run.","\f406 Carpe Omnia","How does this work?","Well, the Script activity would use WSProxy or other functionality to interact with the","SOAP or REST API. From there, you would just use the corresponding object or endpoint","to start your automation. As a note, many people stick with the WSProxy version as it","does not require the application and component clientId and secret in order to","work – reducing security risks.","Here is a quick example script using WSProxy to start another automation:","