Simplifying Deploy and Destroy with the SF CLI

Personally, I never had any problems with the SFDX CLI, which is why I was surprised to see it replaced by the SF CLI. What I now sometimes struggle with is translating all of my cheat-sheet references from one to the other, and sifting through the plethora of search results that still refer to the SFDX way of doing things. OK, I know I can minimize the second part by setting the date range in my Google searches, but when I’m being lazy (or in a hurry, both of which are more common than I prefer) I use Perplexity.ai rather than Google.

Most recently, my conversion conundrum cropped up when removing components that I had deployed through the CLI. I found a great SFDX syntax in Perplexity literally one day before it was slated to be retired. Here is the relevant part of the session that got me to the solution:

Question: cli to delete objects from an org

Answer:
To delete objects from a Salesforce org using the CLI, you can use the destructive changes process. Here’s how to do it:

Step 1: Create Destructive Changes XML

Create a file named destructiveChanges.xml with the following structure:
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>ObjectName1__c</members>
        <members>ObjectName2__c</members>
        <name>CustomObject</name>
    </types>
</Package>
Replace ObjectName1__c and ObjectName2__c with the API names of the objects you want to delete.

Step 2: Create Empty Package XML

Create a file named package.xml with the following content:
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <version>62.0</version>
</Package>
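If you prefer to script the setup, steps 1 and 2 can be scaffolded from the shell. This is just a sketch; the object names are the same placeholders used above:

```shell
#!/bin/sh
# Sketch: scaffold the two files from steps 1 and 2.
# ObjectName1__c and ObjectName2__c are placeholders -- use your own API names.

cat > destructiveChanges.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>ObjectName1__c</members>
        <members>ObjectName2__c</members>
        <name>CustomObject</name>
    </types>
</Package>
EOF

# The deployment also needs the (effectively empty) package.xml alongside it.
cat > package.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <version>62.0</version>
</Package>
EOF
```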

Step 3: Deploy Destructive Changes

Run the following Salesforce CLI command:
sfdx force:mdapi:deploy -d . -w -1 -u YourOrgAlias
This command deploys the destructive changes to your org:
  • -d . specifies the current directory as the deployment source
  • -w -1 waits for the deployment to finish
  • -u YourOrgAlias specifies the target org alias

Additional Considerations

  • Ensure you have the necessary permissions to delete objects in your org.
  • Deleting objects will also delete all associated records and metadata (fields, validation rules, etc.).
  • Some objects may have dependencies that prevent deletion. You might need to remove these dependencies first.
  • Always test this process in a sandbox environment before applying it to production.
  • Consider backing up your metadata and data before performing destructive changes.
If you need to delete other types of metadata (like Apex classes, Visualforce pages, etc.), you can include them in the destructiveChanges.xml file using the appropriate metadata type names.
Remember, this process permanently deletes the specified objects and their data from your org, so use it with caution.
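To illustrate that last point, here is what a destructiveChanges.xml that mixes metadata types might look like (the member names are hypothetical, and ApexPage is the metadata type name for Visualforce pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>MyOldController</members>
        <name>ApexClass</name>
    </types>
    <types>
        <members>MyOldPage</members>
        <name>ApexPage</name>
    </types>
    <types>
        <members>ObjectName1__c</members>
        <name>CustomObject</name>
    </types>
</Package>
```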

When I ran it, it worked perfectly, and it output a warning that it would no longer work the next day (I did not preserve the message content, but I can tell you the date was 11/4/24 because UpNote metadata is useful). As I mentioned, I’m often in a hurry, so I often make mistakes, and what has kept me in technology since the original Wolfenstein was released is always making sure I have a way of undoing whatever I do. So I decided to go find the cool new sf CLI solution, which was not terribly clear to me; I found the documentation a bit confusing. Going back to the old-fashioned Google search (trust me, it pains me more to write “old-fashioned Google search” than it does for you to read it), I found a Trailblazer Community post by Max Goldfarb that greatly clarified how to make the new sf call.

I have since refined the process Max clarified, after reading the documentation at a less-than-usually-rushed pace. First, I keep that step 2 XML that Perplexity gave me stored in my repo as antiPackage.xml, the full path being manifest\destructiveArchive\antiPackage.xml. I can then generate the destructive XML with:

sf project generate manifest --source-dir force-app/main/default/[WHATEVER-I-WANT-TO-DESTROY] --type destroy

and then run my simplified killer command:

sf project deploy start --manifest manifest/destructiveArchive/antiPackage.xml --pre-destructive-changes manifest/destructiveArchive/destructiveChangesALL.xml -o [ALIAS]

and boom! stuff removed. I even keep a set of destructive xmls in a special folder (you may have noticed it in the command above). I continued to use this happily for a few weeks.

Today I was messing with a single component that was giving me problems (or, more accurately, I was giving it problems by building it incorrectly) and started by removing just that component. Then I went about making my fixes and, feeling less hectic than usual, decided to have a package.xml just for it so the deployment would go faster (yes, I know I can do that with a right-click, but once I am comfortable with a CLI I prefer it, and it helps ensure that my eventual release script will have fewer issues). The file was very small, which made the content leap out at me, and so I realized: there is no difference between the XML output with or without the “--type destroy” switch!

So now I just keep one XML for each context in manifest\deployArchive and reference it for both deploy and destroy.
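The whole deploy/destroy pattern can be sketched as a tiny dry-run script. The manifest name, folder, and alias below are placeholders based on my layout, and the script only echoes the commands, so nothing runs against an org until you remove the echoes:

```shell
#!/bin/sh
# Dry-run sketch: one manifest drives both deploy and destroy.
# MANIFEST and ALIAS are placeholders -- adjust to your repo and org.
MANIFEST=manifest/deployArchive/myComponent.xml
ANTI=manifest/deployArchive/antiPackage.xml
ALIAS=MyOrgAlias

# Deploy the components listed in the manifest:
echo "sf project deploy start --manifest $MANIFEST -o $ALIAS"

# Destroy the same components: the empty antiPackage.xml serves as the
# manifest, and the deploy manifest doubles as the destructive changes list:
echo "sf project deploy start --manifest $ANTI --pre-destructive-changes $MANIFEST -o $ALIAS"
```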

© Scott S. Nelson
Unfollow to Finish Quest

(Image: a completed Salesforce quest still showing as incomplete)

Every month I check to see what’s new at https://trailhead.salesforce.com/quests (which no longer seems to update monthly, but rather randomly around marketing events). Once in a while, I will complete all of the steps of a given quest and it still shows less than 100% complete.

In previous cases, it has been because some module that I had completed in the past had added a new step. So, while that module was complete on my profile, the current version of the module was incomplete. My guess is that the individual steps are read from my profile while the aggregated score is based on the current state. Anyway…I have learned that when the completion status doesn’t match, to look at modules that are more than a few months old and find the one with the new task.

Today, my tried and true method failed me. Everything was complete, but I still had only 88% on the summary. Hmmm. I got lucky and found a Community thread with my first query that had the answer (fortunately near the top, as the thread continues with several more comments that are less helpful), which is to click Unfollow, then go through and click all of the “Mark as Complete” buttons again.

Noting this here so I remember for the next time.

HTH

(Image: Scott S. Nelson’s Certified Service Cloud Consultant badge)

Passing the Salesforce Service Cloud Consultant Certification Exam

I’ve written about the process I have gone through for all of my Salesforce certifications.  The Certification Prep section of my blog currently starts with these, and I believe that many of those posts also have some helpful tips for the Service Cloud Consultant Certification. If you haven’t already passed the Administrator certification, I would suggest starting with my Tips to Pass the Salesforce.com Administrator Certification Exam post. Enough self-promotion, on with the sharing!

As mentioned earlier, this isn’t my first post on certification approaches, and if you are preparing for the Service Cloud Consultant certification it isn’t your first exam, so I’m going to minimize the exposition here and just drop my formatted notes by section headings for easier reference.

Start with Trailhead

The prescribed place to prep: completing the Service Cloud Specialist Superbadge will have you well prepared for a passing grade if you work through all of the prerequisites and then the Superbadge itself. I did complete the prerequisites but have not yet done the Superbadge project. This is much more a comment on my other time commitments than on the approach, as I highly recommend completing the Superbadge project, preferably right before taking the exam.

If you likewise can’t fit the Superbadge into your preparation plan, I recommend completing the Get Started with Service Cloud for Lightning Experience trail. Some of the trail modules are part of the Superbadge prerequisites, so it will take less time than you might think.

Quizlet

Quizlet is a great free resource for some exams, and the Service Cloud Consultant is one of them. https://quizlet.com/272794451/salesforce-service-cloud-consultant-flash-cards/

Udemy

I’ve used Udemy to prepare for every Salesforce certification, and have already enrolled in the Salesforce Data Architect Course for my next planned exam because it was on sale. For those who haven’t used Udemy before, they have frequent sales where the prices drop drastically below the regular price. By signing up for their marketing notifications you will eventually get a feel for how low particular courses can go, so if you have some planned, buying on sale is a great strategy.

Returning from that digression (my regular reader is used to this), I enrolled in the Salesforce Service Cloud Consultant Certification Course by Mike Wheeler because I had previously taken his Platform App Builder course (as mentioned in Become a Salesforce Certified Platform App Builder) and found it helpful in preparation. I will admit I was disappointed with the Service Cloud course. It was recorded in 2018, and while the Udemy listing says it was last updated 11/22, I couldn’t see where. He continually points out Lightning issues that have long since been addressed and spends a lot of time in Salesforce Classic, which is no longer referenced in the exam. And, while the functionality of Live Agent changed very little when being re-branded to Salesforce Chat, there are a couple of questions in the exam where the Live Agent option is the wrong answer.

And, for the record, I do not get a commission if you enroll in a Udemy course I recommend…and not for lack of trying. Their affiliate program has too much friction to bother dealing with (and it is reflected in my losses as a shareholder).

Focus on Force

I won two vouchers this year (so far, fingers crossed) with Trailhead Quests. The first voucher was for a $200 exam and the second was for a $400 exam. My certification path is focused on Technical Architect and I had done all of the $200 exams, so I sat on that voucher for a while. When I won the $400 voucher I was a bit surprised to find that it had a shorter expiration period. I immediately scheduled my exam for the expiration date and plowed into My Sharing and Visibility Architect Path.

I rested my brain for a couple of days and decided to use the first $200 voucher on the Service Cloud Consultant certification (sometimes called just the Service Cloud Certification). For the Sharing and Visibility Architect exam I tried a few Udemy practice exams because they have served me well for previous exams. I requested refunds for all but one, and that is because I had been too busy to start on the first one and the guarantee period had expired. They were awful. I then went to the Trailhead Community and asked folks there for a recommendation and discovered Focus on Force. I will keep looking for study courses on Udemy, but for Salesforce practice exams, Focus on Force will be my go-to from now on.

My process was to first go through all of the Topic Exams and then start on the Question Bank. Then I had some issues with the Question Bank on mobile, so I did practice exams on mobile and the Question Bank on PC. After completing the first 20-question Question Bank exam, I found I needed to focus on these exam areas:

  • Contact Center Analytics
  • Interaction Channels
  • Knowledge Management
  • Solution Design

This is one of the reasons I don’t consider certifications a true test of consulting skill. I have delivered well-received proposals and solutions using Knowledge Management, and am regularly approached for my solution design expertise. The exam questions cover some narrow areas of very broad topics, and the questions I missed are about activities that are generally one-and-done… then forgotten and looked up again when next needed. But certification is important in the Salesforce landscape, so I spent time drilling on things that I would still have to look up again in a couple of years.

I went through the Udemy course in parallel, partly because I only had 55 days to prepare and a demanding day job, and partly to see if this approach was better than first doing the course and then using the practice exams.

Where previously I found the feature to check questions individually instead of at the end of the exam useful, this time I found that I did better if I waited until the end. I think this has a lot to do with my not knowing as many answers at the start as I had for the Sharing and Visibility exam (which I found surprising in itself) and with my expectations changing as I saw immediately that I was wrong. Unless you have an eidetic memory, your frame of mind can impact your score more than the knowledge you have accumulated.

The Focus Review feature has the same issue as the Question Bank when used in Chrome on Android mobile devices. The score calculation at the end fails to complete. It then remembers the answer state the next time either is tried. Because both use random questions, some will have the answers from the previous session. I reported this twice for Question Bank and once for Focus Review and no fix has happened yet. If you run into the issue, please report it and then stick to using a PC for those test types. The answers from the failed mobile session will still be there the first time but subsequent attempts will work properly so long as you don’t try mobile again like I did (sometimes I’m optimistic when I shouldn’t be).

Bionic Reading® Notes

I use https://10015.io/tools/bionic-reading-converter to format my notes for Bionic Reading®. Below are the ones I made to review just before the exam. They are specific to reminders I thought would be useful as I created the notes and I recommend you create your own, or supplement these with your own.

For the Industry Knowledge questions, when not sure, always go with the option with the highest cost savings, followed by the one with the most potential income. Again, this is only when unsure. There are some questions where cost is not the key factor; for example, when considering the benefits of an email channel, lower cost may not be the correct answer, as other options cost even less than email.

For processes, Case Stages are driven by the Case Status field

CTI allows telephony services in Salesforce. No desktop software or softphone required.

Customer SLA = Entitlement

List views are automatically created for queues

Customer Service site template for Questions to Case, not Customer Portal

Console History component shows recent primary and sub tabs. Recent items shows records

Email to case has a limit of 2500 per day

Knowledge does not return solutions, only articles that are related to similar cases or questions

Messaging is what was called Live Messaging and not related to Social

Enable Case Comment Notification to Contacts is a support setting

There is no case field alert

Email approvals require Draft emails

Service Console requires Service Cloud User license

Knowledge Publication Teams and Publication States do not exist

In the routing model, you choose whether to push work to agents who are Least Active or Most Available. If you select Least Active, then Omni-Channel routes incoming work items to the agent with the least amount of open work. If you select Most Available, then Omni-Channel routes incoming work items to the agent with the greatest difference between work item capacity and open work items.

Internal metrics focus on what happens inside the contact center, and external metrics focus on what happens outside the contact center.

Case Sharing Rules by Record Owner:
Public Groups
Roles
Roles and Subordinates


Salesforce Native vs App vs Connector


Fair warning: This is more about not having written anything in a while than the value of the topic…and the subject matter is more about drawing your own conclusions than relying on what is easily available, so…

App is one of the most over-used and ill-defined terms in the IT lexicon. This is largely due to its being used by people outside the IT domain. The domain itself has had some whoppers, like the DHTML that was a must-have at the turn of the century even though the only honest definition of the term was that it had no real definition. Microservices runs a close second, simply because there is an invisible grey line between SOA and Microservices that is a mile wide and an inch short. But I digress, as is often the case.

What I’m really thinking about today is apps in the world of Salesforce.com. Specifically, apps that run inside the Salesforce CRM platform. I started thinking about this because I was looking into CPQ vendors over the weekend to refresh myself on the market, in support of a project proposal to select the best option for a particular business. It’s a large space, so it always helps to find someone else’s list to start with, and someone had given me a list from a major analyst group as that starting point.

Other than analysts, no one likes long lists with lots of details, so I first wanted to narrow it to those that integrated with Salesforce. It didn’t take me long to remember that Salesforce is the gold standard for CRM: there were only two that didn’t. I didn’t go through the whole list to get to that count because I’ve done these kinds of evaluations before and figured out after the first half dozen that this was not how I was going to narrow the list. The two were just what I noticed while skinning this cat another way.

The first trimming of the list was by industry focus. The potential client is a tech service, sort of SaaSy, and “High-tech products” was one of the categories, which was much closer to what they do than “Financial services” (though they have customers in that domain) or “Industrial products” (which the analyst seemed to think usually included high-tech, though I’m not sure why).

To spare you the tedium of the several hours of wading through thousands of lines of marketing prose that could have been delivered in a table (ahem, yes, I know, kettle, black, etc.), from just the perspective of Salesforce CRM integration I found it useful to divide them into three basic styles:

Native: An application that is built entirely in Salesforce
App: An app that runs inside Salesforce that depends on data and/or functionality managed outside of Salesforce.
Connector: An application that runs independently of Salesforce and has a way to share data with Salesforce.

The terms for these distinctions change over time and between sources. These definitions are for clarifying the table below and are purposely simplified, as the deeper distinctions have more to do with aspects other than integration.

In this particular exercise, the ask was to provide some pros and cons to these different styles. My style being one of adapting general terms to technical solutions, I responded with a non-exhaustive list of Benefits and Concerns:

Integration Styles

Native

Benefits:
  • Easily accessible in the sales process context.
  • Seamless integration with other native apps.
  • Has gone through Salesforce security review.
  • No data latency.

Concerns:
  • May require additional Salesforce licensing.
  • May have impacts on storage limitations.

App

Benefits:
  • Easily accessible in the sales process context.
  • Access is managed within Salesforce.
  • Has gone through Salesforce security review (only if installed through App Exchange).

Concerns:
  • Frequently limited functionality.
  • Support may require coordinating the vendor and Salesforce.
  • High potential for latency.

Connector

Benefits:
  • Control over storage impacts.
  • Broader range of vendors to choose from.

Concerns:
  • Difficult to trouble-shoot.
  • Users must use multiple applications.
Of course, the next question is usually “which is best?”, and I must respond with the architect/consultant/writer-needing-a-higher-word-count answer: “it depends”. And it depends on lots of things, such as: who will be maintaining the solution; how are capex and opex prioritized and managed; how do different stakeholders actually need to interact with the solution; and is it clearly understood that this is only one aspect of a vendor selection process, and that all known aspects must be documented and weighted before giving a recommendation?

The real reminder for me when I finished this brief analysis was that context is everything when doing any type of evaluation. The list I started with included products that were questionable as to whether they really belonged in the report, and many of the products were listed as serving domains that the vendor’s site never mentioned, with no compelling reason why the unmentioned domain would want to use them. If I had direct access to the author(s) I might learn something by asking, but the important thing is that I used their input only as a starting point and applied my own analysis, because when the recommendations are provided to a client, those authors’ names will not be on the agenda and they will not be there to answer the questions that hadn’t yet been thought of.
