Salesforce Process Builder Debugging Mashup


No time to write a full post on this, but here are the pieces:

Today a colleague reminded me about a new feature for debugging Flows as another user. That reminded me that I had recently stumbled across Process Builder Flows showing up in the Flow UI, but couldn’t immediately remember how I got there. Long story short, someone else had already discovered this debugging Easter egg and blogged about it. So, first read about the hidden Flows created by Process Builder, then read about the new feature for doing this as another user, and go forth and fix all of your production flow issues! Well, more of them, anyway 🙂

If you found this interesting, please share.

© Scott S. Nelson

Simplifying the Off-Facebook Settings

Given that the media outlets reporting the steps to opt out of Facebook tracking also use Facebook tracking, the articles make it really complicated to get rid of all of it. So here is the simplified version:

  1. Go to https://www.facebook.com/off_facebook_activity/activity_list (you may be prompted for your password even if logged in)
  2. Click Manage Your Off-Facebook Activity
  3. Click Manage Future Activity link
  4. Click Manage Future Activity button
  5. Toggle off Future Off-Facebook Activity (if you like some of the tracked ads, you can manage them individually, but you will also have to do that regularly)
  6. Go to https://www.facebook.com/off_facebook_activity/activity_list (yes, again, and you may again be prompted for your password even if logged in)
  7. Click Clear History link
  8. Click Clear History button

Cheers!


How to set up self-registration in a Salesforce community

This is the unedited version of the post on my employer’s site at https://www.logic2020.com/insight/tactical/how-to-set-up-self-registration-in-a-salesforce-community

Salesforce Communities are a really cool way to interact with your customers in the context of your product or service. Like most good portal products, they have a lot more features than most people will ever use. The upside is that if you think of something your portal should have, odds are pretty good it does. The downside is that it can sometimes be onerous to find all of the details you need to make something work. Community Self-Registration is just such a feature. While I can describe the details in under a page (with the two screenshots removed and a direct link to one page of Salesforce documentation), it took me three hours of reading blog posts, user community threads, and Trailhead training entries to get all of these steps down and working. To make sure I had it all correct, I then spent 20 minutes repeating the process from scratch in a different org and writing up this article.

With all that said (which almost doubled the content), here are the steps…

First, create the account that self-registered users will be assigned to. This should be an account dedicated to this purpose, for ease of reporting and account management.

Next, make sure your profile has a role (the default Developer org does not have one assigned).

Clone the Customer Community Login User profile to create the profile you will use for community members who self-register.

From Setup, open the community from All Communities > Workspaces. Go to Administration > Members and add all profiles that should have access to the community, including the one you just created (you may need to select the All option from the dropdown for them to show).

(Screenshot removed: SFDC Community Member Management)

Remember to scroll to the bottom of the page and click Save when done.

Now you can follow the instructions at Use the Configurable Self-Reg Page for Easy Sign-Up, but before you test your login, make sure that you have activated your community from the Workspaces > Administration > Settings page.

One other hint: if you get errors, open the Developer Console and the Log panel, then try again to see what the error is. The next article will cover how to customize the login page, since your users probably don’t have access to the Developer Console to find out that their password must include a symbol.


Get Hands-on with VS Code, Salesforce DX and Packages

(Originally published at Logic 20/20 as “SFDX, VSCode, and deploying from a package,” but the editors stripped out all of the links, rendering it an entirely different post. This is the original version.)

While I do not immediately dislike new tools, I do struggle with adopting them when I find nothing wrong with the old ones, and I delay learning them until I’m forced to. That is the case with Visual Studio Code for Salesforce (Salesforce no longer supports the Eclipse IDE and abandoned the DX extension for Eclipse before DX went GA) and with Git (because that is the way the dev world has gone). I find the best way to learn new tools is to write about how to learn them, so here we go.

(In the spirit of working in a low code platform, we will also see how much of this I can do with just links to existing documentation…)

If you haven’t already, Install Salesforce Extensions for VS Code.

Then Enable Dev Hub in Your Org and Enable Second-Generation Packaging (note that while 2GP is in beta as of this writing, this setting is required to enable first-generation Unlocked Packages, which are GA).

Next…Well, that didn’t take long. I cannot find a stand-alone URL for creating an SFDX project, so I’m going to steal a section from a Trailhead lesson (because it is as much typing to say what not to do as it is to re-create it here):

  1. Open VS Code.
  2. From the menu, select View | Command Palette.
  3. In the command palette search box, enter sfdx to filter the commands.
  4. Select SFDX: Create Project.
  5. Enter the same name as your GitHub repo, then press Enter.
  6. Click Create Project.
  7. Create a .gitignore file to ignore hidden directories:
    1. Hover over the title bar for the DX project, then click the New File icon.
    2. Enter .gitignore (if the file already exists, just edit it).
    3. In the text editor, add these two hidden directories to be ignored:
.sfdx
.vscode

To foster good habits, I will set up a github repo to store this project in (though following a full lifecycle will be another article) by following the excellent documentation at https://help.github.com/en/github/importing-your-projects-to-github/adding-an-existing-project-to-github-using-the-command-line and add the project to the repository.

Now go do some work in Salesforce. For example purposes, let’s do the Build a Simple Flow project.

After you complete the project, follow the instructions to Create and Upload an Unmanaged Package, skipping the Upload part. I named the package TH_Flow_Project; you don’t have to, but I mention it because I will use that name in the example commands.

Salesforce provides a nice reference to Create a Salesforce DX Project from Existing Source. I have some additional thoughts on how to go about this part, so I will end the approach of linking to references and switch to my own. If you followed the last link and stop here, you won’t miss any more of the Salesforce DX capabilities, but you will miss out on some of my shortcuts and wit. With that said…

Authorize the org you created the flow in with the following:

sfdx force:auth:web:login --setalias <MY_SOURCE_ALIAS> --instanceurl <MY_SOURCE_ORG_URL>

Example:

sfdx force:auth:web:login --setalias TH-ORG02 --instanceurl https://infa-ca-wav-dev-ed.my.salesforce.com/

A bit late to mention, but if you are using a Developer org, I highly recommend that you Set Up My Domain (Trailhead orgs already have one). If you haven’t, you can probably leave off the instanceurl parameter and it should be picked up from the default configuration for your project (YMMV). Otherwise, use the URL that you log in to your org with.

Next, download the package using the following:

sfdx force:mdapi:retrieve -r ../ -p <PACKAGE_NAME> -u <USERNAME>, ex:

sfdx force:mdapi:retrieve -r ../ -p TH_Flow_Project -u scott@trailh2.org

Let’s break that down just a bit. The first part is the base command to retrieve (sfdx force:mdapi:retrieve). The -r parameter determines where the downloaded zip file will be located; the example uses a relative path indicating the folder above the DX project. As a best practice, I recommend always staying in the project directory inside the VS Code terminal, with all commands based on paths relative to that location. This way you can maintain a list of commonly used commands that will be reusable across all projects. The downloaded file name is always unpackaged.zip.

The files need to be unzipped before they can be used (someone should make a feature request for the convert command to work on zip files instead of requiring them to be unpacked first). On Linux, the command (again using relative paths) is:

unzip ../unpackaged.zip -d ../
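If you are on Windows or otherwise don’t have unzip handy, the same extraction can be done with a few lines of Python using the standard zipfile module (the extract_package helper name is my own, not part of the SFDX tooling):

```python
import zipfile

def extract_package(zip_path, dest_dir):
    """Extract a retrieved package zip into the given directory."""
    with zipfile.ZipFile(zip_path) as archive:
        archive.extractall(dest_dir)

# Equivalent of: unzip ../unpackaged.zip -d ../
# extract_package("../unpackaged.zip", "../")
```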

Now we add the files from the package to our project, again using relative paths:

sfdx force:mdapi:convert -r <PATH_TO_UNZIPPED_PACKAGE> -d <PATH_TO_force-app>, ex:

sfdx force:mdapi:convert -r ../TH_Flow_Project -d force-app

Now all of the files from your package are part of your project.

To add this to your target org, first authorize that org as done previously for the source org, i.e.:

sfdx force:auth:web:login --setalias <MY_TARGET_ALIAS> --instanceurl <MY_TARGET_ORG_URL>

Example:

sfdx force:auth:web:login --setalias TH-ORG02 --instanceurl https://infa-ca-wav-dev-ed.my.salesforce.com/

And (almost) finally, deploy the updates from your project to the target org with:

sfdx force:source:deploy -u <TARGET_USERNAME> -x <PATH-TO-PACKAGE.XML>

sfdx force:source:deploy -u apex@theitsolutionist.com -x ../TH_Flow_Project/package.xml

(Another feature recommendation is to have an alias option instead of only the username.)

And finally (this time for real!) look in your list of flows to see the flow installed in your target org.

Of course, you are doing this with a throw-away org, right? Because I forgot to mention that deploying will overwrite any existing components with the same name.

One final note: we used the package.xml from the downloaded package for the sake of simplicity. Once the package import is validated, you will want to combine the package.xml from the download with the package.xml located in the manifest folder of your project.

The project created from the writing of this article can be found at https://github.com/ssnsolutionist/trailhead1


Test automation: 3 things you need to know

Test automation (using automation tools to execute test case suites) delivers numerous benefits, including greater time- and cost-efficiency, the ability to run tests unattended or overnight, and a lower risk of integration and production issues. Particularly well suited to automation are test cases that are:

  • For new or modified functionality
  • Business-critical
  • Repetitive
  • Tedious for humans to perform
  • Performance-sensitive
  • Time-consuming

It’s important to recognize that automation can’t eliminate all manual testing because automation is for testing functionality. You still want users doing hands-on testing to ensure the usability of the application.

If your organization is considering automating parts of your testing processes, here are three things to keep in mind.

1. Cost and time requirements will be lower than you think.

Many teams don’t implement test automation because they believe it takes too much time or too many resources. While this may have been true in the earlier days of test automation, it is no longer the case with modern tools and cloud services. For example, most tools can record user interactions and then allow developers and testers to modify the results for more dynamic testing. Also, most automated build tools can now incorporate test engines into the build process, catching issues before they are deployed and can impact live production use.

2. There are no static inputs in the real world.

Teams that do automated testing often use static, hard-coded parameters rather than dynamic ones. This not only fails to mimic real-world use cases; it also will not properly reflect performance and scalability metrics where caching is in use. Dynamic parameters can either be randomly generated using standard scripting languages or driven from prepared input files. The parameters also need to be as realistic as possible. Using nonsensical values to populate text fields, or simple sequential numbers for complex number fields such as phone or amount, can miss validating even the minimal edge cases.
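To make the idea concrete, here is a quick Python sketch of generating realistic dynamic parameters; the phone and amount formats are invented for illustration, not taken from any particular tool:

```python
import random

def random_phone():
    """A plausible NANP-style phone number instead of a sequential placeholder."""
    return f"{random.randint(200, 999)}-555-{random.randint(0, 9999):04d}"

def random_amount(low=0.01, high=10000.00):
    """A dollar amount with cents, spanning small and large values."""
    return round(random.uniform(low, high), 2)

# Each test run gets different, realistic-looking inputs, which also
# defeats caches that would otherwise hide performance problems.
params = [(random_phone(), random_amount()) for _ in range(5)]
```

The same values could just as easily be read from a prepared input file; the point is that no two runs exercise identical data.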

It is also important to have in place a procedure to reset databases to pre-test conditions for consistent re-testing.
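As a minimal sketch of what such a reset procedure can look like, here is a Python example using an in-memory SQLite database; the table and seed data are invented for illustration:

```python
import sqlite3

# Hypothetical known-good pre-test data
SEED_ROWS = [(1, "Alice"), (2, "Bob")]

def reset_database(conn):
    """Drop and rebuild the test table so every run starts from the same state."""
    conn.execute("DROP TABLE IF EXISTS customers")
    conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
    conn.executemany("INSERT INTO customers VALUES (?, ?)", SEED_ROWS)
    conn.commit()

conn = sqlite3.connect(":memory:")
reset_database(conn)
conn.execute("INSERT INTO customers VALUES (3, 'Carol')")  # a test mutates data
reset_database(conn)  # back to the seed state for consistent re-testing
row_count = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
```

In a real suite this would typically be a database snapshot restore or a setup/teardown fixture, but the principle is the same: every run starts from identical data.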

3. Know when to run the automated tests.

In the Test-Driven Development (TDD) approach, the tests are written before the code. Even if your development team doesn’t follow TDD, writing tests as soon as the interfaces are created and callable can save hours of manual functional testing. The amount of testing necessary to get from the first stubbed interfaces to ready-for-deployment into an integration environment is almost always underestimated. Not writing the tests as early as possible consistently results in either increased effort during development (manually re-testing functionality as the code changes, versus a button click or one-line command to run the automated tests) or increased time during quality assurance debugging all of the scenarios that were insufficiently tested during development.
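Here is a minimal Python sketch of the write-the-test-first idea; the discount interface and rates are invented for illustration:

```python
# Written first, against the agreed-upon interface. It fails until the
# implementation below exists, then passes on every subsequent run.
def test_gold_customers_get_ten_percent_off():
    assert calculate_discount(100.0, "gold") == 10.0

# Implementation filled in afterward to satisfy the test.
def calculate_discount(order_total, customer_tier):
    percent = {"gold": 10, "silver": 5}.get(customer_tier, 0)
    return order_total * percent / 100

test_gold_customers_get_ten_percent_off()
```

From this point on, re-checking the behavior after every code change is a one-line command instead of a manual walkthrough.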

Even if you balk at automating functional testing, testing before turning code over to QA will save many hours otherwise spent writing defect reports, sitting in triage reviews, debugging, and re-testing. Most true DevOps approaches include testing with every build-and-deploy cycle.

Conclusion: The bottom line is about the bottom line.

The time spent developing and maintaining test automation will deliver a positive ROI by reducing the number of production issues and shortening QA cycles. The time it takes to realize that ROI will vary based on complexity and technical culture, though it is often much sooner than anticipated even with the best application teams.

As with most cases of process automation, automating testing is neither a silver bullet nor a one-size-fits-all solution. The value of real user testing is still valid and necessary, as there will always be use cases that are not anticipated by designers, developers, and testers. By taking the time to determine all use cases that can be automated, selecting the tool that best meets your organization’s needs, and leveraging best practices like using dynamic parameters and testing as early as possible, you stand the best chance of improving productivity, shortening QA cycles, building customer trust, and achieving positive ROI.


Originally published at https://www.logic2020.com/insight/tactical/test-automation-three-things-to-know
