Governance & Best Practices

Hi all,
I am relatively new to Catalytic (I have mostly been using Automation Anywhere so far), and I am looking for advice on best practices for automating workflows and on what counts as good governance. I am currently trying to build a common framework for our department's Catalytic users, so that we do not scale without guidelines and run into trouble later.
Just to clarify, an example of governance & best practice for me would be:
Governance: following a certain task/variable naming convention, using conditional blocks, using a workflow template, etc.
Best practice: implementing an email footer to retrieve the email ID, implementing notes and headlines, etc.
I would really appreciate any input you have; I know practical experience dictates which best practices and governance rules apply when implementing a new tool.
Lastly, does anyone have experience with implementing some sort of error handling in Catalytic workflows?
Thanks a lot in advance!
Best Answer
Hi @Yoana_209021,
Here's a handful of general best practices for building workflows in Catalytic, written by one of our internal builders, @Kevin_129457.
This list is not all-encompassing, so please ask us any specific questions you may have.
Catalytic Building - General Best Practices
Be Aware of System Limits
Build your workflow to operate within the current system limits
- Review the current documentation for Catalytic system limits
- Review documentation for any third party systems
Iteration and Version Control
Make Use of Version Control
- See our help site documentation for version control
- The published version should ALWAYS be free of any defects. Never publish an untested draft
- Make small changes. In other words, publish early and publish often
- Consider adding a changelog to the workflow description before publishing a draft
- Consider having a review process before publishing a draft
Monitoring and Error Handling
Add Logical Breakpoints with Manual Review Tasks
After a set of automated steps executes, consider manually verifying the data with a human task like Assign Task to a Person. This is meant to catch processing errors before passing data to downstream steps.
Common scenarios include:
- Pause to review a table or spreadsheet before starting a batch
- Pause to review a PDF before and after it is processed in OCR steps
- Pause to review the contents of an email before approval email steps
💡 Consider removing the manual review task in a future iteration of the process after testing thoroughly with sufficient data
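In Catalytic this breakpoint is a built-in task rather than code, but the control flow it creates can be sketched in plain Python. Everything here (the OCR step, the field names, the reviewer callback) is hypothetical and only illustrates the pattern of gating downstream steps on a human approval:

```python
def run_ocr(pdf_path):
    # Hypothetical automated step: pretend OCR extraction result
    return {"invoice_total": "1234.56", "source": pdf_path}

def downstream_step(data):
    # Hypothetical downstream processing
    return f"processed {data['source']}"

def pipeline(pdf_path, reviewer_approves):
    # reviewer_approves stands in for the outcome of a manual review task
    extracted = run_ocr(pdf_path)
    if not reviewer_approves(extracted):  # logical breakpoint
        return "halted for correction"
    return downstream_step(extracted)
```

The point is that the downstream step can never run on unreviewed data; removing the breakpoint later is a one-line change once the process has proven itself.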
Naming Conventions: General Guidelines
Use whatever naming convention works for you, but be consistent
Use Pronounceable Names
Make Meaningful Distinctions
How do you differentiate between these tasks at a glance?
Use Intention-Revealing Names
A name should tell you why it exists, what it does, and how it is used. If a name requires an explanation, then the name does not reveal its intent.
Naming a field "d" vs. naming a field "elapsedTimeInDays"
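The same contrast shows up in any language. A small, hypothetical Python example (the dates are made up purely for illustration):

```python
from datetime import date

# Opaque: the reader has to hunt for an explanation of "d"
d = (date(2024, 1, 6) - date(2024, 1, 1)).days

# Intention-revealing: the name says what it is and why it exists
elapsed_time_in_days = (date(2024, 1, 6) - date(2024, 1, 1)).days
```

Both lines compute the same value, but only the second one is readable at a glance six months later.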
Make Workflows/Instances/Fields Easily Searchable and Traceable
- Use the Workflow: Rename this Workflow action to change the default instance name
- Consider including something unique like the ID of the parent instance. This will make debugging easier since you can use the UI to search by the parent and find all related workflows.
Pick One Word per Concept
Use the same concept across the entire workflow
Fetch Value vs. Get Value vs. Retrieve Value
If you named a module that returns data as Fetch Value, use the same concept throughout the process. Using Get Value or Retrieve Value will confuse the reader.
General Things to Avoid
- Don't use initials, abbreviations and codes that are not commonly understood
- Avoid unnecessary repetition and redundancy
- Don't add gratuitous context. Short names are generally better than long ones, as long as they are clear; don't add context a name does not need
Build Testability Into the Process
Have a Boolean "Testing" Flag for Specific Workflow Steps
Have control over which steps you'd like to skip in your workflow for testing purposes.
- Create a True or False type field in your workflow's Instance Fields.
- Add conditions to any actions you may want to include/exclude in your testing.
- Configure your workflow's action conditions to evaluate whether this testing field == true or false.
Use case examples:
When your Testing field == true:
- Skip or execute specific actions within your Workflow to help with debugging
- Use test data instead of "live" data (e.g. a data table with values for testing purposes)
- Skip API actions or actions that rely on external systems
- Output data is sent to a specific recipient/location (add all testing data to a "Test Results" data table, Email results table to the tester)
When your Testing field == false:
- Workflow operates normally, actions built specifically for testing purposes are skipped
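The pattern above can be sketched in plain Python. This is only an illustration of the branching logic, not Catalytic configuration: `testing` stands in for the True/False instance field, and each `if` mirrors an action condition on that field. The step names are invented for the example:

```python
def run_workflow(testing):
    steps = []
    if testing:
        steps.append("load test data table")            # condition: Testing == true
    else:
        steps.append("load live data")                  # condition: Testing == false
        steps.append("call external API")               # skipped entirely while testing
    steps.append("transform data")                      # unconditional: always runs
    if testing:
        steps.append("write results to Test Results table")
    else:
        steps.append("write results to production table")
    return steps
```

Running with `testing=True` exercises the full transform logic without ever touching live data or external systems; flipping the flag to `False` is the only change needed to go live.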
Plan How to Clean Up Test Data
If your workflow is going to generate large amounts of test data, consider automating cleanup with a separate workflow
Build in a way to "reset" your input data
- Example: If your workflow is editing a data table with new values, build in a way to reset that table
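A minimal sketch of the reset idea, assuming a table is just a list of row dictionaries (the invoice rows and field names are hypothetical):

```python
# Baseline rows the workflow expects before each test run (hypothetical data)
baseline_rows = [
    {"invoice_id": "TEST-001", "status": "new"},
    {"invoice_id": "TEST-002", "status": "new"},
]

def reset_table(table):
    # Restore the working table to its known baseline state
    table.clear()
    table.extend(dict(row) for row in baseline_rows)  # copy rows so edits never touch the baseline
    return table

working_table = reset_table([])
working_table[0]["status"] = "processed"  # the workflow mutates rows during a run
reset_table(working_table)                # back to baseline for the next run
```

Copying each row on reset is the important design choice: without it, a test run would corrupt the baseline itself and every later "reset" would inherit the damage.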
Remove test instances from Master Tables
- Build a clean up workflow with the Workflow: Update a completed run to test mode action