Digioh Connect

Digioh Connect is a powerful service that enables complex integrations as a low-code or no-code solution. It can be used to connect your Customer Relationship Management (CRM) platform with your Email Service Provider (ESP) and keep the data in both platforms in sync. For example, if a new contact is created in your CRM, the same contact is created in your ESP automatically, or vice versa.

Connectors can also be used to build custom logic for an integration. For example, we can update a contact in your ESP to be a certain type based on qualifying data in the corresponding CRM contact. Or, we could look up additional related data in your CRM and attach it to the ESP contact. We can also transform the data between the two systems to meet data requirements of those systems or your business requirements.

There are two main modules in Connectors:

– Pipelines

– Jobs

Pipelines:


A pipeline is a set of tasks that are executed in a specific order. Tasks can have conditions that determine whether they should run or not. They can also have sub-tasks that are only run if the parent task runs successfully to completion.
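For example, a simple pipeline might be structured like this (the task names and ordering are illustrative; the available task types are listed below):

    Pipeline: New Lead Sync
      1. Map Data (map incoming form fields to Salesforce field names)
      2. Salesforce Create Object (condition: only runs if Email is not empty)
         2a. Task Notification (sub-task: runs only if the Lead is created successfully)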


Pipeline Tasks:

Each task has two main fields:

  • Name
  • Task Type

– Name:

Set the name according to the task's purpose

– Task Type:

These are the currently available task types. New task types are routinely added.

  • Delay
    • Waits a configured duration before executing the next task in the pipeline
  • Foreach
    • Performs the sub-tasks of the Foreach task for each item in a collection
  • Google Sheets Update
    • Inserts or updates an item in a Google Sheet
  • Http Request
    • Performs an HTTP request to another service. This can use configured integrations for authentication.
  • Iterable API
    • Performs a function against the Iterable API, such as updating a user, tracking a custom event, or updating a catalog item
  • Map Data
    • Maps fields from the input data into different fields in the output data. Can also perform transforms on the data.
  • Merge Data
    • Merges the results of multiple tasks into a single item
  • Render Template
    • Renders a Handlebars, Liquid, or Scriban template
  • Salesforce Create Object
    • Creates an object in Salesforce
  • Salesforce Get Object
    • Retrieves an object by Id in Salesforce
  • Salesforce Query
    • Queries Salesforce for one or more records
  • Salesforce Update Object
    • Updates an object by Id in Salesforce
  • Send to Pipeline
    • Sends data from this pipeline to another pipeline for further processing
  • Task Notification
    • Sends a notification to a supplied list of email addresses
  • Transform Data
    • Transforms data to fix errors or ensure the proper format for the destination.
    • Available transforms:
      • Phone to E.164 format
      • Remove Non Alpha Characters
      • Remove Non Numeric Characters
      • Remove Html Tags
      • Convert to Lower Case
      • Convert to Upper Case
      • Convert to Title Case
      • URL Encode
      • URL Decode
      • HTML Encode
      • HTML Decode
      • Date Time and Seconds (yyyy-MM-dd hh:mm:ss)
      • Sortable Date Time with Offset (yyyy-MM-dd hh:mm:ss ±hh:mm)
      • Date (yyyy-MM-dd)
      • Date Slashes (MM/dd/yyyy)
      • Replace
      • Prepend
      • Append
      • Convert to boolean (true/false)
      • Convert to Integer (123)
      • Convert to Float (123.45)
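For example, here is how a few of these transforms behave (the input values are made up for illustration; the E.164 result assumes a US country code):

    Phone to E.164 format:          "(555) 123-4567"  →  "+15551234567"
    Remove Non Numeric Characters:  "abc-123"         →  "123"
    Convert to Title Case:          "jane doe"        →  "Jane Doe"
    Date (yyyy-MM-dd):              "03/15/2024"      →  "2024-03-15"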


Pipeline Activity: 

Activity shows the number of tasks executed in the last:

  • 6 hours
  • 24 hours
  • 7 days
  • 30 days
  • 90 days


Logs:

In the Activity section, logs show the details of each pipeline item’s execution. Logs are temporary and are not retained permanently. They can be useful for troubleshooting pipeline issues.

Pipeline Configs:

  • Notifications: 

You can enter a comma-separated list of email addresses to which pipeline failure notifications will be sent.

  • Hook URLs:

Hooks are web endpoints that can be used as webhook URLs. They facilitate integration with partners, receiving data in real time to be processed by the pipeline. They can also be used to integrate with Digioh Box submissions, sending form data right into a Digioh Pipeline.
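For example, a partner system could POST JSON data to a hook with a request like the following (the URL is a placeholder; your actual hook URL is generated in the portal, and the field names are illustrative):

    curl -X POST "https://<your-hook-url>" \
      -H "Content-Type: application/json" \
      -d '{"email": "jane.doe@example.com", "first_name": "Jane"}'

The pipeline then receives the JSON payload as its input data.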

Jobs:

Jobs are iterative processes that call into an external service to query for updated data. The data is then sent to the appropriate pipeline for processing. Each time a job runs, the query uses an updated-since timestamp to retrieve only the data that has changed since the last run; for example, if a job with a 15-minute interval last ran at 10:00, the next run retrieves only records modified after 10:00. Jobs can be configured to run at a defined interval from 5 minutes to 24 hours.


Job Types:

There are currently two main job types; more will be added in the future.

  • Salesforce Query
    • This job type is used to query Salesforce using SOQL queries. It is a powerful tool for pulling data into a pipeline
  • Http Request
    • This job type is used to access web-based APIs that take parameters as input to the data query

You can add a new job by clicking the “+ Add Job” button. Choose a name, a job type, and the desired interval for the job.


Click “Edit” next to a job to navigate to the job editor. There is a different editor for each job type.


Salesforce Query Job Editor:

The Salesforce Query Job editor has the following parameters:

  • Name
    • A user defined name for the job
  • Status
    • The job can be set to active or inactive
  • Interval
    • The job will run at this interval, which can be set from 5 minutes to 24 hours
  • Pipeline
    • The pipeline to send records to for processing
  • Check Data Hashes
    • If enabled, this will create a hash of each found record to check against subsequent runs so the same data is not processed more than once.
  • Include Deleted
    • If enabled, this will include records deleted in the Salesforce database in the query. Default is disabled.
  • Max Records
    • The maximum number of records to process for each run of the job. If more than this number of records are found, the job keeps track of the last updated timestamp of the last processed record and starts the next run from there.
  • Connection
    • Choose the integration connection to use for this job.
  • Query
    • This is the SOQL query used to retrieve records from Salesforce. There are two special merge tags you can use in the query to ensure only updated data is retrieved. The [INTERVAL_TIMESTAMP] tag will be replaced with either the last run time of this job or, if the prior run hit the Max Records value, the timestamp of the last processed record. You can also use [INTERVAL_DATE] if you only want the date component of this timestamp. An example query is shown below.
    • We strongly recommend including the SystemModstamp field in your query. If you don’t, the job won’t be able to track the last processed item’s modification timestamp.
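For example, a query like the following (the object and field names are illustrative) retrieves only Leads modified since the last run; ordering by SystemModstamp helps the Max Records checkpointing process records oldest-first:

    SELECT Id, FirstName, LastName, Email, SystemModstamp
    FROM Lead
    WHERE SystemModstamp > [INTERVAL_TIMESTAMP]
    ORDER BY SystemModstamp ASC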

Http Job Editor:

The Http Job editor has the following parameters:

  • Name
    • A user defined name for the job
  • Status
    • The job can be set to active or inactive
  • Interval
    • The job will run at this interval, which can be set from 5 minutes to 24 hours
  • Pipeline
    • The pipeline to send records to for processing
  • Check Data Hashes
    • If enabled, this will create a hash of each found record to check against subsequent runs so the same data is not processed more than once.
  • Include Deleted
    • If enabled, this will include records deleted in the Salesforce database in the query. Default is disabled.
  • Max Records
    • The maximum number of records to process for each run of the job. If more than this number of records are found, the job keeps track of the last updated timestamp of the last processed record and starts the next run from there.
  • Connection
    • Choose the integration connection to use for this job.
  • Request URL
    • The URL endpoint to send the request to
  • Request Method
    • The HTTP method to use for the request (GET, POST, etc.)
  • Request Body Type
    • Set this to the appropriate request body type needed for the URL endpoint
  • Request Body
    • This is the payload of the HTTP request
  • Headers
    • You can add any additional headers required by the URL endpoint
  • Response Body Type
    • This determines how the response will be parsed by the system
  • Single Record
    • If enabled, this tells the system to treat the response as a single record, as opposed to a collection of records
  • Path to Records
    • Assuming a JSON response, this is the JSON path to the collection of records
  • Paging Enabled
    • Some APIs support paging to retrieve data in batches. Enable this option to use paging
  • Page Size
    • The page size to use
  • Page Parameter
    • This is a user-defined merge tag used for paging. The URL, Request Body, and Headers will have this tag replaced by the appropriate page number
  • Page Size Parameter
    • This is a user-defined merge tag used for paging. The URL, Request Body, and Headers will have this tag replaced by the appropriate page size. See the example after this list.
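For example, for an API that pages through contacts (the URL, merge tag names, and response shape below are all illustrative; you define your own merge tags), the job could be configured like this:

    Request URL:          https://api.example.com/contacts?page=[PAGE]&per_page=[PAGE_SIZE]
    Page Parameter:       [PAGE]
    Page Size Parameter:  [PAGE_SIZE]
    Path to Records:      data.contacts

    Sample JSON response:
    {
      "data": {
        "contacts": [
          { "id": 1, "email": "jane.doe@example.com" },
          { "id": 2, "email": "john.smith@example.com" }
        ]
      }
    }

Here, “data.contacts” is the JSON path pointing at the array of records to send to the pipeline.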

You can optionally filter returned records by clicking the “Item Filter” button. Here you can define conditions that must be satisfied for the record to be sent to the configured pipeline. This can be useful if the API does not provide native filtering capabilities.
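For example (the field names, values, and operators are illustrative), you might only forward records where:

    status  equals        "active"
    email   is not empty

Only records satisfying the filter conditions would be sent to the configured pipeline.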

Run Job

You can run a job manually at any time by clicking the “Run” button. A window will pop up asking you to set the [INTERVAL_TIMESTAMP] to a value of your choosing. You will then be directed to the job log detail page.


Job Activity:

You can view job activity from the main page by clicking the “Activity” button next to the job.

You can also see detailed logs about how many items were queued and the queries run (but not the records themselves) by clicking the “Logs” button.


Pipeline Quick Start

Steps for Creating a Pipeline to create Leads in Salesforce

  • Navigate to the Connectors area of the Digioh Portal

  • Click to add a new Pipeline

  • Enter a name for the Pipeline

  • Click Edit next to the pipeline name to navigate to the Pipeline Edit Page where you can add the tasks

  • For this example, we will create a Map Data task.

  • Click Edit on the newly created task

  • Add the desired mappings; you can add new mappings with the add button

  • Next let’s add a task to create a new Lead in Salesforce CRM

  • Configure the Create Object task with the appropriate object name, and the path to the data to use. TaskResults refers to the output of prior tasks, and 10136 refers to the ID of the Map Contact Fields task we created earlier.

  • Your pipeline is now complete. Let’s test it!

  • Enter test data in JSON format into the test input and hit “Run Test” to see the results of your test.

  • You now have a working pipeline! Congratulations! Next, let’s add some basic conditions to your pipeline.
  • Click “+ Condition” on the Create New Lead pipeline task

Conditions are logic statements that result in true or false and control whether a task will run or not. Add a couple of conditions; you can click “+ AND” or “+ OR” to create groups of conditions to be evaluated. Choose the left side as either a field or a value, then the operator, then the right side as either a field or a value.

These conditions ensure that a Lead is only created when an email is present. More complex email validation could be added with more conditions, or by using a regular expression matching condition.
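For example, with an “Email is not empty” style condition in place, a test input like the following (the field names are illustrative) would pass the condition and create the Lead, while a payload with a missing or empty email would be skipped:

    {
      "FirstName": "Jane",
      "LastName": "Doe",
      "Email": "jane.doe@example.com"
    }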