Integration - Primer

The fullcast application connects to transaction systems (CRM, Order Management systems, etc.) that contain your customer data and the other data and metrics you plan with.
Let's look at the relevant topics around this integration.


Contact Us

If you have a particular set of systems or scenarios not listed here, please feel free to contact us. Integration generally requires working with one of our GrowthOps Business Partners, who will provide the necessary guidance.

Data Pipeline

The applications use a data pipeline to move data between various source and destination systems. The configured integration layer moves the data reliably on a powerful Hadoop-based infrastructure.

The data pipeline comes with pre-defined connectors to well-known systems, and we are always adding support for more. Please see the list below of supported source and destination systems:

In addition to the above, the data pipeline can interface with any well-defined RESTful endpoint to extract and deposit data.
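As a minimal sketch of what a RESTful extraction looks like, the snippet below pulls one page of records from a JSON endpoint. The base URL, the `records` response field, and the query parameters are illustrative assumptions, not the pipeline's actual API.

```python
import json
import urllib.request
from urllib.parse import urlencode

# Hypothetical REST source; the endpoint and parameter names are assumptions.
BASE_URL = "https://api.example.com/v1"

def build_extract_url(resource, since=None, page_size=100):
    """Build the GET URL used to pull records from a RESTful source."""
    params = {"limit": page_size}
    if since:
        # Incremental extracts typically filter on a last-modified timestamp.
        params["updated_since"] = since
    return f"{BASE_URL}/{resource}?{urlencode(params)}"

def parse_records(body):
    """Decode a JSON response body into a list of record dicts."""
    return json.loads(body).get("records", [])

def extract(resource, since=None):
    """Fetch one page of records from the source endpoint."""
    req = urllib.request.Request(
        build_extract_url(resource, since),
        headers={"Accept": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_records(resp.read())
```

A real connector would add pagination, retries, and authentication on top of this shape; depositing data is the mirror image, with a POST or PUT carrying a JSON body.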

Custom Integrations

To integrate with any custom data sources you may have, or to build a custom integration, please contact your GrowthOps Business Partner or raise a Support Request.

Cloud vs On-Premise

fullcast is a cloud-based application but can integrate with both cloud-based and on-premise data sources and destinations.

Integrating with cloud data sources in a secure manner may involve some setup; your GrowthOps Business Partner will walk you through it.
Typically this involves OAuth, JWT, SSO, and other configuration to secure access to the data.
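To illustrate one of these mechanisms, the sketch below requests an access token with the standard OAuth 2.0 client-credentials grant. The token URL and scope are placeholders, not the product's actual endpoints; your Business Partner will supply the real values.

```python
import json
import urllib.request
from urllib.parse import urlencode

# Hypothetical OAuth 2.0 token endpoint; URL and scope are placeholders.
TOKEN_URL = "https://auth.example.com/oauth2/token"

def token_request_body(client_id, client_secret, scope="data.read"):
    """Form-encode the standard client_credentials grant parameters."""
    return urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    }).encode()

def fetch_access_token(client_id, client_secret):
    """POST to the token endpoint and return the bearer access token."""
    req = urllib.request.Request(
        TOKEN_URL,
        data=token_request_body(client_id, client_secret),
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["access_token"]
```

The returned token is then sent as an `Authorization: Bearer ...` header on each data request; JWT- or SSO-based setups follow a similar pattern with different credential exchanges.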

For on-premise environments, integration may require additional setup such as firewall rules, application-layer firewalls, etc. Cloud storage platforms such as Google Drive or Microsoft OneDrive can also serve as intermediaries to move data securely.

Ad hoc vs Scheduled

Users can set up the data pipeline to extract or deposit data on a scheduled or ad hoc basis. Scheduled processes typically run from once a day to a few times a day, and the data update schedule can be configured to match the planning scenarios you support.

Planning is typically not a real-time process, so data does not need to move in real time. The data pipeline is therefore a batch process that can be scheduled at a reasonable cadence to update the data.

Users can also extract fullcast application data on a defined schedule or in an ad hoc manner. Sometimes it is necessary to push data from fullcast to a transaction system ad hoc; this is facilitated through the application UI.
