User Guide

Incremental Sync Mode

The default sync behaviour of Data Sync is to load all records from both the source and the target and then compare the data to work out the differences. Once your data becomes very large this can take some time and is no longer ideal.

Incremental Mode provides a solution which is still stateless and can be run again and again without issue.

In Incremental Mode you return only the subset of data you want to check/sync against the target, for example all records created or modified today. Data Sync then loads the matching records from the target based on the key values of the source records. The same Data Compare process then runs and ADD/UPDATE actions are created as necessary.
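
To make the flow concrete, here is a minimal, self-contained sketch of the idea using in-memory dictionaries. It is not the Data Sync engine or its API, just an illustration of the order of operations and of why only ADD and UPDATE actions can be produced.

using System;
using System.Collections.Generic;

// Illustration only: take a changed subset of the source, look at the target
// rows with matching keys, and compare to produce ADD/UPDATE actions.
static class IncrementalSketch
{
    public static void Main()
    {
        // Source subset, e.g. records created/modified today (key -> value).
        var sourceSubset = new Dictionary<int, string>
        {
            { 1, "Alice" },   // identical in target -> no action
            { 2, "Bob v2" },  // different in target -> UPDATE
            { 5, "Eve" }      // missing from target -> ADD
        };

        // Target rows loaded only for the source key values (1, 2, 5).
        var targetByKey = new Dictionary<int, string>
        {
            { 1, "Alice" },
            { 2, "Bob" }
        };

        foreach (var kvp in sourceSubset)
        {
            if (!targetByKey.TryGetValue(kvp.Key, out var targetValue))
                Console.WriteLine($"ADD    {kvp.Key} -> {kvp.Value}");
            else if (targetValue != kvp.Value)
                Console.WriteLine($"UPDATE {kvp.Key} -> {kvp.Value}");
        }

        // Target rows whose keys are not in the source subset are never loaded,
        // which is why DELETE actions cannot be created in this mode.
    }
}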

Limitations

  • Single key column only (no composite keys).
  • DELETE actions cannot be created.
  • The source should return fewer than 100k records.
  • The Data Connector must support Incremental Mode.

Since we are only seeing a subset of the Data, there is no way to work out DELETE actions.

Ideally the source record set should be small(ish), since we need to match the keys on the target and this could otherwise be slow. Data Sync is clever about it, though, so in practice it is very fast: batching and multiple threads are used to request all the matching records in parallel.
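
As a sketch of the batching idea (not how Data Sync is implemented internally), the example below splits the source key values into fixed-size batches and queries the target for each batch in parallel. The FetchTargetBatch method, the batch size of 500 and the four worker threads are assumptions made purely for illustration.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

static class BatchedFetchSketch
{
    // Stand-in for a real batched query, e.g. SELECT ... WHERE KeyColumn IN (@keys).
    static IEnumerable<(int Key, string Value)> FetchTargetBatch(IEnumerable<int> keys) =>
        keys.Select(k => (k, $"target-row-{k}"));

    public static void Main()
    {
        var sourceKeys = Enumerable.Range(1, 5000).ToList();
        const int batchSize = 500; // 5000 keys -> 10 round trips instead of 5000

        // Split the key list into batches of batchSize.
        var batches = sourceKeys
            .Select((key, index) => (key, index))
            .GroupBy(x => x.index / batchSize, x => x.key)
            .ToList();

        var results = new List<(int Key, string Value)>();
        var gate = new object();

        // Issue the batch queries on multiple threads.
        Parallel.ForEach(batches, new ParallelOptions { MaxDegreeOfParallelism = 4 }, batch =>
        {
            var rows = FetchTargetBatch(batch).ToList();
            lock (gate) { results.AddRange(rows); }
        });

        Console.WriteLine($"Fetched {results.Count} target rows in {batches.Count} batches.");
    }
}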

Supported Data Connectors

  • Microsoft SQL Server
  • ODBC Connections
  • OleDb Connection
  • Microsoft Access
  • Microsoft SQL Compact Edition 4.0
  • SQLite
  • Microsoft Dynamics CRM (Entities)
  • Microsoft SharePoint (Client API)
  • Microsoft SharePoint Online (Office 365)
  • Azure Table Storage
  • Amazon S3
  • Amazon SimpleDB
  • Podio Items
  • Simego WebStore List
  • Active Directory V2

Enable Incremental Mode

Set the Project SyncMode to SyncAtoBIncremental, or select Incremental from the Sync Mode selection.

Set Incremental Mode

This changes the behaviour of Data Sync to load only the matching records from the target, based on the Key column selection in the schema map. Because of this, Incremental mode should only be used when:

  • The target data set contains many thousands or millions of records.
  • The source data set is relatively small (fewer than around 20k records).

If the source returns a large data set then the performance of Incremental mode is not optimal, due to the round trips required to the target to retrieve the records. Data Sync does optimize this by using batching and multiple threads where it can.

Recommendations

  • Ensure the Key column in the target is an indexed field.
  • Lookups may still cause issues if the target lookup data source is large; consider using LOOKUPAINCREMENTAL and LOOKUPBINCREMENTAL. However, since these round-trip for each lookup value, they are recommended only when your source contains very few records.
  • Use Project Automation to rewrite your source filter at runtime to keep the source record set small, for example:
public override void Start()
{
	// Limit the CRM source to records modified since the start of today.
	DataSourceA.FetchXmlFilterExpression = DataSourceA.GetFetchFilterXmlForModifiedSince(DateTime.Today);
}

How it could be used

This mode of operation would usually be used with a View or a CRM FetchXML filter expression that limits the source records.
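
For example, with a Dynamics CRM source the filter can also be written as a plain FetchXML fragment instead of calling the GetFetchFilterXmlForModifiedSince helper shown earlier. The fragment below uses the standard FetchXML modifiedon attribute and on-or-after operator; treat it as a sketch and adjust the attribute, operator and value to suit your entity.

public override void Start()
{
	// Sketch: limit the CRM source to records modified on or after a fixed date
	// by assigning a FetchXML filter fragment directly.
	DataSourceA.FetchXmlFilterExpression =
		"<filter type=\"and\">" +
		"<condition attribute=\"modifiedon\" operator=\"on-or-after\" value=\"2024-01-01\" />" +
		"</filter>";
}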

When Incremental mode is used with the DS3 Automation Server (Ouvvi) you can dynamically adjust the source filtering based on a value supplied by the server, which injects a set of properties into the Project Properties collection.

The Auto_LastSuccessfulRun property holds the timestamp, supplied by the DS3 Server, of when the project last ran successfully.

Property Collection

If you combine this with Project Automation you can adjust your data source properties at runtime. For example, to update the Where clause of a SQL provider you might use code along the lines of the example below in Project Automation.

Project Automation
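
As a rough sketch of what that code might look like for a SQL source: the Properties indexer and the SourceFilter property name are assumptions made for illustration only, so check the Project Automation reference and your SQL data source's property window for the actual names.

public override void Start()
{
	// Sketch only: Properties["Auto_LastSuccessfulRun"] and SourceFilter are
	// illustrative names, not confirmed API, so verify them against your own
	// project before use.
	var lastRun = DateTime.Parse(Properties["Auto_LastSuccessfulRun"].ToString());

	// Rewrite the WHERE clause so the source only returns records modified
	// since the last successful run of the project.
	DataSourceA.SourceFilter =
		string.Format("WHERE ModifiedDate >= '{0:yyyy-MM-dd HH:mm:ss}'", lastRun);
}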