
How to sync a schema-changed database using Sync Framework?


I was working on Azure database synchronization for the last month and successfully completed the task. In my case both databases, source and target, are in Azure, but the same approach can also be used to sync from Azure to a local database. Below is the code used to sync two Azure databases.
Step-by-step process of syncing databases using Sync Framework (a minimal usage sketch follows this list):
1. Create connection strings for both the source and target databases.
2. Create a DbSyncScopeDescription object 'myScope' describing our scope.
3. Add the tables that we need to sync to 'myScope'.
4. Run the provisioning process on both the source and target databases.
5. Set the memory allocation for the database providers.
6. Set the application transaction size on the destination provider.
7. Create a SyncOrchestrator object and call its Synchronize() method.
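
For reference, here is a minimal sketch of how these steps might be invoked from a console application. The connection strings, scope name and scope id are placeholders, and Scope is the author's own helper type, assumed here to expose a settable ScopeName property.

// Sketch only: placeholder values, not from the original post.
string sourceConnectionString = "<source database connection string>";
string targetConnectionString = "<target database connection string>";

Scope scopeDetails = new Scope { ScopeName = "MyScope" };  // ScopeName assumed settable
Setup(sourceConnectionString, targetConnectionString, "MyScope", 1, scopeDetails);
// Setup covers steps 1-4 and then calls Sync, which covers steps 5-7.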

// Namespaces required by the code in this post:
// using System;
// using System.Configuration;
// using System.Data.SqlClient;
// using Microsoft.Synchronization;
// using Microsoft.Synchronization.Data;
// using Microsoft.Synchronization.Data.SqlServer;

public static void Setup(string sqlSourceConnectionString,
                         string sqlTargetConnectionString,
                         string scopeName, int scopeid, Scope ScopeDetails)
{
    try
    {
        SqlConnection sqlServerConn = new SqlConnection(sqlSourceConnectionString);
        SqlConnection sqlAzureConn = new SqlConnection(sqlTargetConnectionString);
        DbSyncScopeDescription myScope = new DbSyncScopeDescription(scopeName);

        string connectionstring = ConfigurationManager.AppSettings["ConnectionString"];
        SqlConnection conn = new SqlConnection(connectionstring);

        // Read the names of the tables that belong to this scope
        SqlCommand cmd = new SqlCommand(
            "Select TableName from scheduler_tables where scopeid = " + scopeid, conn);
        conn.Open();
        SqlDataReader myDataReader = cmd.ExecuteReader();
        while (myDataReader.Read())
        {
            // Build a description of each table from the source db and add it to the scope
            DbSyncTableDescription tableDescription =
                SqlSyncDescriptionBuilder.GetDescriptionForTable(
                    Convert.ToString(myDataReader["TableName"]), sqlServerConn);
            myScope.Tables.Add(tableDescription);
        }
        myDataReader.Close();
        conn.Close();

        // Provision the source (SQL Server) database for sync
        SqlSyncScopeProvisioning sqlServerProv = new SqlSyncScopeProvisioning(sqlServerConn, myScope);
        sqlServerProv.CommandTimeout = 60 * 30;
        if (!sqlServerProv.ScopeExists(scopeName))
        {
            // Apply the scope provisioning.
            sqlServerProv.Apply();
        }

        // Provision the target (SQL Azure) database for sync
        SqlSyncScopeProvisioning sqlAzureProv = new SqlSyncScopeProvisioning(sqlAzureConn, myScope);
        sqlAzureProv.CommandTimeout = 60 * 30;
        if (!sqlAzureProv.ScopeExists(scopeName))
        {
            sqlAzureProv.Apply();
        }

        sqlAzureConn.Close();
        sqlServerConn.Close();

        Sync(sqlSourceConnectionString, sqlTargetConnectionString, ScopeDetails);
    }
    catch (Exception)
    {
        throw;
    }
}
public static void Sync(string sqlSourceConnectionString,
                        string sqlTargetConnectionString,
                        Scope ScopeDetails)
{
    try
    {
        SqlConnection sqlServerConn = new SqlConnection(sqlSourceConnectionString);
        SqlConnection sqlAzureConn = new SqlConnection(sqlTargetConnectionString);

        SqlSyncProvider RemoteProvider = new SqlSyncProvider(ScopeDetails.ScopeName, sqlAzureConn);
        SqlSyncProvider LocalProvider = new SqlSyncProvider(ScopeDetails.ScopeName, sqlServerConn);

        // Set memory allocation for the database providers
        RemoteProvider.MemoryDataCacheSize = MemorySize;
        LocalProvider.MemoryDataCacheSize = MemorySize;

        // Set application transaction size on the destination provider
        RemoteProvider.ApplicationTransactionSize = BatchSize;

        // Count transactions as batches of changes are applied on the destination
        RemoteProvider.ChangesApplied +=
            new EventHandler<DbChangesAppliedEventArgs>(RemoteProvider_ChangesApplied);

        // Upload changes from the local (source) provider to the remote (target) provider
        SyncOrchestrator orch = new SyncOrchestrator();
        orch.RemoteProvider = RemoteProvider;
        orch.LocalProvider = LocalProvider;
        orch.Direction = SyncDirectionOrder.Upload;

        string syncdetails = ShowStatistics(orch.Synchronize());

        sqlAzureConn.Close();
        sqlServerConn.Close();
    }
    catch (Exception)
    {
        throw;
    }
}
public static string ShowStatistics(SyncOperationStatistics syncStats)
{
    string message;
    syncStartTime = syncStats.SyncStartTime.ToString();
    message = "\tSync Start Time :" + syncStats.SyncStartTime.ToString();
    syncEndTime = syncStats.SyncEndTime.ToString();
    message = message + "\tSync End Time :" + syncStats.SyncEndTime.ToString();
    message = message + "\tUpload Changes Applied :" + syncStats.UploadChangesApplied.ToString();
    message = message + "\tUpload Changes Failed :" + syncStats.UploadChangesFailed.ToString();
    message = message + "\tUpload Changes Total :" + syncStats.UploadChangesTotal.ToString();
    message = message + "\tDownload Changes Applied :" + syncStats.DownloadChangesApplied.ToString();
    message = message + "\tDownload Changes Failed :" + syncStats.DownloadChangesFailed.ToString();
    message = message + "\tDownload Changes Total :" + syncStats.DownloadChangesTotal.ToString();
    return message;
}
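
The listing above also relies on a few members that are not shown in the post: the MemorySize and BatchSize settings, the syncStartTime and syncEndTime fields written by ShowStatistics, and the RemoteProvider_ChangesApplied handler. A minimal sketch of what they might look like is below; the sizes (in KB) and the transaction counter are my own assumptions, not values from the original code.

// Sketch of the class-level members the listing assumes; values are illustrative only.
static uint MemorySize = 100000;   // provider memory cache size, in KB (assumed value)
static uint BatchSize = 50000;     // destination transaction size, in KB (assumed value)
static string syncStartTime;
static string syncEndTime;
static int transactionCount;

// Raised by the destination provider each time a batch of changes is applied.
static void RemoteProvider_ChangesApplied(object sender, DbChangesAppliedEventArgs e)
{
    transactionCount++;
}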

The above code works fine until a schema change occurs in the source database; once we make any schema change on the source, the sync will fail.
In my case, I added a new column called 'isActive' to the source db and got the error "Invalid column name 'isActive'" when I tried to sync again.
The cause of the issue is that during the first sync, Sync Framework creates a number of tables, stored procedures and triggers to track which rows need to be synced in future and which rows changed after the previous sync. These objects are generated from the schema as it was at provisioning time, so they know nothing about the new column.
So we need to clear all this sync metadata before the framework will pick up schema changes in the source db. For removing it, Microsoft provides the SqlSyncScopeDeprovisioning class and its DeprovisionStore() method. Whenever there is a schema change in the source database, we need to create an object of this class and call this method before applying the provisioning again. Below is the code to deprovision the entire database:

SqlSyncScopeDeprovisioning deprovisioningvar = new SqlSyncScopeDeprovisioning(sqlServerConn);
deprovisioningvar.DeprovisionStore();

Then it works fine! The drawback is that deprovisioning clears all the metadata about the previous sync, so the next sync behaves like a first-time sync and takes a long time; from the second sync onwards there is no need to deprovision again until another schema change occurs.
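
To make the schema-change flow concrete, here is a minimal sketch that deprovisions both databases and then re-runs the normal provisioning and sync path. The HandleSchemaChange name and the choice to deprovision both sides are my own illustration of the approach described above, not code from the original post.

// Sketch only: after a schema change, remove all sync metadata from both databases
// and then re-provision by running Setup again.
public static void HandleSchemaChange(string sqlSourceConnectionString,
                                      string sqlTargetConnectionString,
                                      string scopeName, int scopeid, Scope ScopeDetails)
{
    using (SqlConnection sqlServerConn = new SqlConnection(sqlSourceConnectionString))
    using (SqlConnection sqlAzureConn = new SqlConnection(sqlTargetConnectionString))
    {
        // DeprovisionStore removes every scope and all tracking tables,
        // triggers and stored procedures created by Sync Framework.
        new SqlSyncScopeDeprovisioning(sqlServerConn).DeprovisionStore();
        new SqlSyncScopeDeprovisioning(sqlAzureConn).DeprovisionStore();
    }

    // Re-provision with the current schema and sync again.
    // The next sync behaves like a first-time sync, so expect it to take longer.
    Setup(sqlSourceConnectionString, sqlTargetConnectionString, scopeName, scopeid, ScopeDetails);
}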

  1. May 15, 2012 at 12:55 am | #1

    First, thank you for the great explanation with code on this topic. I wanted to ask if you have had success in syncing schemas without losing data from the client database. I understand how we can deprovision, but I would like a solution without data loss.

  2. May 16, 2012 at 2:27 pm | #2

    Hi,
    Without losing data, we can sync from the Azure environment or from on-premises.
    Microsoft itself provides a new option in the windows.azure.com interface ('Database link'). If you don't have an account in the Azure cloud, let me know and I will help you with the steps.

