Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.

Recently I was asked a question by my Customer's Management during our cut-over from Primary to Standby in a warm standby production environment.  They asked me to ensure that the data was the same and that the number of stored procedures matched exactly from the Primary to the Standby.

The time constraint for doing this was under 10 minutes.  They mentioned nothing about re-syncing strategies; they just wanted a report that gives them the assurance that the data items (tables and procs) are in sync.  It was a legitimate question posed by Business Users who are unfamiliar with the underlying technology of the SAP SRS (Replication Server) product.

Typically we assume that if replication is up and running with no issues, then the data on the Standby side must match the Primary side.  When we architect a warm standby environment we try to ensure that Database Clients do not connect to the Standby side and make changes.  If your environment does not have these checks and balances, or your business regulations make such verification a fundamental need, then a scheme for checking data assurance is recommended.

Here are three options for checking data assurance:

  1. Quick and easy SQL check
  2. Using the rs_subcmp tool
  3. SRS Data Assurance option

Quick and easy SQL check

At some point in your DBA life you have probably written SQL statements that do just this.  Shown below is a simple statement that produces a number of select statements which, in turn, are run to check that the objects and record counts are the same:

    -- run this and then cut and paste the output to run for the final result
    set nocount on   -- tidy up the output
    go
    print 'use <your database>'
    print 'go'
    select "select count(*) from " + convert(varchar,name) + char(10) + "go" from sysobjects where type = "U" order by name
    go

The runnable output would look like this:

    use <your database>
    go
    -----------------------------
    select count(*) from <table1>
    go
    select count(*) from <table2>
    go

We run this command in both the Active and Standby ASE Servers and use a "diff" command to compute the differences.  While this does the job on a certain level, you may have correctly surmised that it does not go deep enough: a row is a row, but that row could be filled with garbage and this simple routine would not pick that up.  We could go further and write in-depth SQL that checks the data itself; however, let's not re-invent the wheel when we have tools readily available.  Which brings us to option 2: using the rs_subcmp tool.
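The generate-run-diff cycle above can be scripted end to end.  The sketch below is a minimal illustration only: it assumes isql is on the path, and the server names (PRIMARY_ASE, STANDBY_ASE), database, login, and password variable are all placeholders, not values from any real configuration.

```shell
#!/bin/sh
# Sketch only: all server, database, and login names are placeholders.

# 1. Generate the per-table "select count(*)" batch once, on the primary.
isql -Usa -P"$SA_PWD" -SPRIMARY_ASE -b -o counts.sql <<'EOF'
use your_database
go
set nocount on
go
print 'use your_database'
print 'go'
select "select count(*) from " + convert(varchar,name) + char(10) + "go"
from sysobjects where type = "U" order by name
go
EOF

# 2. Run the generated batch against both servers.
isql -Usa -P"$SA_PWD" -SPRIMARY_ASE -b -i counts.sql -o primary.counts
isql -Usa -P"$SA_PWD" -SSTANDBY_ASE -b -i counts.sql -o standby.counts

# 3. Any diff output flags a table whose row count differs.
diff primary.counts standby.counts && echo "row counts match"
```

Because the table list is generated from sysobjects on the primary, a table that exists only on the standby would not be counted; the diff catches missing tables on the standby side, since the count statement simply errors there.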

Using the rs_subcmp tool

All SAP SRS installations come with the rs_subcmp (named subcmp on Windows) client executable, which compares row counts and contents between two SAP ASE Servers: one designated as the Primary (the master copy of the data) and one designated as the Standby.  It operates on a table-by-table basis.  A myriad of scripts exist that will build command-line-runnable rs_subcmp scripts from your existing databases.  This was the tool I used when my Customer wanted to determine the strength of replication, that is, whether the data on the Standby matched the Primary.  Total running time for multiple databases holding a Terabyte of data was 5 minutes; very fast.  The rs_subcmp tool is typically used for user table checks and not system-level table checks.
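To illustrate, here is a sketch of an rs_subcmp configuration file for a single table.  The parameter names (PDS, RDS, PSELECT, and so on) follow the rs_subcmp documentation, but every server, database, table, key column, and login value below is a placeholder; treat it as a template, not a working configuration.

```
PDS = PRIMARY_ASE
RDS = STANDBY_ASE
PDB = your_database
RDB = your_database
PTABLE = your_table
RTABLE = your_table
PSELECT = select * from your_table order by row_id
RSELECT = select * from your_table order by row_id
PUSER = sa
RUSER = sa
PPWD = your_password
RPWD = your_password
KEY = row_id
VISUAL = Y
RECONCILE = N
NUM_TRIES = 3
WAIT = 10
```

The file is then passed on the command line, e.g. `rs_subcmp -f compare_your_table.cfg`.  VISUAL = Y with RECONCILE = N reports differences without touching the replicate, and NUM_TRIES/WAIT give in-flight transactions a chance to drain from the replication queue before a row is declared out of sync.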

Under no circumstances would one use rs_subcmp to reconcile any system tables in the SAP ASE databases.  I felt relatively confident that the procedures were 100% in sync, as the Standby was derived from a dump of the Primary.  While it might be possible to run a subcmp "visual check only" routine on the syscomments table where the procedure code is stored, personally I have not heard of a case where this has been done.  Perhaps a better method of deriving the procedure code is to use defncopy, or even PowerDesigner, and run the same "diff" command featured earlier.  At some point the effort involved might outweigh the final outcome; to reduce this effort and automate the checks we present option 3: SRS Data Assurance.
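If you do want a quick procedure-level check, defncopy feeds the same "diff" approach nicely.  A minimal sketch, again with placeholder server names, login, database, and procedure name:

```shell
#!/bin/sh
# Sketch only: all names below are placeholders for your environment.
# Extract the stored-procedure source from each server...
defncopy -Usa -P"$SA_PWD" -SPRIMARY_ASE out primary_proc.sql your_database dbo.your_proc
defncopy -Usa -P"$SA_PWD" -SSTANDBY_ASE out standby_proc.sql your_database dbo.your_proc
# ...and diff the two definitions; no output means the source matches.
diff primary_proc.sql standby_proc.sql && echo "procedure text matches"
```

For a full inventory you would loop over the procedure names (type = "P" in sysobjects) rather than naming each one by hand.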

SRS Data Assurance

This licensed Replication Server option compares data and schema between the primary and replicate SAP ASE Servers and reports any differences found.  An important point: this option allows us to compare the two SAP ASE Servers during active replication.  We call this type of comparison a "wait and retry" method, where any discrepancies are retried later to make sure they were not the result of a record still sitting in the replication queue.  The comparison jobs are configurable with respect to the data being compared, the reports being generated, and monitoring.  We can run jobs in real time or schedule them to run at a specific time.  Here is an example of a sample Data Assurance (DA) architecture.

Reference: http://infocenter.sybase.com/help/topic/com.sybase.infocenter.dc01637.1571/doc/pdf/da_ig.pdf

The workhorse Server is named the DA Server.

We also need DA Agents to connect to the SAP ASE Servers if the SAP ASE Servers are physically remote from the DA Server (which will most likely be the case).  In a nutshell, the roles and responsibilities of the DA Server are:

  • Compares data rows and database schemas
  • Creates and schedules jobs, checks job status and generates reports
  • Creates reconciliation scripts
  • Creates Data Manipulation Language (DML) commands for automatic reconciliation

The last two items are very important: reconciliation scripts combined with DML commands for automatic reconciliation represent a real savings in effort for us.

Obviously one has to question why records are getting out of sync in the first place.  For that we need to look elsewhere than the Replication Server itself.  Lost records are typically due to a poorly implemented architecture: for example, Client applications sneaking transactions in on the replicate side.  A simple fix for this architecture would be to ensure "dbo use only" is enforced on any replicate databases.
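As a sketch of that fix (server, database, and login are placeholders; sp_dboption is the standard ASE procedure for setting database options):

```shell
#!/bin/sh
# Sketch only: placeholder server, database, and login.
isql -Usa -P"$SA_PWD" -SSTANDBY_ASE <<'EOF'
use master
go
-- restrict the replicate database to the database owner
sp_dboption your_database, "dbo use only", true
go
use your_database
go
checkpoint
go
EOF
```

Replication itself keeps working provided the Replication Server maintenance user maps to dbo in the replicate database, which is the usual warm standby arrangement; everyone else is locked out.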

This Field Notes Series is dedicated to observations from the field, taken from personal consulting experiences.  The examples used have been created for this blog and do not reflect any existing SAP Customer configuration.