Human Capital Management Blog Posts by Members
Berg_Song

This post shares a practical integration case on SuccessFactors MDF Payroll Result (Pay-slip) Integration. It covers how to design MDF objects to store pay-slip data from third-party payroll systems, the integration architecture, and global payroll result analysis design, along with key considerations and limitations. While it appears simple, complexity arises with large data volumes, multiple payroll systems, and diverse data structures.

Background

In this case, the customer is a multinational enterprise using SuccessFactors as a global HRIS, but with different payroll systems per region. As payroll data resides in local systems, the headquarters lacked consolidated insight into workforce costs, limiting analytics and planning. Therefore, centralizing payroll data became essential.

  • Global HRIS: SAP SuccessFactors (Employee Master Data Source)

  • Third-party payroll systems:

    • ADP (US, Mexico)
    • BIPO (APAC: Singapore, Thailand, Vietnam)
    • Payroll B (Germany)

The flow is bidirectional — employee master data moves from SAP SuccessFactors to the payroll systems, and pay-slip data flows back into SuccessFactors.

Integration Center / SFTP acts as the middleware.

Pay results land in MDF objects (PayResultParent, PayResultItem) and feed into Story / People Analytics for HQ insights.


Business Requirement Analysis

The table below summarizes the requirements, reasoning, implementation approach, and complexity level.

| Requirement | Category | Reasoning | How to Achieve | Difficulty (1–5) | Notes |
| --- | --- | --- | --- | --- | --- |
| UI must be user-friendly; HR and employees can view pay slips in Employee Profile | MDF Design | Multiple records per month, with history & audit trail | Use an effective-dated parent-child MDF entity (version history = completed) | 2 | |
| Non-eligible allowance/pay result items must be deleted in the relevant month | Integration | Prevent inheritance of obsolete records | Use the API deletion method: purgeType=record for the parent MDF; purgeType=full for the child MDF in Integration Center | 5 | Avoid pushing zero-amount or invalid records; they create excessive invalid data. |
| Analysis should support both local and global pay result items | MDF Design | Mapping between local and global pay result items is needed | Use MDF (not Picklist); create custom attributes for analysis | 3 | |
| Pay amounts should support analysis by month, BU, department, job grade, etc. | MDF Design | Requires linking MDF with employee master data | Parent MDF externalCode field type = User | 2 | |
| Integration should support re-updates (back pay or corrections) | MDF Design | externalCode should be meaningful for referencing | Use the local payroll record ID or a rule-generated code (e.g., userId+effectiveDate+allowance) | 4 | SuccessFactors External Code Design Best Practice |

MDF Table Design

| Table | Field Name | Field Type | Notes |
| --- | --- | --- | --- |
| cust_PayResultParent | applicant | User | Automatically populated |
| cust_PayResultParent | effectiveDate | Date | |
| cust_PayResultItem | externalCode | String | Use the local payroll record ID for cross-system lookup; if null, auto-generate by concatenating UserId, AllowanceCode, PayrollDate. |
| cust_PayResultItem | currency | GO | Use the standard SF currency MDF |
| cust_PayResultItem | localPayResultItem | GO | |
| cust_PayResultItem | amount | Decimal | Use decimal for payroll amounts |
| cust_PayResultItem | payEffectiveDate | Date | Supports back pay scenarios (e.g., overtime in July paid in September). |
| cust_PayResultItem | amountConversion | Decimal | Auto-calculated by a business rule referencing the exchange rate |
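The externalCode fallback described above (use the local payroll record ID when available, otherwise concatenate UserId, AllowanceCode, and PayrollDate) can be sketched as follows. The function name and separator are illustrative assumptions, not part of the actual implementation.

```python
from datetime import date

def build_external_code(local_record_id, user_id, allowance_code, payroll_date):
    """Build the child-MDF externalCode: prefer the local payroll record ID;
    otherwise generate a deterministic code so that re-sent corrections
    (back pay) upsert the same record instead of creating duplicates."""
    if local_record_id:
        return str(local_record_id)
    # Hypothetical rule: UserId + AllowanceCode + PayrollDate (YYYYMMDD).
    return f"{user_id}_{allowance_code}_{payroll_date.strftime('%Y%m%d')}"

# Example: no local record ID, so the code is rule-generated.
print(build_external_code(None, "Berg01", "CAR_ALLOW", date(2025, 9, 25)))
# Berg01_CAR_ALLOW_20250925
```

Because the generated code is deterministic, a corrected pay line sent later resolves to the same externalCode and updates the existing record.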

Not Recommended Design: using a single MDF for a multi-country company with a large number of actual pay components is neither scalable nor analysis-friendly.

| Table | Field Name | Type | Notes |
| --- | --- | --- | --- |
| cust_PayResult | applicant | User | Automatically populated |
| cust_PayResult | effectiveDate | Date | |
| cust_PayResult | baseSalaryAmount | Decimal | |
| cust_PayResult | carAllowanceAmount | Decimal | Too many predefined fields required |
| cust_PayResult | currency | GO | |

Berg_Song_0-1759919831336.png


Integration Design

Payroll data can be integrated via API or SFTP.
SFTP is generally recommended because it is auditable (via stored flat files), simple in structure, and suitable for low-frequency data transfers (weekly or monthly).

1. API Integration

If API integration is required for security or other reasons, the API-based method is introduced here as well. In this method, avoid inserting parent and child MDF records in one request. Use two separate upsert requests, one for the parent and another for the child, for clarity, completeness, and broader parameter support. Below is an example of a single payload request (parent with nested child records):

{
  "__metadata": {
    "uri": "https://api4preview.sapsf.com/odata/v2/PaymentInformationV3",
    "type": "SFOData.PaymentInformationV3"
  },
  "effectiveStartDate": "/Date(1753432694289)/",
  "worker": "Berg01",
  "toPaymentInformationDetailV3": {
    "results": [
      {
        "__metadata": {
          "uri": "https://api4preview.sapsf.com/odata/v2/PaymentInformationDetailV3",
          "type": "SFOData.PaymentInformationDetailV3"
        },
        "PaymentInformationV3_effectiveStartDate": "/Date(1753432694289)/",
        "PaymentInformationV3_worker": "Berg01",
        "paySequence": "0",
        "cust_bank": "BBVA"
      },
      {
        "__metadata": {
          "uri": "https://api4preview.sapsf.com/odata/v2/PaymentInformationDetailV3",
          "type": "SFOData.PaymentInformationDetailV3"
        },
        "PaymentInformationV3_effectiveStartDate": "/Date(1753432694289)/",
        "PaymentInformationV3_worker": "Berg01",
        "paySequence": "0",
        "cust_bank": "CHNN"
      }
    ]
  }
}
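The recommended two-request pattern can be sketched as below. The entity names (cust_PayResultParent, cust_PayResultItem) and custom field names are hypothetical examples from the MDF design above, and the host is the sample tenant from the payload; the purgeType values follow the requirements table (record for the parent, full for the child).

```python
import json
from urllib.parse import urlencode

BASE = "https://api4preview.sapsf.com/odata/v2"  # example host from the payload above

def build_upsert(records, purge_type):
    """Build (url, body) for one $upsert call; the target MDF entity is
    named in each record's __metadata.uri. Send the parent request first,
    then the child request, as two separate calls."""
    url = f"{BASE}/upsert?{urlencode({'purgeType': purge_type})}"
    return url, json.dumps(records)

# Request 1: parent MDF with purgeType=record (incremental replace of the record).
parent_url, parent_body = build_upsert(
    [{"__metadata": {"uri": "cust_PayResultParent"},
      "cust_applicant": "Berg01",
      "effectiveStartDate": "/Date(1753432694289)/"}],
    purge_type="record")

# Request 2: child MDF with purgeType=full (clears obsolete allowance items).
child_url, child_body = build_upsert(
    [{"__metadata": {"uri": "cust_PayResultItem"},
      "externalCode": "Berg01_CAR_ALLOW_20250925",
      "cust_amount": "350.00"}],
    purge_type="full")

print(parent_url)  # ends with upsert?purgeType=record
```

Each tuple would then be POSTed with the tenant's OAuth credentials; only the payload-building step is shown here.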

2. Integration Center (SFTP)

Parent–child MDF integration cannot be handled in a single job. The table below compares purge-type behavior across the different operation environments for reference:

| Import and Export Data | Integration Center | OData API | Result |
| --- | --- | --- | --- |
| Parent Object = Full Purge | Parent–Child payload in one job (Full Purge) | Parent–Child payload in one API call (Full Purge) | All existing effective records cleared and the payload data inserted |
| Parent = Incremental, Child = Incremental | Parent = Incremental, Child = Full | purgeType=record (parent MDF upsert API endpoint), purgeType=full (child MDF upsert API endpoint) | Old records cleared in the new effective record, new data inserted in the new effective record |
| | Parent = Incremental, Child = Incremental | purgeType=incremental | Previous records inherited in the new effective record |

Although two jobs are required, the third-party system can still produce a single file: overlapping fields exist between the parent and child MDFs, so both SuccessFactors integration jobs can consume the same file.
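The split of one flat file into a parent job and a child job can be sketched as follows. The column names and sample rows are hypothetical; the point is that the parent records are deduplicated per employee and pay date, while each allowance line becomes one child record.

```python
import csv, io

# Hypothetical single pay-slip file from the third-party payroll system;
# userId and payDate are the fields shared by parent and child MDFs.
RAW = """userId,payDate,currency,allowanceCode,amount
Berg01,2025-09-25,USD,BASE_SALARY,5000.00
Berg01,2025-09-25,USD,CAR_ALLOW,350.00
"""

parents, items = {}, []
for row in csv.DictReader(io.StringIO(RAW)):
    key = (row["userId"], row["payDate"])
    # One parent record per employee and pay date (deduplicated)...
    parents[key] = {"externalCode": row["userId"],
                    "effectiveDate": row["payDate"]}
    # ...and one child record per allowance line.
    items.append({"externalCode": f'{row["userId"]}_{row["allowanceCode"]}_{row["payDate"]}',
                  "currency": row["currency"],
                  "amount": row["amount"]})

print(len(parents), len(items))  # 1 2
```

The parent job would then read only the deduplicated columns and the child job the line-level columns, both from the same source file.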

In addition, it is recommended to set the source page size to 1,000 for stability and efficiency.

Incorrect Setup (Single Job):

Berg_Song_1-1759919880786.png

Correct Setup:

1. Parent Job:

Berg_Song_2-1759919912430.png

2. Child Job:

Berg_Song_3-1759919963320.png


Story and Analysis

Currency Exchange:
Avoid calculating via Story Concatenation Columns — large datasets exceed system limits.
Refer to Guardrails for People Analytics Story and SAP Note 3025688.
Each query supports:

  • Up to 120 columns
  • 30 tables
  • 1,000,000 cells per query

Monthly Aggregation:
To analyze sums per month, use a calculated column (Payroll Month) to convert MM/DD/YYYY to YYYY-MM with the Year() and Month() functions.
This handles varying pay frequencies (e.g., bi-weekly in the US vs. monthly in Singapore) efficiently via pivot aggregation.
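The calculated-column logic can be sketched outside Story as follows: each pay date is normalized to a YYYY-MM key and amounts are summed per month, which is how two bi-weekly US runs and one monthly Singapore run collapse into one monthly figure. The sample rows are hypothetical.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical rows: two bi-weekly US pay runs and one monthly Singapore run.
rows = [("09/05/2025", 2500.00), ("09/19/2025", 2500.00), ("09/25/2025", 4800.00)]

totals = defaultdict(float)
for pay_date, amount in rows:
    # Equivalent of the Story calculated column: MM/DD/YYYY -> YYYY-MM.
    month = datetime.strptime(pay_date, "%m/%d/%Y").strftime("%Y-%m")
    totals[month] += amount

print(totals["2025-09"])  # 9800.0
```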

Historical Record Retrieval:
Use Time Filter for all historical MDF data but “Today” filters for navigation tables (Job Info, Personal Info, etc.) to avoid duplication and incorrect joins.

Berg_Song_4-1759920043294.png


Summary

This project demonstrates a complex yet practical scenario integrating MDF, API, Integration Center, and Story Reporting. The challenges stemmed from data volume, multiple payroll systems, and data-sensitivity concerns. The full implementation took me approximately two months, so I hope this guide shortens your learning curve for similar integration requirements.
For detailed questions or discussion, feel free to comment below or contact me on LinkedIn.