Human Capital Management Blogs by SAP
Get insider info on SAP SuccessFactors HCM suite for core HR and payroll, time and attendance, talent management, employee experience management, and more in this SAP blog.
Product and Topic Expert
Greetings Solution Architects and ABAP Developers!

When I presented the program described below to my manager – a program designed to transfer data from ECP to EC – her remark was, "You're swimming upstream." That comment struck a chord. Today, I invite you to embark on a technical journey that echoes that feeling. We'll explore the process of "upserting" Non-Recurring Pay Components in EC. Even though the process is straightforward, it can often feel counterintuitive as we're moving data from our well-understood downstream systems to their upstream counterparts.

It's worth noting that while our focus is on a specific use case, the underlying logic can be adapted to update various EC Employee data points. Whether it's updating the Payroll ID in the EC Employee Profile Compensation Portlet, the PayScale structure in the Job information of EC, or the Work Schedule in the Job Information of EC, the methodology remains robust.

Lastly, while the journey ahead is charted in ABAP, the principles are transferable. Whether you're operating in another programming language or using tools like Postman (shown below), the roadmap to successful upserting remains consistent. I am assuming you are familiar with OData APIs; if not, please read SAP SF OData APIs (V2).

In this guide, we'll illustrate how to achieve our objectives using an ABAP program. Note that this ABAP code is part of a larger program, so I am presenting only the upsert process; you can refine and enhance it with proper error handling based on your specific needs. To provide a clearer understanding of our approach, here's an overview of our data flow:

The customer has a distinct compensation planning process that demands a robust program to determine employee bonuses. Given specific dependencies, ECP was selected as the platform for these calculations. The challenge we encountered was ensuring that EC received the latest bonus data, which would subsequently update ECP. This procedure is divided into three distinct steps:

Step 1: Our primary topic of discussion.
Step 2: A standard PTP replication between EC and ECP.
Step 3: The usual payroll run process.
For this discussion, we'll delve deeply into Step 1, but it's vital to see how it fits into the broader process.

Data Declarations:

A set of data variables and constants are defined for handling various parts of the request and response cycle, such as:

  • gv_uri: The URI for the OData request.

  • gv_payload: The payload for the HTTP request.

  • Data and string types for dates and other values.

  • JSON-related data types and references for handling JSON responses.

DATA: gv_uri                     TYPE string,
      gv_payload                 TYPE string,
      go_client                  TYPE REF TO if_http_client,
      ldate                      TYPE d,
      sdate                      TYPE d,
      edate                      TYPE d,
      date_str                   TYPE string,
      sdate_str                  TYPE string,
      edate_str                  TYPE string,
      sum_tabix                  TYPE i,
      str_bonus                  TYPE string,
      http_client_instance       TYPE REF TO if_http_client,
      request_payload            TYPE string,
      response_data              TYPE string,
      json_data_root             TYPE REF TO data, " /ui2/cl_json=>generate returns a data reference
      json_data_node             TYPE REF TO data,
      lv_seqnr                   TYPE zuspy_dg_sb_log-seqnr,
      field_ref                  TYPE REF TO data,
      val_ref                    TYPE REF TO data,
      lv_abap_date               TYPE d,
      lv_date_in_milliseconds    TYPE n LENGTH 13,
      str_date                   TYPE string,
      lv_result_string           TYPE string,
      lv_seconds_since_epoch     TYPE p DECIMALS 0,
      c_epoch                    TYPE d VALUE '19700101',
      c_seconds_in_a_day         TYPE i VALUE 86400,
      c_milliseconds_in_a_second TYPE i VALUE 1000.

DATA: BEGIN OF json_parsed_data,
        editstatus    TYPE string,
        httpcode      TYPE string,
        index         TYPE string,
        inlineresults TYPE string,
        key           TYPE string,
        message       TYPE string,
        status        TYPE string,
      END OF json_parsed_data.

FIELD-SYMBOLS: <root_data>    TYPE any,
               <child_data>   TYPE any,
               <node_data>    TYPE ANY TABLE,
               <node_element> TYPE REF TO data,
               <element_data> TYPE any.

HTTP Client Configuration:

  • Create HTTP destination in SM59 for EC.

  • Create the http client in our ABAP program using cl_http_client=>create_by_destination method - it creates an HTTP client "go_client" based on a destination configuration.

  • If there's an issue during the client creation, the program exits early.

clear: gv_uri, gv_payload.
call method cl_http_client=>create_by_destination
  exporting
    destination              = 'ECP_PTP_110' "SM59 HTTP Destination for EC
  importing
    client                   = go_client
  exceptions
    argument_not_found       = 1
    destination_not_found    = 2
    destination_no_authority = 3
    plugin_not_active        = 4
    internal_error           = 5
    others                   = 6.
if sy-subrc <> 0.
  return. " exit early if the HTTP client could not be created
endif.


Building the URI:

gv_uri holds the OData API URI assigned to the upsert operation. For more detailed information about edit operations, refer to the documentation.
  gv_uri = '/odata/v2/upsert?workflowConfirmed=false&$format=json'.

The query parameter "workflowConfirmed" determines whether a workflow in EC that's typically triggered after that specific EC object is updated should be suppressed or triggered. Set its value accordingly based on your requirements. For more detailed information, refer to the documentation.

Data Processing:

When updating records, especially if you have multiple entries to deal with, there are two predominant strategies to consider:

  • Loop Through Records Individually: This strategy involves processing each record one at a time. This approach is what's used in the program we're discussing. While it may seem labor-intensive, it's actually presented this way to make the process more understandable for readers.

  • Batch Operations: Another approach is to accumulate all records and update EC in one batch. I'll dive deeper into this strategy, particularly how to leverage the appropriate URI for such operations, in one of my upcoming blogs. In the meantime, you can get more information about it at OData API Batch Operations.

For the current use case, I've used an internal table filled with bonus amounts that require updates. Here's how we process each data record:

  • Date Conversion: The dates in our records need to be in a specific format ("/Date(1614643200000)/"). We achieve this using the convert_date subroutine (more about the subroutine at the end of this blog).

  • Payload Construction: We build the payload for the upsert operation using string concatenation, forming the variable gv_payload. This payload contains metadata and several key attributes like payDate, userId, and others.

  • Endpoint Definition: One critical aspect to note is the endpoint specified in the gv_payload. The endpoint is essentially the pointer determining which Employee profile portlet is being updated. In our scenario, we're targeting EmpPayCompNonRecurring.

  loop at it_dg_rptdata_d_summary into wa_dg_rptdata_d_summary
       where remaining_bonus is not initial.
    move-corresponding wa_dg_rptdata_d_summary to wa_zuspy_dg_sb_log.
    sum_tabix = sy-tabix.
    perform convert_date using p_date changing date_str.
    perform convert_date using wa_dg_rptdata_d_summary-inv_begda changing sdate_str.
    perform convert_date using wa_dg_rptdata_d_summary-inv_endda changing edate_str.
    clear str_bonus.
    str_bonus = wa_dg_rptdata_d_summary-remaining_bonus.
    concatenate '{"__metadata":{"uri": "EmpPayCompNonRecurring","type": "SFOData.EmpPayCompNonRecurring"},'
                '"payDate": "' date_str '",'
                '"userId": "' wa_dg_rptdata_d_summary-ecid '",'
                '"payComponentCode": "' p_lgart '",'
                '"value": "' str_bonus '",'
                '"currencyCode": "USD",'
                '"notes": "' wa_dg_rptdata_d_summary-btrtl '",'
                '"nonRecurringPayPeriodStartDate":"' sdate_str '",'
                '"nonRecurringPayPeriodEndDate":"' edate_str '" }'
           into gv_payload.
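For readers outside ABAP, the same payload can be sketched in Python. The record values below are hypothetical stand-ins for the internal table fields; the "__metadata" block and attribute names mirror the gv_payload string above.

```python
import json

# Hypothetical record values; the /Date(...)/ literals come from the
# date conversion described above (midnight UTC in epoch milliseconds).
record = {
    "ecid": "100042",
    "pay_component": "BONUS",
    "remaining_bonus": "1250.00",
    "btrtl": "0001",
    "pay_date": "/Date(1614643200000)/",
    "begda": "/Date(1609459200000)/",
    "endda": "/Date(1640908800000)/",
}

# Mirrors the gv_payload string concatenated in the ABAP loop above
payload = {
    "__metadata": {
        "uri": "EmpPayCompNonRecurring",
        "type": "SFOData.EmpPayCompNonRecurring",
    },
    "payDate": record["pay_date"],
    "userId": record["ecid"],
    "payComponentCode": record["pay_component"],
    "value": record["remaining_bonus"],
    "currencyCode": "USD",
    "notes": record["btrtl"],
    "nonRecurringPayPeriodStartDate": record["begda"],
    "nonRecurringPayPeriodEndDate": record["endda"],
}

gv_payload = json.dumps(payload)
print(gv_payload[:60])
```

Building the payload as a dictionary and serializing it avoids the quoting mistakes that hand-concatenated JSON invites.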



Upsertable Field:

Can every field of this endpoint be updated using an upsert operation? Not necessarily. If you examine the metadata of the OData API, you'll encounter an attribute named "upsertable". If its value is true, it indicates that the field can be updated when included in the gv_payload string.
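One quick way to check is to fetch the entity's $metadata document and inspect the upsertable attribute on each property. The sketch below runs against a hand-made metadata fragment (the property names and the trimmed schema are illustrative, not the full SuccessFactors metadata document):

```python
import xml.etree.ElementTree as ET

# Hand-made fragment in the shape of an OData V2 $metadata document;
# the real one comes from the entity's $metadata endpoint.
metadata_xml = """
<Schema xmlns="http://schemas.microsoft.com/ado/2008/09/edm"
        xmlns:sap="http://www.successfactors.com/edm/sap">
  <EntityType Name="EmpPayCompNonRecurring">
    <Property Name="value" Type="Edm.Decimal" sap:upsertable="true"/>
    <Property Name="createdOn" Type="Edm.DateTime" sap:upsertable="false"/>
  </EntityType>
</Schema>
"""

EDM = "{http://schemas.microsoft.com/ado/2008/09/edm}"
SAP = "{http://www.successfactors.com/edm/sap}"

def upsertable_fields(xml_text):
    """Return the property names flagged upsertable='true'."""
    root = ET.fromstring(xml_text)
    return [
        prop.get("Name")
        for prop in root.iter(EDM + "Property")
        if prop.get(SAP + "upsertable") == "true"
    ]

print(upsertable_fields(metadata_xml))  # only these belong in gv_payload
```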

HTTP Request Setup and Execution

  • The HTTP request method is set to 'POST'.

  • The content type for the request is set to 'application/json'.

  • The payload (request data) is set using the set_cdata method.

  • The request is sent using the http_client_instance->send method.

  • The response is received and stored in response_data.

http_client_instance->request->set_method( 'POST' ).
http_client_instance->request->set_content_type( 'application/json' ).
http_client_instance->request->set_cdata( request_payload ).

" Sending the request
http_client_instance->send(
  exceptions
    http_communication_failure = 1
    http_invalid_state         = 2 ).

" Receiving the response
http_client_instance->receive(
  exceptions
    http_communication_failure = 1
    http_invalid_state         = 2
    http_processing_failed     = 3 ).

CLEAR response_data.
response_data = http_client_instance->response->get_cdata( ).
CLEAR json_data_root.
json_data_root = /ui2/cl_json=>generate( json = response_data ).
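Outside ABAP, the same request setup can be sketched with Python's standard library (the hostname is a placeholder; building the Request object without sending it lets us inspect what would go on the wire):

```python
from urllib.request import Request

BASE_URL = "https://api.example.successfactors.com"  # placeholder hostname
URI = "/odata/v2/upsert?workflowConfirmed=false&$format=json"

payload = b'{"__metadata":{"uri": "EmpPayCompNonRecurring","type": "SFOData.EmpPayCompNonRecurring"},"userId": "100042"}'

# Mirrors set_method / set_content_type / set_cdata in the ABAP above
req = Request(
    url=BASE_URL + URI,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

print(req.method, req.get_full_url())
# A real call would add authentication and then pass req to urllib.request.urlopen
```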

Handling the JSON Response:

  • The JSON response is parsed to extract various fields like EDITSTATUS, HTTPCODE, MESSAGE, etc.

  • These extracted values are then used to update internal tables or perform subsequent processing.

IF json_data_root IS BOUND.
  ASSIGN json_data_root->* TO <root_data>.
  ASSIGN COMPONENT `D` OF STRUCTURE <root_data> TO <child_data>.
  IF <child_data> IS ASSIGNED.
    json_data_node = <child_data>.
    ASSIGN json_data_node->* TO <node_data>.
    LOOP AT <node_data> ASSIGNING <node_element>.
      IF <node_element> IS ASSIGNED.
        ASSIGN <node_element>->* TO <element_data>.

        PERFORM extract_from_json USING 'EDITSTATUS'    CHANGING json_parsed_data-editstatus.
        PERFORM extract_from_json USING 'HTTPCODE'      CHANGING json_parsed_data-httpcode.
        PERFORM extract_from_json USING 'INDEX'         CHANGING json_parsed_data-index.
        PERFORM extract_from_json USING 'INLINERESULTS' CHANGING json_parsed_data-inlineresults.
        PERFORM extract_from_json USING 'KEY'           CHANGING json_parsed_data-key.
        PERFORM extract_from_json USING 'MESSAGE'       CHANGING json_parsed_data-message.
        PERFORM extract_from_json USING 'STATUS'        CHANGING json_parsed_data-status.


Data Validation:

  • Checks are made to see if the status received in the response is 'OK'.

  • Depending on the status, the corresponding data records are updated with success or failure messages.

    if json_parsed_data-status = 'OK'.
      wa_dg_rptdata_d_summary-status = 'Success'.
      modify it_dg_rptdata_d_summary from wa_dg_rptdata_d_summary index sum_tabix.
    else. " refine error handling as per your requirement
      wa_dg_rptdata_d_summary-status = 'Bonus update failed'.
      modify it_dg_rptdata_d_summary from wa_dg_rptdata_d_summary index sum_tabix.
    endif.
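The response handling above can be sketched in Python against a sample body. The JSON below is a hand-made illustration of the shape an upsert response takes (results in an array under "d", one entry per record), not a captured response:

```python
import json

# Hand-made sample in the shape of an OData upsert response
response_data = """
{"d": [
  {"key": "EmpPayCompNonRecurring/userId=100042",
   "status": "OK",
   "editStatus": "UPSERTED",
   "httpCode": 200,
   "index": 0,
   "inlineResults": null,
   "message": null}
]}
"""

def parse_upsert_results(body):
    """Pull out the per-record result fields, as extract_from_json does above."""
    results = []
    for entry in json.loads(body)["d"]:
        results.append({
            "status": entry.get("status"),
            "httpCode": entry.get("httpCode"),
            "key": entry.get("key"),
            "message": entry.get("message"),
        })
    return results

for result in parse_upsert_results(response_data):
    flag = "Success" if result["status"] == "OK" else "Bonus update failed"
    print(result["key"], flag)
```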

Utility Methods:

  1. extract_from_json: This method extracts data from the JSON response based on the field name provided. It uses the ASSIGN_COMPONENT to dynamically access fields in the structure.

  2. convert_date: This method calculates the difference in days between the provided date and a base epoch date (c_epoch). The result is converted into milliseconds format to be used in OData requests.
    FORM extract_from_json USING field_name TYPE string
                           CHANGING target_value TYPE string.
      FIELD-SYMBOLS: <field_ref> TYPE any,
                     <val_ref>   TYPE any.

      ASSIGN COMPONENT field_name OF STRUCTURE <element_data> TO <field_ref>.
      IF <field_ref> IS ASSIGNED.
        ASSIGN <field_ref>->* TO <val_ref>.
        IF <val_ref> IS ASSIGNED.
          target_value = <val_ref>.
        ENDIF.
        UNASSIGN: <field_ref>, <val_ref>.
      ENDIF.
    ENDFORM.

    FORM convert_date USING p_date TYPE d
                      CHANGING c_date_str TYPE string.
    * Calculate the number of seconds between the ABAP date and the epoch
      lv_seconds_since_epoch = ( p_date - c_epoch ) * c_seconds_in_a_day.

    * Convert to milliseconds
      lv_date_in_milliseconds = lv_seconds_since_epoch * c_milliseconds_in_a_second.
      str_date = lv_date_in_milliseconds.

    * Format the result as an OData date literal
      CONCATENATE '/Date(' str_date ')/' INTO lv_result_string.
      c_date_str = lv_result_string.
    ENDFORM.
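The same epoch arithmetic in Python, for cross-checking the convert_date form (this assumes dates are treated as midnight UTC, which is what the day-count math implies):

```python
from datetime import date

EPOCH = date(1970, 1, 1)   # c_epoch
SECONDS_IN_A_DAY = 86400   # c_seconds_in_a_day
MS_IN_A_SECOND = 1000      # c_milliseconds_in_a_second

def convert_date(d):
    """Days since the epoch -> milliseconds -> /Date(...)/ literal."""
    seconds_since_epoch = (d - EPOCH).days * SECONDS_IN_A_DAY
    return "/Date({})/".format(seconds_since_epoch * MS_IN_A_SECOND)

print(convert_date(date(2021, 3, 2)))  # -> /Date(1614643200000)/
```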


To sum it up, our exploration today provided a step-by-step guide on using ABAP to create an upsert request, send it, and manage the incoming response. This is a testament to how ABAP can effectively work with OData services, bridging data between EC and ECP.

I encourage you to dive in, experiment, and apply this to those unique challenges you may encounter in your EC-ECP projects. Reflecting on my journey, a couple of years ago, a client was transmitting time data to ECP using PERNR, but lacked the corresponding EC ID. The goal? To have ECP push the Badge ID details back to EC for newly hired Employee. It was then that I turned to the OData API upsert, and it was nothing short of a lifesaver.

Challenges often lead to discovery. Embrace them, innovate, and remember: every challenge you tackle prepares you for even greater achievements.