Invoices from various suppliers are among the most common documents every company has to deal with. Whenever you buy products or services, you receive a document that lists your partner's address data together with line items, the total value, and the tax calculation. Each invoice has to be registered in your finance system, which can be a time-consuming task. At the same time, it's not a challenging activity - you basically copy information from the source document. To automate the process, many companies already exchange invoices in a digital format, but such an approach is not always possible. So, despite living in 2021, more than 50 years after landing on the moon, a piece of paper is still an important information carrier.

According to the SmartSheet report “Automation in the Workplace”, over 40% of workers surveyed spend at least a quarter of their work week on repetitive tasks. Automating manual data collection and entry is one of the most desired productivity boosters, highlighted by 55% of employees. In this post, I will show you how Azure can help with such challenges by automating the entry of invoices into the SAP system.

Optical Character Recognition (OCR) is a mature technology that can accurately convert scanned text into digital format. But when it's supported by Artificial Intelligence, it provides much more advanced functionality: instead of merely digitizing the data, it also analyses the content, allowing you to extract the business context of the document. This way, you can identify each part of the invoice, such as the supplier address data or the total value.


(source: Microsoft.com)


I believe that the surrounding technologies available in Azure are the game-changer when it comes to choosing the right cloud platform for hosting SAP. While I love infrastructure topics, and I enjoy every SAP installation I do, I have even more fun when I have an opportunity to play around with the application and integration area. So far, I've written many blogs about data, focusing on how to access information stored in SAP systems and use it across the Azure portfolio. But I always felt that I didn't show enough love to Logic Apps. Please don’t get me wrong - I do think it is an excellent service that simplifies integration with the SAP system. I already blogged about it almost two years ago, describing how to use it in B2B communication. I planned to touch on Logic Apps again a couple of times since then, but I wanted to avoid a simple scenario of calling a BAPI. Finally, inspired by one of my customers, I found an interesting topic I would like to share with you. It also allows me to describe Azure Cognitive Services, which I haven’t yet had a chance to do in this blog series, so let me give you a quick overview.

Working with Artificial Intelligence and Machine Learning algorithms can be difficult. It requires a good level of statistics knowledge, supported by solid overall math skills. To make the technology available to everyone, Microsoft prepared a set of pre-built services specialized in certain areas, like text analytics (sentiment detection or key phrase extraction), decision-making processes (Anomaly Detector), or vision recognition. I suppose you have heard about them already - it's quite a hot topic nowadays. I also recommend checking out the cool application that can analyze a photo and provide an age estimation.

Form Recognizer is part of Azure Cognitive Services and allows you to digitize analog documents. It uses machine learning to automate the extraction of key information. You can train it to work with pretty much any document type, but the new version of the service also comes with pre-trained models capable of working with receipts, business cards, and invoices. Automating invoice processing has never been easier - we basically need to build a workflow that connects all the dots. Let's see how to do it!

WORKFLOW STRUCTURE

Before we start building the required resources, I would like to spend a moment explaining how the workflow will work. I use the Outlook trigger to continuously monitor a mailbox and start processing when a new e-mail with an attachment arrives. Most scanners can automatically send documents to an e-mail address, so this seems like a reasonable approach. Then, I save the document in a data lake and send it to the Form Recognizer service. To post the results, I use the SAP connector available in Logic Apps to trigger a function module. Sounds easy!
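If you prefer to think in code view, the overall shape of the Logic App is roughly the following. This is a minimal sketch: the trigger name is illustrative, the action names match the ones referenced later in this post, and all inputs are omitted for brevity.

{
  "triggers": {
    "When_a_new_email_arrives": { "type": "ApiConnection" }
  },
  "actions": {
    "For_each": {
      "type": "Foreach",
      "foreach": "@triggerBody()?['attachments']",
      "actions": {
        "Create_blob": { "type": "ApiConnection" },
        "HTTP": { "type": "Http" },
        "Condition": { "type": "If" }
      }
    }
  }
}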

CREATE FORM RECOGNIZER

The Form Recognizer is available in the Azure Marketplace. Click the Create button and provide basic information like the resource group and service name.


Microsoft prepared three pricing tiers. The free tier allows you to process 500 pages per month, which should be enough for testing.

Once the Form Recognizer is deployed, you can start playing around with the service. You can give it a try using a simple UI tool that connects to your subscription and lets you process a couple of invoices. It displays all the discovered information, so you can verify it, and marks its location on the source document. You can use it with one of the sample documents shared in the documentation, but I think it's much more fun to try it with documents you actually plan to use. This way, you build your confidence in the tool and can discover any issues early. Below you can see the processing results for my test invoice.


There are three endpoints available based on the document type. To process an invoice, you have to send a POST request to:
https://<FormRecognizerServiceEndpoint>/formrecognizer/v2.1-preview.2/prebuilt/invoice/analyze


The service requires authentication. In Postman, go to the Authorization tab and set the TYPE to API Key. The Key field should contain the static text "Ocp-Apim-Subscription-Key", and the Value is the actual service key, which you can read from the Cognitive Services blade.


I use the same invoice to run the test. In Postman, go to the Body tab and set the request type to binary. Then select the file containing the document and click Send. The response comes back with the 202 Accepted status code but without a body. Form Recognizer processes documents asynchronously, so to read the analyzed content, you have to send another request to the URL specified in the Operation-Location response header.


Copy the URL and send another request to the endpoint specified in the response header. This time, we receive a JSON document containing information from the source invoice.


There are two main parts of the JSON file: pageResults and documentResults. The first one returns information about each text element discovered in the invoice, but without any identification of the business-related fields. Much more interesting is the second part, where Form Recognizer applies the business context. It contains meaningful objects, like BillingAddress or InvoiceDate. Each entity consists of multiple properties describing the data type (a number, string, or date), the confidence level, and the bounding box in the source document.
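To give you an idea of what to expect, below is a trimmed, illustrative fragment of such a response. The field values are made up, I kept only two entities, and I emptied the pageResults array for brevity.

{
  "status": "succeeded",
  "analyzeResult": {
    "pageResults": [],
    "documentResults": [
      {
        "docType": "prebuilt:invoice",
        "fields": {
          "VendorName": {
            "type": "string",
            "valueString": "Contoso Ltd.",
            "text": "Contoso Ltd.",
            "boundingBox": [ 0.6, 1.0, 2.3, 1.0, 2.3, 1.3, 0.6, 1.3 ],
            "confidence": 0.98
          },
          "InvoiceTotal": {
            "type": "number",
            "valueNumber": 110.0,
            "text": "£110.00",
            "confidence": 0.95
          }
        }
      }
    ]
  }
}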

This short introduction to the Form Recognizer should give you enough information to start working with the service in Logic Apps. Let's build the workflow!

WORKFLOW

I start with an empty Logic App, and I use the Outlook connector to trigger the workflow whenever a new e-mail with an attachment arrives. As there may be more than one attachment, I use a For Each loop to iterate through them.


There are two ways to send documents to the Form Recognizer. You can either include the file in the message body, as we did in the previous exercise, or reference it by providing a URL pointing to the file. I think it's always good practice to archive processed documents, so instead of sending the file contents over the internet, I use blob storage and share the file using a Shared Access Signature. This gives the service access to the file without changing the container's access level to public.

This is a two-step approach. Firstly, I use the Create Blob action to save the attachment in cloud storage. I use the guid() expression to ensure each file name is unique; this way, I avoid potentially overwriting files processed earlier. The second action is an HTTP Request that sends the URL of the file to the Form Recognizer. We need to provide the access key to the Azure service, as we did in the Postman test, which I include in the request header. To allow read access to the exposed file, I generate a Shared Access Signature and concatenate it with the URL.
{
  "source": "<file_url>@{body('Create_blob')?['Path']}?<SAS_Key>"
}
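For reference, here is roughly how the two actions look in code view. Treat it as a sketch: the action names Create_blob and HTTP match the references used elsewhere in this workflow, but the /invoices folder is just an example, the exact Blob connector parameters may differ slightly in your environment, and the placeholders still need to be filled in with your own values.

"Create_blob": {
  "type": "ApiConnection",
  "inputs": {
    "host": { "connection": { "name": "@parameters('$connections')['azureblob']['connectionId']" } },
    "method": "post",
    "path": "/datasets/default/files",
    "queries": {
      "folderPath": "/invoices",
      "name": "@{guid()}.pdf"
    },
    "body": "@base64ToBinary(items('For_each')?['contentBytes'])"
  }
},
"HTTP": {
  "type": "Http",
  "runAfter": { "Create_blob": [ "Succeeded" ] },
  "inputs": {
    "method": "POST",
    "uri": "https://<FormRecognizerServiceEndpoint>/formrecognizer/v2.1-preview.2/prebuilt/invoice/analyze",
    "headers": {
      "Ocp-Apim-Subscription-Key": "<service_key>",
      "Content-Type": "application/json"
    },
    "body": {
      "source": "<file_url>@{body('Create_blob')?['Path']}?<SAS_Key>"
    }
  }
}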


Let's run a quick test to see if the Form Recognizer can access the file. I send my test invoice and wait for the workflow to start.


So far, everything looks correct. As the processing should only continue when the document is accepted by the service, I add a Condition that checks the status code of the request.


From now on, for the purpose of this blog, I will only describe the flow when the condition is fulfilled. If you plan to use the service in your environment, don't forget to add error handling - for example, you could send an e-mail if processing fails. I'm sure you already know how to do that.

I use the Delay action to wait ten seconds and give the Form Recognizer a bit of time to process the document. Then I pass the URL from the Operation-Location header to another HTTP request to read the results. The designer doesn't give you an option to reference a specific response header, so you have to switch to code view.
"uri": "@{outputs('HTTP')['headers']['Operation-Location']}",


The third action I use is Parse JSON, to trim the results. I’m only interested in the documentResults part, so I can ignore the rest. As with selecting a specific header, I switch to code view to reference only the documentResults part.
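Again in code view, the relevant fragment could look as follows. It's a sketch under the same assumptions as above: HTTP_2 is the name of the previous action, and I reduced the schema to a bare minimum.

"Parse_JSON": {
  "type": "ParseJson",
  "runAfter": { "HTTP_2": [ "Succeeded" ] },
  "inputs": {
    "content": "@body('HTTP_2')?['analyzeResult']?['documentResults']",
    "schema": {
      "type": "array",
      "items": { "type": "object" }
    }
  }
}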


GETTING ALL DATA

Unfortunately, not all the data required to post an invoice in the SAP system is included in the document; some of it needs manual input. Initially, I thought about creating a two-step process: first importing the data into an external database, then asking users to verify the document and provide missing values, like the G/L account and cost center. But that involves creating additional resources, together with a user interface, which makes the whole solution much more complex. Instead, I decided to park the document in the system, which seems to be a common approach. As the results may not be 100% accurate, you want to have an option to change the entry, and parking also removes the requirement for additional resources.

But that caused another challenge. As I'm using an SAP S/4HANA system, I wanted to take advantage of the available APIs. I thought I could use the API_SUPPLIERINVOICE_PROCESS_SRV OData service, which is designed for exactly such purposes. But halfway through my tests, I noticed that it doesn't support parking documents. So I ended up with the commonly used function module BAPI_ACC_DOCUMENT_POST, which is actually a great opportunity to show you the native Logic Apps SAP connector capabilities. In addition, I will call BAPI_BUPA_SEARCH to read the supplier number based on the name and address information contained in the invoice.

USING RFC CONNECTOR

Logic Apps has supported communication with SAP systems over RFC for quite a while, and currently, you can use the second generation of the connectors. They offer wide functionality – you can use them to send and receive IDocs, call remote function modules, or even directly access table entries. As SAP systems are usually not exposed to the internet, to facilitate the connection with Logic Apps, you have to deploy a small component called the On-Premises Data Gateway. It works similarly to the SAP Cloud Connector. There is a step-by-step installation guide available, so I'll skip the process here.

Once the data gateway is available, you can start using the connectors. For test purposes, I suggest deploying a separate workflow where you can test basic operations and connectivity to your SAP system. First, you have to provide basic settings for your system, including the hostname, system number, and communication port. You can choose to connect directly to the application server or use message server load balancing. I'm taking the easiest route and using basic authentication, which requires me to provide my credentials, but you can also take advantage of the Single Sign-On mechanism.


Now comes the tricky part. I know you'd like to jump straight to calling a function module, but RFC communication is not so easy. It doesn't support JSON requests, so instead, you have to rely on the XML format. As the SAP system requires strict formatting, you should also fully understand the schema of the document. All OData services are documented in the SAP API Business Hub, but how do you retrieve the structure of a BAPI call? There is a dedicated connector action designed for it. Function modules differ between SAP system releases, so the easiest way to get the schema that applies to your version is to connect and read it directly from the system. There is no need to read it every time you run the workflow, so I usually create a temporary Logic App just for this purpose.

The outcome of the Generate Schema action is one or more XSD files that describe the message structure. You can save them to blob storage and then download them to your local computer. In this project, I use two BAPIs, so I execute the workflow twice, changing the function module in between.


Logic Apps provides only essential capabilities for working with XML structures, so there is no fancy editor that would simplify creating a sample message based on the provided schema. Instead, you have to use specialized tools - I find Visual Studio quite good at this job, but the choice is very broad. After opening the schema file, I can view the structure in the explorer and generate a sample message.


I understand your concerns. It seems there is a lot of preparation work required to use the Logic Apps SAP connector, but to be honest, from a process point of view it's not much different from working with OData services. The whole XML structure is basically a copy of what you can see in SE37, so it’s quite easy to run a couple of tests before writing the target file. One tip I can share is to always check the field order, which is imposed by the schema.

If you ever encounter a challenge using an Azure Logic Apps connector, have a look at the Azure TechCommunity blogs around integration. There are plenty of articles that can help you!

POSTING THE INVOICE

We left off with our workflow parsing the JSON structure and extracting the documentResults object. We just need a couple more actions to complete the solution.

A single file sent to the Form Recognizer may contain more than one document, so the documentResults object is actually an array of processed invoices. We need to add another For Each loop that iterates through the results. Instead of working with nested objects inside documentResults, I use another Parse JSON action to create a simple structure with all the details I need to pass to the BAPI. This step is not mandatory, but I think it improves clarity.



"inputs": {
"AmountDue": "@items('For_each_2')?['fields']?['AmountDue']?['valueNumber']",
"BillingAddress": "@items('For_each_2')?['fields']?['BillingAddress']?['valueString']",
"BillingAddressRecipient": "@items('For_each_2')?['fields']?['BillingAddressRecipient']?['valueString']",
"CustomerName": "@items('For_each_2')?['fields']?['CustomerName']?['valueString']",
"DueDate": "@items('For_each_2')?['fields']?['DueDate']?['valueDate']",
"InvoiceDate": "@items('For_each_2')?['fields']?['InvoiceDate']?['valueDate']",
"InvoiceId": "@items('For_each_2')?['fields']?['InvoiceId']?['valueString']",
"InvoiceTotal": "@items('For_each_2')?['fields']?['InvoiceTotal']?['valueNumber']",
"SubTotal": "@items('For_each_2')?['fields']?['SubTotal']?['valueNumber']",
"TotalTax": "@items('For_each_2')?['fields']?['TotalTax']?['valueNumber']",
"VendorAddress": "@items('For_each_2')?['fields']?['VendorAddress']?['valueString']",
"VendorAddressRecipient": "@items('For_each_2')?['fields']?['VendorAddressRecipient']?['valueString']",
"VendorName": "@items('For_each_2')?['fields']?['VendorName']?['valueString']"
},

Once we have all the information in an easy-to-access structure, we can add the first Call BAPI action to read the supplier number. I paste the prepared XML structure into the Input BAPI Parameters field.



<?xml version="1.0" encoding="utf-8"?>
<SEARCH xmlns="http://Microsoft.LobServices.Sap/2007/03/Bapi/BUS1006/">
<CENTRALDATA xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<MC_NAME1>@{outputs('Invoice')['VendorName']}</MC_NAME1>
</CENTRALDATA>
</SEARCH>

Finally, in the last step of the workflow, I park the invoice using the BAPI_ACC_DOCUMENT_POST function module. I hardcoded some of the static fields, like the username or company code. For the rest, I use the results coming from the Form Recognizer and the previous action.



<?xml version="1.0" encoding="utf-8"?>
<POST xmlns="http://Microsoft.LobServices.Sap/2007/03/Bapi/BUS6035/">
<DOCUMENTHEADER xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<BUS_ACT>RFBV</BUS_ACT>
<USERNAME>BJARKOWSKI</USERNAME>
<COMP_CODE>1110</COMP_CODE>
<DOC_DATE>@{body('Invoice')?['InvoiceDate']}</DOC_DATE>
<PSTNG_DATE>@{body('Invoice')?['InvoiceDate']}</PSTNG_DATE>
<DOC_TYPE>KR</DOC_TYPE>
<REF_DOC_NO>@{body('Invoice')?['InvoiceId']}</REF_DOC_NO>
<DOC_STATUS>2</DOC_STATUS>
</DOCUMENTHEADER>
<ACCOUNTPAYABLE xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<BAPIACAP09>
<ITEMNO_ACC>0010</ITEMNO_ACC>
<VENDOR_NO>@{body('Call_BAPI_-_Get_supplier_number')?['JsonResponse']['SEARCHRESULT'][0]['PARTNER']}</VENDOR_NO>
</BAPIACAP09>
</ACCOUNTPAYABLE>
<CURRENCYAMOUNT xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<BAPIACCR09>
<ITEMNO_ACC>0010</ITEMNO_ACC>
<CURRENCY>GBP</CURRENCY>
<AMT_DOCCUR>@{body('Invoice')?['AmountDue']}</AMT_DOCCUR>
<TAX_AMT>@{body('Invoice')?['TotalTax']}</TAX_AMT>
</BAPIACCR09>
</CURRENCYAMOUNT>
</POST>

Let’s test the whole workflow! I send an invoice to the e-mail address provided in the trigger. It is immediately picked up by Logic Apps, and it looks like the processing is successful.


The devil’s in the details, so let’s take a closer look at the Call BAPI actions. Both were executed successfully. The first one returned the supplier number, which was then passed to the second call, together with the Form Recognizer results.


The message confirms the document was successfully parked with number 1900000023. Let’s log in to the SAP system and see if we can find it in transaction FB03.


No surprises!

Azure provides an excellent set of services that you can use with your SAP system. I hope this blog inspires you to build your own solutions that use Azure Cognitive Services. While we focused on a simple example of automating invoice processing, you can train the Form Recognizer to work with other document types you use in your company – like orders or timesheets. Of course, the above solution is not 100% complete - if you think about implementing it in your SAP system, remember to add error handling and invoice traceability to your project. It’s not difficult and can save you a lot of time in case of issues.