With economic growth and increasing globalization, quality costs have come more and more into focus for companies as product complexity and customer expectations rise. In this context, the method for determining and analyzing total quality costs - known as Total Cost of Quality (TCOQ) - was developed. TCOQ covers prevention costs, appraisal costs and the costs of internal and external failures. Precise management of these key figures is essential for your company in order to optimize costs and ensure customer satisfaction. 
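To make these cost categories concrete, here is a minimal Python sketch of how the four classic components add up to the total; the figures and the EUR currency are purely illustrative and not taken from this article.

```python
# Minimal sketch: the four classic quality cost categories behind TCOQ.
# All figures are illustrative examples, not data from the article.
quality_costs_eur = {
    "prevention": 120_000,        # e.g. training, quality planning, preventive maintenance
    "appraisal": 80_000,          # e.g. inspections, audits, test equipment
    "internal_failure": 150_000,  # e.g. scrap, rework, downtime before delivery
    "external_failure": 200_000,  # e.g. warranty claims, returns, complaint handling
}

tcoq = sum(quality_costs_eur.values())
print(f"Total Cost of Quality: {tcoq:,} EUR")
for category, cost in quality_costs_eur.items():
    print(f"  {category:<17} {cost:>8,} EUR  ({cost / tcoq:.1%})")
```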

To efficiently analyze the diverse data sources and complex relationships behind TCOQ, a powerful data platform is essential. SAP Datasphere enables your company to seamlessly integrate your data across different systems and put it into a business context. SAP Datasphere not only offers advanced data modelling and harmonization capabilities, but also the ability to combine data from SAP and non-SAP systems through a “business data fabric” approach. 

SAP Datasphere also supports you in managing the total cost of quality by efficiently combining data from various sources such as production reports, audit documents or customer feedback. Users benefit from a semantically enriched data layer that preserves business relationships and forms the basis for precise analyses. With these capabilities, your company can, for example, monitor prevention and appraisal costs more closely and reduce failure costs in a targeted manner. 

With SAP Datasphere and integrated tools such as SAP Analytics Cloud, you can also create specific dashboards that visualize quality costs in real time and enable informed decisions. The platform provides a solid basis for meeting the challenges of modern quality control and focusing on sustainable quality improvements. 

Challenges in evaluating the Total Cost of Quality:  
Example scenario: a production company with subsidiaries 

Graphic 1: Example scenario of a production company with subsidiaries 

Creating a consolidated evaluation of the total cost of quality poses practical challenges for many companies. Data from different sources and in varying formats often has to be collected, processed and analyzed. This not only makes it difficult to ensure that the results are consistent and up-to-date, but also ties up valuable controlling resources. Our graphic shows a typical scenario of this problem (see above). 

Scrap data via material movements 

Scrap data is recorded in the company via material movements and is available centrally in the SAP BW system. Controlling can retrieve it there using Analysis for Office. However, before the analysis is released, production management requires a preliminary check in which the file is first reviewed internally and then stored on a network drive. This additional work step not only causes delays, but also increases the susceptibility to errors. 

Costs from quality management measures 

The costs from QM measures can be viewed comparatively easily via a BW query that can be called up regularly. This structure makes the work of controlling easier, as the data is directly available without additional processing. However, integration into an overall evaluation of the TCOQ still requires further steps to ensure a uniform presentation. 

Quality costs of the subsidiaries 

The quality costs of the subsidiaries pose a particular challenge. These are provided as a CSV file, which is a by-product of group consolidation. The file is sent by e-mail to Controlling, which then has to manually assign the data to the individual cost types. This process is not only time-consuming, but also entails a considerable risk of data inconsistencies. 

Graphic 2: Solution scenario at a glance 

 

Our example scenario illustrates how important an integrated data platform such as SAP Datasphere can be for automating processes and harmonizing the data landscape. A standardized, centralized view of all your relevant quality costs could save you time as well as increase the accuracy and transparency of your analyses.  

SAP Datasphere enables specialist departments to use comprehensive self-service functions via a graphical modeling interface. The space concept creates a clear structure: it encapsulates data pools while ensuring auditable governance from an IT perspective. This allows business users to independently control data processes and create analyses without having to rely on IT resources. 

Examples of self-service functions: 

  • Adding new upload files  
  • Control and automation of data processes 
  • Creation of mappings and derivations 
  • Creation of analytical models 
  • Implementation and maintenance of own master data 
  • Documentation of processes and data structures 
  • Preview and analysis of data and its origin (preview/lineage) 
  • Access to public and licensed data via a marketplace 
  • Independent sharing of data products 

 

With these functions, which vary in complexity, SAP Datasphere promotes efficient and autonomous use of your data within the specialist departments. 

Space concept:

Graphic 3: Space concept 

The overarching space concept of SAP Datasphere creates a clear structure for organizing collaboration between different departments and IT. It enables efficient coordination in the development and use of data products. 

IT can provide central connections and data models as a data service, ensuring that business departments can access standardized and verified data sources. This facilitates integration and improves the consistency of data analyses. 

Consistent basic modeling safeguards integration, data quality and data security, ensuring that your data meets high governance and compliance requirements. 

Spaces for specialist departments remain open for the use of central and local information. This allows business users to flexibly build and expand analytical models while benefiting from central data resources. 
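As a purely conceptual illustration (spaces are configured in Datasphere itself, not in code), the following Python sketch models this division of responsibilities; all space and object names are hypothetical.

```python
# Conceptual sketch only: Datasphere spaces are configured in the tool, not in Python.
# The names used here are hypothetical and serve only to illustrate the responsibilities.
from dataclasses import dataclass, field

@dataclass
class Space:
    name: str
    owner: str                                              # "IT" or a specialist department
    shared_views: set[str] = field(default_factory=set)     # data products offered to other spaces
    local_objects: set[str] = field(default_factory=set)    # own uploads, mappings, master data

# IT provides central connections and verified data models as a data service.
it_central = Space("IT_CENTRAL", owner="IT",
                   shared_views={"scrap_movements", "qm_measure_costs"})

# A department space combines the shared central views with its own local objects.
controlling_tcoq = Space("CONTROLLING_TCOQ", owner="Controlling",
                         local_objects={"subsidiary_csv_upload", "cost_type_mapping"})

def available_sources(department: Space, central: Space) -> set[str]:
    """Everything a department can model against: central data products plus its local objects."""
    return central.shared_views | department.local_objects

print(sorted(available_sources(controlling_tcoq, it_central)))
```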

Data model in SAP Datasphere for the example scenario: 

Graphic 4: Data model in SAP Datasphere for example scenario 

The solution offers several advantages that cover various aspects of data processing and utilisation. A key advantage is the ability to bring the scrap data from material movements into a Datasphere space directly via the data source for BW/4HANA, which offers additional flexibility; alternatively, the BW/4HANA data model could also have been used as the source. Production management now has the option of transferring released data to a table that is intended specifically for access by Controlling. 
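A simplified pandas sketch of this release step is shown below; in Datasphere it would be modelled as a table or view with a release flag rather than a script, and all column names are assumptions.

```python
# Simplified sketch of the release step, assuming column names; in Datasphere this
# would be a table/view with a release flag maintained by production management.
import pandas as pd

scrap_movements = pd.DataFrame({
    "material": ["M100", "M100", "M200"],
    "plant": ["1000", "1000", "2000"],
    "scrap_qty": [12, 5, 30],
    "scrap_value_eur": [480.0, 200.0, 1500.0],
    "released_by_production": [True, False, True],   # preliminary check by production management
})

# Only released records reach the table intended for access by Controlling.
released_for_controlling = scrap_movements[scrap_movements["released_by_production"]].copy()
print(released_for_controlling)
```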

Another advantage lies in the handling of costs from quality management measures. This data can be transferred directly to the central data pool for the total cost of quality, enabling a more efficient and centralised cost analysis. 

In addition, the handling of the subsidiaries' quality costs has been optimised. CSV files can be uploaded directly to SAP Datasphere by the subsidiaries, and the uploaded source data is automatically historised so that earlier versions remain traceable. The mapping to the TCOQ cost types can be modelled directly in Datasphere by the specialist departments, which simplifies the process considerably. 
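The following pandas sketch shows, conceptually, what the modelled mapping and the historisation achieve; in Datasphere the mapping would be a modelled view and the history is kept by the platform, and all column names, cost elements and values here are hypothetical.

```python
# Conceptual pandas sketch; in Datasphere the mapping is a modelled view and the
# history is kept by the platform. All names and values here are hypothetical.
import pandas as pd
from datetime import datetime, timezone

# Hypothetical mapping from the subsidiaries' local cost elements to TCOQ cost types.
cost_type_mapping = pd.DataFrame({
    "local_cost_element": ["REWORK", "WARRANTY", "AUDIT", "TRAINING"],
    "tcoq_cost_type": ["internal_failure", "external_failure", "appraisal", "prevention"],
})

# One uploaded CSV from the subsidiaries (illustrative content).
subsidiary_upload = pd.DataFrame({
    "subsidiary": ["AT01", "AT01", "PL02"],
    "local_cost_element": ["REWORK", "WARRANTY", "TRAINING"],
    "amount_eur": [12_000.0, 34_000.0, 5_000.0],
})

# Historisation: stamp each upload so that earlier versions remain traceable.
subsidiary_upload["load_timestamp"] = datetime.now(timezone.utc)

# The modelled mapping replaces the manual assignment to cost types.
tcoq_lines = subsidiary_upload.merge(cost_type_mapping, on="local_cost_element", how="left")
print(tcoq_lines)
```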

A particular advantage of the solution is that it is no longer necessary to merge the data manually. The release of individual source data pools can be specifically controlled by the departments. In addition, the entire business logic of the mapping is stored centrally in a single system, which makes administration and traceability much easier. 
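To illustrate what replaces the manual merge, here is a pandas sketch that brings the three released source pools into one harmonised TCOQ structure; in Datasphere this would typically be a union view in the target space, and the common schema and cost-type classifications shown are assumptions made for the sketch.

```python
# Sketch of the central merge that replaces manual consolidation; in Datasphere this
# would typically be a union view in the target space. The schema is an assumption.
from typing import Optional
import pandas as pd

def to_tcoq_schema(df: pd.DataFrame, source: str, cost_type: Optional[str] = None) -> pd.DataFrame:
    """Harmonise a released source pool into the common TCOQ structure."""
    out = df.copy()
    out["source"] = source
    if cost_type is not None:
        out["tcoq_cost_type"] = cost_type
    return out[["source", "tcoq_cost_type", "amount_eur"]]

scrap = pd.DataFrame({"amount_eur": [480.0, 1_500.0]})                 # released scrap values
qm_measures = pd.DataFrame({"amount_eur": [25_000.0]})                 # QM measure costs
subsidiaries = pd.DataFrame({"tcoq_cost_type": ["internal_failure"],   # already mapped above
                             "amount_eur": [12_000.0]})

tcoq_pool = pd.concat([
    to_tcoq_schema(scrap, "scrap_movements", cost_type="internal_failure"),
    to_tcoq_schema(qm_measures, "qm_measures", cost_type="prevention"),  # illustrative classification
    to_tcoq_schema(subsidiaries, "subsidiary_csv"),
], ignore_index=True)

print(tcoq_pool.groupby("tcoq_cost_type")["amount_eur"].sum())
```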

The project procedure can be better parallelised: 

 

Graphic 5: The project procedure can be better parallelised 

The project procedure is divided into several phases that can be parallelised in order to increase efficiency. Firstly, the target characteristics, key figures and cost types are defined and a table is implemented in the target space. 

In parallel, an analytical model based on a CSV upload can be created in the target space, which is then used for reporting in SAP Analytics Cloud. Work on the interfaces to the source spaces can also begin at the same time in order to implement the data supply for each space. These parallel work streams enable the project to be implemented more quickly and efficiently. 
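As an illustration of the first phase, the sketch below defines a possible target structure with characteristics, key figures and cost types; in Datasphere this would be a local table in the target space, and every field name here is an assumption.

```python
# Illustrative target structure for the TCOQ table defined in the first phase;
# in Datasphere this would be a local table in the target space. All field
# names are assumptions made for this sketch.
from dataclasses import dataclass
from datetime import date
from enum import Enum

class TcoqCostType(str, Enum):
    PREVENTION = "prevention"
    APPRAISAL = "appraisal"
    INTERNAL_FAILURE = "internal_failure"
    EXTERNAL_FAILURE = "external_failure"

@dataclass
class TcoqRecord:
    # Characteristics
    posting_date: date
    company_code: str
    source: str                 # e.g. scrap_movements, qm_measures, subsidiary_csv
    cost_type: TcoqCostType
    # Key figure
    amount_eur: float

# A CSV upload with this structure can already feed the analytical model for
# SAP Analytics Cloud reporting before the interfaces to the source spaces are ready.
sample = TcoqRecord(date(2025, 1, 31), "1000", "scrap_movements", TcoqCostType.INTERNAL_FAILURE, 480.0)
print(sample)
```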

SAP Datasphere offers you powerful modelling, seamless integration and flexible usage options for SAP data, even in new and diverse combinations. With SAP Datasphere, you can dissolve the data silos on network drives and in e-mail inboxes and create a standardised data foundation. 

The platform stands out particularly for specialist departments thanks to its extended self-service functions, which have been significantly improved compared to SAP BW/4HANA. The self-service options are scalable and enable flexible customisation to your respective requirements. In addition, the project approach presented allows the parallel implementation of complex KPI systems, which increases efficiency during implementation. 

SAP Datasphere is SAP's strategic platform for modern data warehousing. In the future, the platform will be continuously expanded with optimised self-service functions and extended data integration options, such as the connection to Google Cloud Storage. The principle presented here can also be easily applied to other KPI systems, for example in the area of ESG (Environmental, Social, Governance) or GSRD (Global Supplier Responsibility Data). 

A central component of future developments remains the structured elaboration of individual data pools and the exchange of information on their utilisation in order to enable your company to implement a holistic data strategy. SAP Datasphere therefore not only supports existing requirements, but also provides a strong foundation for future data projects. 

 

 

 

 

 
