
Datahub 6.0 - ECC material master load is throwing exception

Former Member

Team,

I have set up hybris 6.0 with Data Hub 6.0 for ECC integration and master data load, but it is throwing a NullPointerException. I deployed the Data Hub with the hybris 6.0 datahub-webapp.war (hybris\bin\ext-integration\datahub\web-app) and added the jar files from the SAP extensions (hybris\bin\ext-integration\datahub\extensions\sap).

java.lang.NullPointerException at com.hybris.datahub.saperpproduct.publication.ProductVariantPublicationHandler.isApplicable(ProductVariantPublicationHandler.java:74)

Please find the Data Hub log below:

2016-05-20 12:25:18,605 [DEBUG] [c.h.d.s.i.PublicationActionHandler] Creating Target Items for type: 'SalesVariant', and Target System: HybrisCore
2016-05-20 12:25:18,620 [DEBUG] [c.h.d.s.i.DefaultTargetItemService] Creating Target Items for type SalesVariant from 1 Canonical Items in Pool GLOBAL
[INFO] [05/20/2016 12:25:18.620] [DataHubActorSystem-akka.actor.default-dispatcher-19] [akka://DataHubActorSystem/user/$h/target-type-items-creator2] Creating target items for type SalesVariant and 1 canonical items in Pool GLOBAL
[INFO] [05/20/2016 12:25:18.620] [DataHubActorSystem-akka.actor.default-dispatcher-20] [akka://DataHubActorSystem/user/$h/target-type-items-creator3] Creating target items for type SalesProduct and 1 canonical items in Pool GLOBAL
2016-05-20 12:25:18,620 [DEBUG] [c.h.d.s.i.DefaultTargetItemService] Creating Target Items for type SalesProduct from 1 Canonical Items in Pool GLOBAL
[INFO] [05/20/2016 12:25:18.636] [DataHubActorSystem-akka.actor.default-dispatcher-19] [akka://DataHubActorSystem/user/$h] Creation of target items failed. Notifying parent
2016-05-20 12:25:18,636 [ERROR] [c.h.d.s.i.AkkaEnabledPublicationActionHandler] Failed to create Target Items page
[ERROR] [05/20/2016 12:25:18.636] [DataHubActorSystem-akka.actor.default-dispatcher-19] [akka://DataHubActorSystem/user/$h/target-type-items-creator2] null
java.lang.NullPointerException
    at com.hybris.datahub.saperpproduct.publication.ProductVariantPublicationHandler.isApplicable(ProductVariantPublicationHandler.java:74)
    at sun.reflect.GeneratedMethodAccessor332.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:497)
    at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
    at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:92)
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
    at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207)
    at com.sun.proxy.$Proxy136.isApplicable(Unknown Source)
    at com.hybris.datahub.grouping.impl.PublicationGroupingChainRunnerStrategy.lambda$applyGroupings$140(PublicationGroupingChainRunnerStrategy.java:52)
    at com.hybris.datahub.grouping.impl.PublicationGroupingChainRunnerStrategy$$Lambda$237/34284107.apply(Unknown Source)
    at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:267)
    at java.util.stream.Streams$StreamBuilderImpl.forEachRemaining(Streams.java:419)
    at java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:580)
    at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:270)
    ... (the stream pipeline frames above repeat several times)
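For anyone debugging the same trace: an NPE at a single line inside isApplicable usually means the handler dereferences the result of a lookup without checking for null. A minimal sketch of that pattern, assuming a hypothetical in-memory lookup (class, field, and method names below are invented for illustration; this is not the actual ProductVariantPublicationHandler source):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical reconstruction of the failure pattern; not Data Hub source code.
class VariantHandlerSketch {
    // Maps a product ID to the type of its base product, if one was loaded.
    private final Map<String, String> baseProductTypeById = new HashMap<>();

    void register(String productId, String type) {
        baseProductTypeById.put(productId, type);
    }

    // Unsafe: throws NullPointerException when the base product entry is missing,
    // which mirrors the trace above when a canonical item has no counterpart.
    boolean isApplicableUnsafe(String productId) {
        return baseProductTypeById.get(productId).equals("VARIANT");
    }

    // Defensive variant: treats a missing base product as "not applicable".
    boolean isApplicableSafe(String productId) {
        String type = baseProductTypeById.get(productId);
        return type != null && type.equals("VARIANT");
    }
}
```

The point of the sketch is only that the crash happens at lookup-dereference time, which is why the accepted answer below focuses on why the expected canonical item is missing rather than on the handler itself.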

Accepted Solutions (0)

Answers (4)


Former Member

Yes. Please remove sapproductconfiguration-6.0.0.0-RC6.jar from the Data Hub webapp.


I'm getting the same error. Have you solved it?

Former Member

I removed the datahub-cleanup-6.0.0.0-RC13 extension, but I then got the error below on master data load from ECC, so I added it back. Could you confirm whether you mean the same extension? I have used only the out-of-the-box SAP datahub-webapp.war from hybris 6.0, so I am not sure whether this is an issue in the latest version or whether I missed something.

[ERROR] [com.hybris.datahub.service.impl.PublicationActionHandler] Error publishing action: 1
com.hybris.datahub.publication.TargetItemCreationException: Failed to create Target Items for Publication
    at com.hybris.datahub.service.impl.AkkaEnabledPublicationActionHandler.createPublicationTargetItems(AkkaEnabledPublicationActionHandler.java:78) ~[datahub-service-akka-6.0.0.0-RC12.jar:6.0.0.0-RC12]
    at com.hybris.datahub.service.impl.PublicationActionHandler.createOnePageOfItemsToPublish(PublicationActionHandler.java:393) [datahub-service-6.0.0.0-RC12.jar:6.0.0.0-RC12]
    at com.hybris.datahub.service.impl.PublicationActionHandler.createItemsToPublishByCanonicalType(PublicationActionHandler.java:278) [datahub-service-6.0.0.0-RC12.jar:6.0.0.0-RC12]
    at com.hybris.datahub.service.impl.PublicationActionHandler.lambda$createTargetItemsForPublications$64

former_member224482
Active Contributor

I updated my answer.

former_member224482
Active Contributor

Remove the cleanup extension.

A MATMAS will create a CanonicalProduct (CanonicalBaseProduct in 5.7 and older) and a CanonicalProductSales (CanonicalSalesProduct in 5.7 and older) for each sales area sent.
Each CanonicalProduct will create a product in the Default:staged catalog, and each CanonicalProductSales creates one in the mapped catalog.
When extra CanonicalProductSales items are sent that are not mapped to a catalog, the Data Hub assigns an error to them and tries to send them again and again in the next publications.
If a CanonicalProductSales exists, it is assumed that a CanonicalProduct with the same product ID also exists in the Data Hub.
When incorrectly configured, the cleanup extension will delete all canonical items that have been published, so the assumed CanonicalProduct with the same product ID gets deleted as well.

The issue you are facing is caused by one of the following:
1. Sending extra sales area information that has not been mapped to a catalog.
2. An incorrectly configured cleanup extension, which created data inconsistency within the Data Hub.
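The consistency assumption above can be sketched as a simple check (a hypothetical helper for illustration, not part of the Data Hub API): every CanonicalProductSales should have a CanonicalProduct with the same product ID, and any sales item left without one, for example after an over-aggressive cleanup, is an orphan that will fail publication.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Hypothetical illustration of the consistency rule; not a Data Hub API.
class CanonicalConsistency {
    // Returns product IDs of CanonicalProductSales items whose matching
    // CanonicalProduct is missing (e.g. deleted by a misconfigured cleanup).
    static List<String> orphanedSales(Set<String> canonicalProductIds,
                                      List<String> canonicalProductSalesIds) {
        List<String> orphans = new ArrayList<>();
        for (String id : canonicalProductSalesIds) {
            if (!canonicalProductIds.contains(id)) {
                orphans.add(id);
            }
        }
        return orphans;
    }
}
```

If such a check reports orphans after a cleanup run, that points to cause 2 above; if the orphans correspond to sales areas with no catalog mapping, that points to cause 1.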

Former Member

Did it solve the issue? I have the same issue. Please let us know. Thanks.