
Data Hub customer load is not working!

Former Member

After enabling clustering, the customer load fails with the following error:

 2018-08-03T08:00:59.957+0200 [INFO] [c.h.d.s.i.AbstractTargetItemService] Creating Target Items for type: 'CanonicalPartySales', and Target System: HybrisCore
 2018-08-03T08:00:59.968+0200 [ERROR] [c.h.d.s.i.PublicationActionHandler] Error publishing action: 201
 java.lang.IllegalStateException: Optional.get() cannot be called on an absent value
     at com.google.common.base.Absent.get(Absent.java:47) ~[guava-17.0.jar:na]
     at com.hybris.datahub.repository.converter.CanonicalItemConverter.toCanonicalItem(CanonicalItemConverter.java:64) ~[datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at com.hybris.datahub.repository.jpa.impl.DefaultCanonicalItemJpaRepository.convertQueryResults(DefaultCanonicalItemJpaRepository.java:302) ~[datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at com.hybris.datahub.repository.jpa.impl.DefaultCanonicalItemJpaRepository.findComposedItems(DefaultCanonicalItemJpaRepository.java:241) ~[datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at sun.reflect.GeneratedMethodAccessor445.invoke(Unknown Source) ~[na:na]
     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_151]
     at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_151]
     at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333) ~[spring-aop-4.3.11.RELEASE.jar:4.3.11.RELEASE]
     at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207) ~[spring-aop-4.3.11.RELEASE.jar:4.3.11.RELEASE]
     at com.sun.proxy.$Proxy107.findComposedItems(Unknown Source) ~[na:na]
     at com.hybris.datahub.repository.delegating.DelegatingCanonicalItemRepository.findComposedItems(DelegatingCanonicalItemRepository.java:99) ~[datahub-in-memory-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at com.hybris.datahub.pooling.PagedPublicationWorkingSet.findComposedCanonicalItems(PagedPublicationWorkingSet.java:147) ~[datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at com.hybris.datahub.pooling.PagedPublicationWorkingSet.retrievePageData(PagedPublicationWorkingSet.java:105) ~[datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at com.hybris.datahub.pooling.PagedPublicationWorkingSet.getNextCanonicalItemPage(PagedPublicationWorkingSet.java:84) ~[datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at com.hybris.datahub.service.impl.AbstractTargetItemService.getLatestCanonicalItems(AbstractTargetItemService.java:225) ~[datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at com.hybris.datahub.service.impl.AbstractTargetItemService.createTargetItemsForWorkingSet(AbstractTargetItemService.java:133) ~[datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at com.hybris.datahub.service.impl.AbstractTargetItemService.lambda$createTargetItemsForCanonicalType$3(AbstractTargetItemService.java:115) ~[datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[na:1.8.0_151]
     at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1380) ~[na:1.8.0_151]
     at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_151]
     at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_151]
     at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_151]
     at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_151]
     at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499) ~[na:1.8.0_151]
     at com.hybris.datahub.service.impl.AbstractTargetItemService.createTargetItemsForCanonicalType(AbstractTargetItemService.java:116) ~[datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at com.hybris.datahub.service.impl.AbstractTargetItemService.lambda$createTargetItemsForPublication$1(AbstractTargetItemService.java:84) ~[datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[na:1.8.0_151]
     at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175) ~[na:1.8.0_151]
     at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1380) ~[na:1.8.0_151]
     at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481) ~[na:1.8.0_151]
     at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471) ~[na:1.8.0_151]
     at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_151]
     at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_151]
     at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499) ~[na:1.8.0_151]
     at com.hybris.datahub.service.impl.AbstractTargetItemService.createTargetItemsForPublication(AbstractTargetItemService.java:85) ~[datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at com.hybris.datahub.service.impl.PublicationActionHandler.lambda$createTargetItems$0(PublicationActionHandler.java:148) [datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at java.util.ArrayList.forEach(ArrayList.java:1255) ~[na:1.8.0_151]
     at com.hybris.datahub.service.impl.PublicationActionHandler.createTargetItems(PublicationActionHandler.java:145) [datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at com.hybris.datahub.service.impl.PublicationActionHandler.handlePublicationAction(PublicationActionHandler.java:118) [datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at com.hybris.datahub.service.impl.PublicationActionHandler.handleAction(PublicationActionHandler.java:95) [datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at com.hybris.datahub.service.impl.PublicationActionHandler.handleAction(PublicationActionHandler.java:69) [datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at com.hybris.datahub.command.impl.AbstractPerformCommand.lambda$execute$0(AbstractPerformCommand.java:56) [datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1626) ~[na:1.8.0_151]
     at com.hybris.datahub.service.ExceptionHandlingAsyncTaskExecutor$2.run(ExceptionHandlingAsyncTaskExecutor.java:79) ~[datahub-service-6.6.0.0-RC4.jar:6.6.0.0-RC4]
     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[na:1.8.0_151]
     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[na:1.8.0_151]
     at java.lang.Thread.run(Thread.java:748) ~[na:1.8.0_151]
 2018-08-03T08:00:59.968+0200 [INFO] [c.h.d.s.i.PublicationActionHandler] Failing publication action: 201
 
 

Accepted Solutions (0)

Answers (1)

former_member333910
Active Participant

Rajalekshmy, a few comments:

  • Your log shows you are running Data Hub version 6.6.0.0-RC4. Although I don't necessarily suspect your problem is directly related to the version, I highly recommend using the latest patch version of Data Hub for your major version. At this time, I believe the latest 6.6 Data Hub release is 6.6.0.7-RC1.

  • This error is thrown because the item type of the canonical item does not match a known item type in Data Hub; a minimal sketch of this failure mode follows this list. This should never happen, which leads me to suspect either something wrong with your Data Hub extension modeling or that one or more Data Hub extensions were not loaded at Data Hub startup. Was there anything in the startup log indicating a problem loading extensions?

  • You say this began after enabling clustering. From the minimal log excerpt you provided, I don't see any relation between this error and clustering. However, perhaps some cluster misconfiguration is indeed the root cause here.
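
For reference, here is a minimal, self-contained sketch of the failure mode from the second bullet. The type registry and the missing "CanonicalPartySales" entry are hypothetical stand-ins for whatever lookup CanonicalItemConverter.toCanonicalItem performs internally; the point is only that calling get() on a Guava Optional without first checking isPresent() throws exactly the IllegalStateException in your trace:

    import com.google.common.base.Optional;   // guava-17.0, as in the stack trace
    import java.util.HashMap;
    import java.util.Map;

    public class AbsentOptionalDemo
    {
        // Hypothetical stand-in for Data Hub's registry of known canonical types.
        private static final Map<String, String> KNOWN_TYPES = new HashMap<>();

        static
        {
            KNOWN_TYPES.put("CanonicalCustomer", "loaded");
            // "CanonicalPartySales" is deliberately absent here, mimicking an
            // extension that failed to load at startup.
        }

        public static void main(final String[] args)
        {
            final Optional<String> type =
                Optional.fromNullable(KNOWN_TYPES.get("CanonicalPartySales"));

            try
            {
                // Unguarded get() on an absent value -- the pattern that fails in the trace:
                type.get();
            }
            catch (final IllegalStateException e)
            {
                // Prints: Optional.get() cannot be called on an absent value
                System.out.println(e.getMessage());
            }

            // Defensive alternative: check presence before dereferencing.
            if (type.isPresent())
            {
                System.out.println("Resolved type: " + type.get());
            }
            else
            {
                System.out.println("Type not registered; check extension loading at startup.");
            }
        }
    }

So if 'CanonicalPartySales' is not known to Data Hub at publication time, the converter ends up on the unguarded branch. That is why I would start by checking the startup log for extension-loading errors.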

Recommendations:

  • Review your Data Hub log prior to this error. Were there any errors during startup?

  • Can you confirm that the problem stops when you disable clustering? If so, we need to look closely at your cluster configuration to see what might be causing this.

  • Please also post a larger excerpt of your log for us to review.