
Accelerator SEO Directives and HTTPS

Former Member

Hi experts,

we were wondering why the accelerator by default does not allow indexing of its pages:

 <meta name="robots" content="noindex,follow">

Having a look at the documentation at https://help.hybris.com/6.3.0/hcd/8aee096786691014aff5a7f77ab200ff.html, it seems the SEO logic in accelerators up to 6.3 still expects the site to run on HTTP:

  • The page is a secure GET --> NoIndex, Follow

  • The page is an insecure GET --> Index, Follow

Since the accelerator has defaulted entirely to HTTPS for a couple of versions now (which definitely makes sense in terms of security and Google penalties), we are wondering whether it is intended to change this dated default behavior in an upcoming version?
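For anyone following along, the documented rule boils down to a small decision. This is a hypothetical, self-contained sketch of that mapping, not the actual accelerator handler; the method and class names here are made up for illustration, and the POST/non-GET branch is an assumption about sensible defaults:

```java
// Sketch of the documented robots rule (hypothetical names, not hybris code):
// secure GET  -> noindex,follow
// insecure GET -> index,follow
public class RobotsDirectiveSketch {
    public static String robotsFor(String httpMethod, boolean secure) {
        if (!"GET".equalsIgnoreCase(httpMethod)) {
            // Assumption: non-GET responses should never be indexed.
            return "noindex,nofollow";
        }
        // This is exactly the branch that hurts once the whole site is HTTPS:
        // every page is a "secure GET" and therefore gets noindex.
        return secure ? "noindex,follow" : "index,follow";
    }
}
```

Note that once a site serves everything over HTTPS, the first return path is effectively dead code and every GET lands on the `noindex,follow` branch, which is exactly the behavior being questioned here.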

Regards Norbert

andyfletcher
Active Contributor

Yeah, it seems like a bug in the SeoRobotsFollowBeforeViewHandler.

We always end up changing this file to suit our needs at the start of each project.

I would have raised it with Hybris but I don't know how to raise tickets anymore since they stopped using Jira and moved to SAP! :(

Former Member

Hi Andrew, thanks for your reply!

I'm just wondering what a meaningful fix could look like. Since HTTP/HTTPS can no longer be used as the criterion: do you rely on Spring Security, or simply set index,follow for HTTPS pages, too?

Regards, Norbert

andyfletcher
Active Contributor

Yeah, we just allow indexing and following for HTTPS pages (unless there are specific reasons for not doing so, e.g. duplicate content).

Basically removing this bit:

 if (request.isSecure())
 {
     robotsValue = ThirdPartyConstants.SeoRobots.NOINDEX_FOLLOW;
 }
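To make that concrete, here is a hedged sketch of what the decision looks like once the `isSecure()` branch is dropped, assuming the handler otherwise defaults GET requests to index,follow. The class and method names are invented for this example, and the `ThirdPartyConstants.SeoRobots` constants are represented as plain strings so the snippet stays self-contained:

```java
// Hypothetical sketch of the patched rule, NOT the actual handler:
// with the request.isSecure() branch removed, HTTPS pages are
// indexable just like HTTP pages used to be.
public class PatchedRobotsSketch {
    public static String robotsFor(String httpMethod) {
        if ("GET".equalsIgnoreCase(httpMethod)) {
            // No secure-request check anymore; all GET pages are indexable.
            return "index,follow";
        }
        // Assumption: non-GET responses still stay out of the index.
        return "noindex,nofollow";
    }
}
```

Pages with duplicate-content concerns would then need to opt out individually (e.g. via CMS page attributes or canonical tags) rather than relying on the blanket HTTPS check.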