on 2017 May 12, 9:38 AM
Hi experts,
we were wondering why the accelerator by default does not allow indexing of its pages:
<meta name="robots" content="noindex,follow">
Having a look at the documentation at https://help.hybris.com/6.3.0/hcd/8aee096786691014aff5a7f77ab200ff.html, it seems the SEO logic in accelerators up to 6.3 still expects the site to run on HTTP:
The page is a secure GET --> NoIndex, Follow
The page is an insecure GET --> Index, Follow
Since the accelerator has defaulted entirely to HTTPS for a couple of versions now (which definitely makes sense in terms of security and Google penalties), we are wondering whether this dated default behavior will be changed in an upcoming version?
Regards, Norbert
Yeah, this seems like a bug in the SeoRobotsFollowBeforeViewHandler.
We always end up changing this file to suit our needs at the start of each project.
I would have raised it with hybris, but I don't know how to raise tickets anymore since they stopped using Jira and moved to SAP! :(
Yeah, we just allow indexing and following for HTTPS pages (unless there are specific reasons not to, e.g. duplicate content).
Basically removing this bit:

if (request.isSecure())
{
    robotsValue = ThirdPartyConstants.SeoRobots.NOINDEX_FOLLOW;
}
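To illustrate the effect of dropping that check, here is a minimal self-contained sketch (not the actual hybris handler; the constant values mirror the robots meta directives from the thread, and the method/class names are made up for illustration). It decides the robots value for a request without forcing noindex on secure pages:

```java
// Sketch only: stand-ins for ThirdPartyConstants.SeoRobots.* (names assumed)
public class RobotsValueSketch {

    static final String INDEX_FOLLOW = "index,follow";
    static final String NOINDEX_FOLLOW = "noindex,follow";

    // Unlike the stock handler, this no longer returns NOINDEX_FOLLOW
    // merely because the request is secure (HTTPS); only non-GET
    // requests stay out of the index.
    static String robotsValue(String httpMethod, boolean secure) {
        if ("GET".equalsIgnoreCase(httpMethod)) {
            return INDEX_FOLLOW; // index both HTTP and HTTPS GET pages
        }
        return NOINDEX_FOLLOW;   // e.g. POST results should not be indexed
    }

    public static void main(String[] args) {
        // A secure GET is now indexable, which matches the fix above
        System.out.println(robotsValue("GET", true));
        System.out.println(robotsValue("POST", true));
    }
}
```

In the real project this logic would live in your override of SeoRobotsFollowBeforeViewHandler, which writes the chosen value into the `<meta name="robots">` tag.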