WebSite X5 Help Center

 
Adrian B.
User

Exclude cart card contents from robots

Author: Adrian B.
Visited 633, Followers 1, Shared 0

My SEO tools are flagging duplicate content residing in the description area of the cart cards.

Any way to block that with a line added to robots.txt?
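For reference, the kind of robots.txt rule I have in mind would look something like this (the /cart/ path is only an example from my setup; the real path to the cart cards will differ per site):

User-agent: *
Disallow: /cart/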

Sample attached

Thank you

Posted on
1 REPLY
Daniel W.
User

If two products are almost the same, why shouldn't their descriptions be almost the same too, unless you want to play a word-swapping game?

I'm not an SEO professional, but I searched Google about this once. Here is a result that I think answers the question well.

----- Translated from the German original with Google -----

How you should deal with duplicate content on your website

Don't set URLs to noindex and don't block anything using robots.txt!

That would be the worst thing you could do, because Google is very good at detecting duplicate content and, above all, at dealing with it. If the search engine finds several versions of a page, the one that Google favors is included in the index. Google decides on one version.


>> https://seoagentur-hamburg.com/glossar/was-ist-duplicate-content/

----------------------
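
To be clear, the "noindex" the article talks about is usually set with a robots meta tag in a page's <head>, something like this:

<meta name="robots" content="noindex">

The article's advice is that you don't need this tag, and no robots.txt rule either, because Google picks one version to index on its own.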

Posted by Daniel W.