WebSite X5 Help Center

 
Adrian B.
User

Exclude cart card contents from robots

Author: Adrian B.

My SEO tools are flagging duplicate content residing in the description area of the cart cards.

Any way to block that with a line added to robots.txt?
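
For reference, the kind of robots.txt rule meant here would look roughly like this (the /cart/ path is only a placeholder; the actual URLs depend on how the cart cards are published):

User-agent: *
Disallow: /cart/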

Sample attached

Thank you

1 ANSWER
Daniel W.
User
Best User of the Month EN

If two products are almost the same, why shouldn't their descriptions be almost the same, unless you want to play a word-swap game with the text?

I'm not an SEO professional, but I searched Google about this once. Here is what I think is a relevant result.

----- Quoted (translated from German with Google) -----

How you should deal with duplicate content on your website

Don't set URLs to noindex and don't block anything using robots.txt!

That would be the worst thing you could do, because Google is very good at detecting duplicate content and, above all, at dealing with it. If the search engine finds several versions of a page, the one that Google favors is included in the index. Google decides on one version.


>> https://seoagentur-hamburg.com/glossar/was-ist-duplicate-content/

----------------------
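
For clarity, the noindex mentioned above refers to a robots meta tag in a page's head section, roughly like this (shown only so it's clear what the article advises against):

<meta name="robots" content="noindex">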

Posted by Daniel W.