WebSite X5 Help Center

 
Joe H.
User

What is the purpose of the 'files' folder that appears in the destination folder?


I have duplicate web pages: a page located in the 'files' folder is also found in the default/main directory. Why is this? Is it affecting my SEO when I'm told my site has duplicate pages? Google doesn't seem to like it; I've been notified about canonical issues and duplicate content, and it all points back to what I mentioned above.

Can I delete the page from the 'files' folder and be OK?   

Another question: I see folders/files when I look in the destination folder (seen when I go to export/download changes). How can I have more control over what I see/change/add/delete in it?

3 REPLIES
Esahc ..
Moderator

Joe, there are two ways you can get duplicate pages in WX5. The most obvious is when there are duplicates in your project: I fell foul of Google SEO some time back when I duplicated a number of pages to make life easier for myself, but neglected to edit the properties of each duplicate page.

If you go to step 5 and export the project to disk, you can browse through the output to look for these so-called duplications, as well as check the contents of the various folders generated by WX5.

The other way I have found they can occur is when you delete pages in your project and then update your site (WX5 is also prone to leaving orphaned folders behind when optional objects are updated or replaced). WX5 will replace modified pages and add new pages, but it will not delete any pages or files already on your hosting from an earlier upload.

In the past I have deleted all the files in the public folder on the host and then carried out a full upload to remove these redundant files.
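
If you would rather script that cleanup than click through an FTP client, a rough sketch using Python's standard ftplib is below. This is not something WX5 does for you, and the host, login and folder name are placeholders to replace with your own:

    from ftplib import FTP, error_perm

    def clear_folder(ftp, path):
        # Recursively delete everything inside 'path' on the server.
        for name in ftp.nlst(path):
            # Some servers return full paths, others bare names; normalise.
            full = name if "/" in name else path + "/" + name
            if full.rsplit("/", 1)[-1] in (".", ".."):
                continue
            try:
                ftp.delete(full)         # succeeds if 'full' is a file
            except error_perm:
                clear_folder(ftp, full)  # a directory: empty it first...
                ftp.rmd(full)            # ...then remove it

    ftp = FTP("ftp.example.com")         # placeholder host
    ftp.login("user", "password")        # placeholder credentials
    clear_folder(ftp, "/public_html")    # placeholder public folder
    ftp.quit()

Once the folder is empty, run a full upload from step 5 so WX5 regenerates only the files the current project actually needs.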

As for the folders in your public directory: generally these are created by WX5 and hold the required resources for your website. It is unwise to delete individual files inside these folders, but deleting all of the folders and doing a full upload will repopulate each with only the currently required files.

Joe H.
User
Author

Can I use my robots.txt to tell Google I don't want these files/folders crawled? Example included. Google has been coming up with all these errors from my site lately.

Esahc ..
Moderator

Joe, this should stop it, but would you want to? On some of my sites the files folder holds many PDFs and some pictures, and I want all of those files to be searchable.
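
For reference, the kind of rule Joe describes would be a couple of lines in robots.txt at the site root, along the lines of the sketch below; the /files/ path is an assumption based on WX5's default folder name:

    User-agent: Googlebot
    Disallow: /files/

Using "User-agent: *" instead would block all well-behaved crawlers from that folder, not just Google.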

It would be better to clean out the folder and do a full upload so that only the required (current) files are left in the files folder.
