Protected pages and Google search
Author: Han W.
How can I prevent protected pages from being visited as the result of a Google search?
Hello,
In Step 2 you can select the protected page, open Page Properties, click the Expert tab, and uncheck "insert this page in the Sitemap".
Author
Hi Luca,
I do not see this option under the Expert tab: I have version 8... could it be that this option is only available in version 9?
Thanks, Han
In Version 8, go to:
Map Creation/Page Properties/Expert
Update Frequency: Never
Contents Priority: 0 - Very Low
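For reference, those two settings should come out in the generated sitemap.xml looking something like this (a sketch only — the page address is a placeholder, and the exact markup depends on your X5 version):
<url>
  <loc>http://www.yoursite.com/protected-page.html</loc>
  <changefreq>never</changefreq>
  <priority>0.0</priority>
</url>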
www.frankscybercafe.com
Author
Hi,
Thanks for your help...but one more question: does this solution result in not uploading the protected pages at all? I do want to upload the protected pages, but they should only be accessible via the password, not via a Google search. Does your solution take care of this?
Thanks!
Han
The settings above tell the search engines what you want them to do..... Any page in Sitemap Creation, whether "Locked" or "Hidden", will be uploaded!!
If your pages are "Locked", anyone trying to access these pages is confronted by the "Login" anyhow!!
www.frankscybercafe.com
Author
Hi,
Thanks for your help! Your last statement however doesn't seem to be correct. This was exactly the problem I ran into: a Google search on the content of a protected page brings you directly to that page without the login screen....
I'll try your solution and hopefully the issue is solved....thanks again!
If you have logged in and then gone looking via Google, you will be automatically let into the locked page.... (There is a logout button in V9 now.) Also, you shouldn't be able to gain entry AFTER you have shut down your browser. This usually works in V8!
Just have to try it like you said......
Author
Hi,
Looks like your solution solved the issue!
Thanks!!!!....
Generally, using a correctly configured robots.txt is considered best practice by Google.
In robots.txt you can specify what content you want search engine crawlers to read and which files and directories are off limits.
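For example, a minimal robots.txt placed in your site root (the folder name /restricted/ is just an example — use the folder your protected pages actually live in):
User-agent: *
Disallow: /restricted/
Every well-behaved crawler reads this file before indexing anything on the site.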
That is correct, Max M., but how many crawlers would you suspect bypass robots.txt?
Personally I wouldn't trust any of them!!
Would you, Max?
I do not know. I would trust Google. If we can't trust Google, then we are doomed.
;-)
Max, all the big names on the net have just revised their privacy policies.... Even Google was reported to have invaded our privacy!! Facebook is now facing prosecution because of this..... There's no "TRUST" on the net in my view!!
You are probably right.
A method that is going to work for sure would be to place all your restricted pages in a directory and then edit your .htaccess file.
For example, to restrict /home/yourhome/public_html/restricted_dir/,
create .htaccess in restricted_dir with the following content:
# Prompt shown in the browser's login dialog
AuthName "Restricted Area"
# Password file - kept outside public_html so it cannot be downloaded
AuthUserFile "/home/yourhome/mypasswords/passwd"
AuthType Basic
# Any user listed in the password file may enter
require valid-user
Also create passwd in /home/yourhome/mypasswords/
with users and encrypted passwords.
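If you have shell access, Apache's own htpasswd utility will create that file and encrypt the password for you (the path and the username "han" below are just examples matching the setup above):
htpasswd -c /home/yourhome/mypasswords/passwd han
Leave out -c when adding further users, so the existing file is not overwritten.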
There are good step-by-step instructions on how to do it at
http://www.addedbytes.com/lab/password-protect-a-directory-with-htaccess/
P.S.
There is an encrypt-password tool at the bottom of the page.
Look in the Control Panel for your server and it will do this for you automatically!!
(Assuming you have a reputable hosting company, that is!!)
www.frankscybercafe.com