search - How to disallow specific pages in robots.txt, but allow everything else?


What's the way to do this? I have pages like:

   mydomaink.com/a/123/group/4
   mydomaink.com/a/xyz/network/google/group/1

I do not want them to appear on Google.
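
The robots.txt referred to in the answer below is not included in this copy. As a sketch, a file along these lines would disallow crawling of everything under /a/ while leaving the rest of the site open; the /a/ prefix is an assumption taken from the example URLs above:

   # robots.txt at the site root (sketch; the /a/ prefix is assumed from the example URLs)
   User-agent: *
   Disallow: /a/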

Your robots.txt looks correct. If you want to be 100% certain those pages stay out of the results, though, robots.txt on its own is not enough.

FYI, robots.txt is not guaranteed to keep pages out of the search results. It only prevents search engines from crawling those pages; they can still list them (for example, if other sites link to them). To prevent a page from being indexed and listed, use the X-Robots-Tag HTTP header instead.

If you use Apache, you can put an .htaccess file in your /a/ directory containing the following lines to effectively block those pages:

   <IfModule mod_headers.c>
       Header set X-Robots-Tag "noindex"
   </IfModule>
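
To confirm the header is actually being served, a quick check with curl works; the URL below is just one of the example paths from the question, so adjust it to a real page on your site:

   curl -I http://mydomaink.com/a/123/group/4
   # The response headers should include a line like:
   #   X-Robots-Tag: noindex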
