robots.txt with wildcard?
Posted by tranceformer, 01-08-2008, 07:05 PM

Googlebot is eating up about 30 GB of bandwidth per month. I want to prevent it (and other robots) from spidering:

http://www.mysite.com/directory1/page1.php

and all derivatives thereof (e.g. http://www.mysite.com/directory1/pag...riable1=value1).

Is this possible with robots.txt? How would the code look? Will this work?

User-agent: *
Disallow: /directory1/page1.php

Or do I need to specify a wildcard somehow, e.g.

User-agent: *
Disallow: /directory1/page1.php?*

Thanks for your help!
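For what it's worth, under the original robots.txt convention a Disallow rule is a path-prefix match, so the first form (`Disallow: /directory1/page1.php`) should already cover query-string variants like `page1.php?variable1=value1` without any wildcard; `*` inside a path is a later extension that some crawlers (Googlebot among them) honor but the original standard does not define. A quick sketch with Python's standard-library parser (which implements plain prefix matching) illustrates this; the URLs are the ones from the question:

```python
from urllib.robotparser import RobotFileParser

# The plain rule from the question, with no wildcard.
robots_txt = """\
User-agent: *
Disallow: /directory1/page1.php
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Disallow is a prefix match, so query-string variants are blocked too.
print(rp.can_fetch("*", "http://www.mysite.com/directory1/page1.php"))
print(rp.can_fetch("*", "http://www.mysite.com/directory1/page1.php?variable1=value1"))

# Other pages in the same directory remain crawlable.
print(rp.can_fetch("*", "http://www.mysite.com/directory1/page2.php"))
```

The first two checks come back False (blocked) and the third True (allowed), which is the behavior the poster wants from the simpler rule.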
Posted by David, 01-08-2008, 07:11 PM

Matt (if this is Matt!), you might have better luck on this forum: http://www.webmasterworld.com/forum93/page2.htm. They seem to be robots.txt gurus.