The standard edition of Robot-Manager allows the user to effortlessly create a robots.txt file for their web site. This file helps direct search engines and other types of spiders to the appropriate pages once they arrive at your web site. Most spiders will gather only about 30% of your web site's content, so it's critical that you help them find the most appropriate pages on your site. At present, Robot-Manager can retrieve the directory structure of your web site from your local hard drive or via an FTP server. Once the file is created, Robot-Manager can upload it to the appropriate directory on your web server.
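For reference, a robots.txt file of the kind Robot-Manager generates follows the standard robots exclusion format: one or more User-agent lines, each followed by the Disallow rules that apply to it. The directory names below are hypothetical examples, not paths produced by Robot-Manager itself.

```
# Example robots.txt (directory names are illustrative)
# Rules for all spiders:
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/

# Rules for a specific crawler:
User-agent: Googlebot
Disallow: /drafts/
```

The file must be uploaded to the root directory of the web server (e.g., served at /robots.txt) for spiders to find it.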
Operating Systems: • Windows XP • Windows 2000 • Windows ME • Windows NT • Windows 98