Robots.txt serves roughly the opposite purpose of a sitemap: while a sitemap lists the pages that should be included, robots.txt tells crawlers which pages and directories should be excluded. Having a robots.txt file for each site is therefore necessary. When a web crawler scans a site, it first locates the robots.txt file at the root level, reads its rules, and only then knows which files and directories it may not crawl.
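As a rough illustration (the paths below are hypothetical and example.com is a placeholder), a minimal robots.txt placed at the site root could look like this, blocking a couple of directories from crawling and pointing crawlers to the sitemap:

    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/
    Sitemap: https://www.example.com/sitemap.xml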
Speed Checkup Tools
Check the speed of a web page from different browsers and locations
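At its core, a speed check times how long a full page fetch takes. A minimal sketch of that idea, assuming Python with the third-party requests library and example.com as a placeholder URL (real tools repeat the measurement from several locations and browsers to get comparable numbers):

    import time
    import requests

    def measure_load_time(url: str) -> float:
        """Return the seconds taken to download the page body once."""
        start = time.perf_counter()
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        return time.perf_counter() - start

    if __name__ == "__main__":
        # example.com is a placeholder target
        print(f"Load time: {measure_load_time('https://example.com'):.2f} s")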
Uptime Checkup Tools
Check the uptime status of a website, its server, and its services
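The heart of an uptime check is a lightweight request that only inspects the HTTP status code. A minimal sketch, again assuming the requests library and a placeholder URL:

    import requests

    def is_up(url: str) -> bool:
        """Return True if the site answers with an HTTP status below 400."""
        try:
            response = requests.head(url, timeout=10, allow_redirects=True)
            return response.status_code < 400
        except requests.RequestException:
            return False

    if __name__ == "__main__":
        print(is_up("https://example.com"))  # example.com is a placeholder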
DNS Checkup Tools
A specialized tool to check a website's DNS records instantly and online
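Under the hood, such a check boils down to a DNS lookup. A small sketch using only Python's standard socket module (example.com is a placeholder hostname):

    import socket

    def resolve(hostname: str) -> list[str]:
        """Return the IPv4 addresses the hostname currently resolves to."""
        # getaddrinfo consults the system resolver, so this reflects live DNS.
        infos = socket.getaddrinfo(hostname, None, family=socket.AF_INET)
        return sorted({info[4][0] for info in infos})

    if __name__ == "__main__":
        print(resolve("example.com"))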
SEO Checkup Tools
Review the site's SEO and compare it against rival sites
WHOIS Checkup Tools
Check a domain's registration status and display its registration profile
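WHOIS data is served over a simple plain-text protocol on TCP port 43. A minimal sketch of a raw query in Python, assuming whois.verisign-grs.com (the registry server for .com/.net) and example.com as a placeholder domain:

    import socket

    def whois_query(domain: str, server: str = "whois.verisign-grs.com") -> str:
        """Send a raw WHOIS query on port 43 and return the plain-text reply."""
        with socket.create_connection((server, 43), timeout=10) as sock:
            sock.sendall(f"{domain}\r\n".encode())
            chunks = []
            while True:
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        return b"".join(chunks).decode(errors="replace")

    if __name__ == "__main__":
        print(whois_query("example.com"))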
IP Checkup Tools
Look up the IP address behind a website or server and display its details
SSL Checkup Tools
Check the SSL/TLS configuration and decode SSL certificates and CSRs in PEM format
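One common part of such a check is reading the certificate a server presents and its expiry date. A small sketch using Python's standard ssl and socket modules (example.com and port 443 are placeholders):

    import socket
    import ssl
    from datetime import datetime, timezone

    def certificate_expiry(hostname: str, port: int = 443) -> datetime:
        """Return the expiry time of the certificate served by hostname:port."""
        context = ssl.create_default_context()
        with socket.create_connection((hostname, port), timeout=10) as sock:
            with context.wrap_socket(sock, server_hostname=hostname) as tls:
                cert = tls.getpeercert()
        # 'notAfter' looks like 'Jun  1 12:00:00 2026 GMT'
        expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
        return expires.replace(tzinfo=timezone.utc)

    if __name__ == "__main__":
        print(certificate_expiry("example.com"))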
XML Sitemap Generator
Google Cache Checker
Link Analyzer
Email Privacy
Color Picker