# Google Lighthouse erratically flags robots.txt as invalid

But that's totally bs. Here's the file Lighthouse complains about:

```
User-agent: *
Disallow: /stamboom/

User-agent: *
Disallow: /donotvisit.php

User-agent: Googlebot
Disallow: /

User-agent: Google-Read-Aloud
Disallow: /

User-agent: Storebot-Google
Disallow: /

User-agent: Google-InspectionTool
Disallow: /

User-agent: GoogleOther
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: APIs-Google
Disallow: /

User-agent: AdsBot-Google-Mobile
Disallow: /

User-agent: AdsBot-Google
Disallow: /

User-agent: Mediapartners-Google
Disallow: /

User-agent: Google-Web-Preview
Disallow: /

User-agent: Bingbot
Disallow: /

User-agent: AdIdxBot
Disallow: /

User-agent: BingPreview
Disallow: /

User-agent: DuckDuckBot
Disallow: /

User-agent: MicrosoftPreview
Disallow: /

User-agent: msnbot
Disallow: /

User-agent: FacebookBot
Disallow: /

User-agent: facebookexternalhit
Disallow: /

User-agent: facebookcatalog
Disallow: /

User-agent: InstagramBot
Disallow: /

User-agent: Instagram-Embed
Disallow: /

User-agent: meta-webindexer
Disallow: /

User-agent: meta-externalads
Disallow: /

User-agent: meta-externalagent
Disallow: /

User-agent: meta-externalfetcher
Disallow: /

User-agent: OAI-SearchBot
Disallow: /

User-agent: GPTBot
Disallow: /

User-agent: AliyunSecBot
Disallow: /

User-agent: Baiduspider
Disallow: /

User-agent: Baiduspider-image
Disallow: /

User-agent: Baiduspider-video
Disallow: /

User-agent: Baiduspider-news
Disallow: /

User-agent: Baiduspider-favo
Disallow: /

User-agent: Baiduspider-ads
Disallow: /

User-agent: Baiduspider-cpro
Disallow: /

User-agent: GenomeCrawlerD
Disallow: /

User-agent: NetcraftSurveyAgent
Disallow: /

Sitemap: https://thewww.top/sitemapindex.xml
Sitemap: https://stamboom.top/sitemap.txt
Sitemap: https://stamboom.top/sitemap.php
Sitemap: https://stamboom.top/sitemap.xml
Sitemap: https://stamboom.top/sitemap-1.txt
Sitemap: https://stamboom.top/sitemap-2.txt
Sitemap: https://stamboom.top/sitemap-3.txt
```
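One way to sanity-check the file is to feed it to Python's standard-library robots.txt parser. This is a minimal sketch using a trimmed excerpt of the rules above (the `SomeOtherBot` agent name is a made-up placeholder for any crawler not listed in the file); if the stdlib parser accepts the rules and applies them as intended, Lighthouse's "invalid" flag looks like a false positive:

```python
from urllib.robotparser import RobotFileParser

# Trimmed excerpt of the robots.txt above, just enough to
# exercise both the wildcard group and a per-agent block.
robots = """\
User-agent: *
Disallow: /stamboom/

User-agent: Googlebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots.splitlines())

# Googlebot matches its own group and is blocked from everything.
print(parser.can_fetch("Googlebot", "https://stamboom.top/"))

# An unlisted agent falls back to the * group: the root is
# allowed, but /stamboom/ is not.
print(parser.can_fetch("SomeOtherBot", "https://stamboom.top/"))
print(parser.can_fetch("SomeOtherBot", "https://stamboom.top/stamboom/page"))
```

`urllib.robotparser` is lenient by design, so passing here doesn't prove spec compliance, but it does show the rules parse and match as written.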