Build a correct robots.txt with crawl rules and a pointer to your sitemap. A single wrong directive can deindex an entire site; the minimal file below stays deliberately safe and valid.
# Allow every crawler to access every page
User-agent: *
Disallow:
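Since the opening line also mentions your sitemap, the same file can carry a Sitemap directive, plus a Disallow for anything you genuinely want uncrawled. A hedged sketch; example.com and /private/ are placeholders, not values from the original:

# Placeholder example: block one hypothetical directory, announce the sitemap
User-agent: *
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml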
Upload as /robots.txt at your domain root.
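Before you upload, you can sanity-check the rules with Python's standard-library urllib.robotparser. A minimal sketch, assuming the permissive file above and a placeholder domain:

from urllib.robotparser import RobotFileParser

# The minimal permissive rules from above, as raw lines.
rules = [
    "User-agent: *",
    "Disallow:",
]

rp = RobotFileParser()
rp.parse(rules)

# An empty Disallow blocks nothing, so this prints True.
# example.com is a placeholder for your own domain.
print(rp.can_fetch("Googlebot", "https://www.example.com/any-page"))

If this prints False for a URL you expect crawlers to reach, fix the rules before the file goes live.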
JW Digital audits crawl rules, indexation and technical SEO so your pages are found — not accidentally blocked.