You can make your sitemap available to Google either by adding it to your robots.txt file or by submitting it directly to Search Console, but the Sitechecker bot checks only the robots.txt file. If you have submitted the sitemap via Google Search Console but have not added it to robots.txt, this site-level issue will remain active.
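Adding the sitemap to robots.txt takes a single `Sitemap` directive. A minimal sketch is shown below; the URL is a placeholder for your own sitemap location:

```text
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap` directive can appear anywhere in the file and you can list several of them, one per line, if your site has multiple sitemap files.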
A sitemap is a file where you provide information about the pages, videos, and other files on your site, and the relationships between them. Search engines like Google read this file to more intelligently crawl your site.
A sitemap tells Google which pages and files you think are important in your site, and also provides valuable information about these files: for example, for pages, when the page was last updated, how often the page is changed, and any alternate language versions of a page.
If your site’s pages are properly linked, Google can usually discover most of your site. Even so, a sitemap can improve the crawling of larger or more complex sites, or more specialized files.
Fixing the “Robots.txt file does not contain a link to XML sitemap file” issue is good for your website, but it is not enough on its own to earn good SERP positions!
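If you want to verify the fix yourself, the check is straightforward: fetch robots.txt and look for a `Sitemap:` line. Below is a minimal sketch using only Python's standard library; it is an illustration of the idea, not Sitechecker's actual implementation, and `check_site` takes a placeholder base URL for your own domain.

```python
from urllib.request import urlopen


def robots_txt_has_sitemap(robots_txt: str) -> bool:
    """Return True if any line of the robots.txt text declares a Sitemap directive."""
    return any(
        line.strip().lower().startswith("sitemap:")
        for line in robots_txt.splitlines()
    )


def check_site(base_url: str) -> bool:
    """Fetch <base_url>/robots.txt and report whether it links to a sitemap."""
    with urlopen(base_url.rstrip("/") + "/robots.txt") as resp:
        return robots_txt_has_sitemap(resp.read().decode("utf-8", "replace"))
```

The directive match is case-insensitive and ignores leading whitespace, since crawlers accept `Sitemap:`, `sitemap:`, and indented lines alike.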
To detect not only this issue but other kinds of site-level and page-level problems, run a full site audit.