For starters, your robots.txt is WRONG! This could be causing Googlebot issues crawling your blog. Those aren’t sitemap files! Go into your blog’s settings & turn off “Custom robots.txt”. Blogger knows better than you do what should & should not be crawled.
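For reference, a sitemap entry in robots.txt is a Sitemap line pointing at an XML sitemap file, not an ordinary page URL. A minimal sketch, with a placeholder blog address:

    # yourblog.blogspot.com is a placeholder; Blogger generates these sitemap files itself
    Sitemap: https://yourblog.blogspot.com/sitemap.xml
    Sitemap: https://yourblog.blogspot.com/sitemap-pages.xml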
Hello, here’s some info from the official Google page:
Crawling and indexing are processes which can take some time and which rely on many factors. In general, we cannot make predictions or guarantees about when or if your URLs will be crawled or indexed.
Keep in mind that while a sitemap file can help us learn about your site, it does not guarantee indexing or increase your site’s ranking.
source: Google Search crawling and indexing FAQ
Your robots.txt file is invalid. Go to your dashboard https://www.blogger.com -> Settings tab -> Crawlers and indexing -> disable “Enable custom robots.txt”. The default setting is optimal.
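For context, once “Enable custom robots.txt” is off, Blogger serves an auto-generated file that typically looks like this (blog address is a placeholder):

    User-agent: Mediapartners-Google
    Disallow:

    User-agent: *
    Disallow: /search
    Allow: /

    Sitemap: https://yourblog.blogspot.com/sitemap.xml

It blocks only the internal /search result pages and points crawlers at the sitemap Blogger maintains for you.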
You can also read these:
URL Inspection Tool
Ask Google to recrawl your URLs