Are you waiting for your new site to get indexed?
Google says you don’t have to wait long.
All you have to do is go into Google Search Console and submit your updated robots.txt file for reprocessing.
How to Force Robots.txt File Reprocessing
According to Google’s John Mueller, all you have to do is open Search Console, navigate to the Crawl section, and click “robots.txt Tester.” From there you can submit your new, updated robots.txt file for reprocessing.
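Before asking Google to reprocess the file, it can also help to confirm that the rules in your updated robots.txt actually allow crawling of the pages you care about. Below is a minimal sketch using Python’s standard-library urllib.robotparser; the domain and paths are placeholders for illustration, not part of Google’s workflow.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical site URL used for illustration only.
ROBOTS_URL = "https://www.example.com/robots.txt"

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetches and parses the live robots.txt

# Check whether Googlebot is allowed to crawl a few sample paths.
for path in ("/", "/blog/", "/private/"):
    allowed = parser.can_fetch("Googlebot", f"https://www.example.com{path}")
    print(f"Googlebot {'can' if allowed else 'cannot'} fetch {path}")
```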
Mueller’s comment appeared in the Webmaster Central Help forum last Friday, in a thread where a user’s brand-new website was not showing in Google’s search results even though they had already submitted a robots.txt file.
Here’s the whole comment:
One small thing you can do to force a change in the robots.txt file to be reprocessed (usually we do this about once a day, depending on the website) is to use the robots.txt testing tool in Search Console. There you’ll see the current known version, and you can submit your new one for reprocessing, if you’ve since made changes.
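Since the tester works against the version of the file Google can fetch, it is worth confirming that your updated robots.txt is actually live on the server before resubmitting it. One quick way to check, shown here as a sketch with a placeholder domain, is to fetch it directly and compare it with what the tester shows:

```python
import urllib.request

# Hypothetical URL; replace with your own domain.
ROBOTS_URL = "https://www.example.com/robots.txt"

# Fetch the robots.txt that is currently live, i.e. the version
# crawlers (and the Search Console tester) will see.
with urllib.request.urlopen(ROBOTS_URL, timeout=10) as response:
    live_robots = response.read().decode("utf-8", errors="replace")

print(live_robots)
```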
In case you want to read the comment in context, here’s the full thread: https://productforums.google.com/forum/#!topic/webmasters/McFuFb4iOoY/discussion