Hiding website from search engines until future date

I am working on an “Archive” that will be used internally and only via the Kirby panel for the next 6 months (i.e. archive.com/panel). After that, the Archive will become a public-facing website with hundreds of pages.

Until the templates are built and the website is launched to the public, there should ideally be no public pages and nothing indexed by Google.

Because each page will still need a template, should I make a default template that is just a “Coming Soon” page? Or would it be better to set up a redirect so that I don’t end up with duplicate content?
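
If I go the template route, I imagine the default template would be little more than a static page. A rough sketch, assuming the usual site/templates/default.php location:

```php
<?php // site/templates/default.php: temporary "Coming Soon" placeholder ?>
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title><?= $site->title()->html() ?></title>
  <meta name="robots" content="noindex, nofollow">
</head>
<body>
  <h1>Coming Soon</h1>
  <p>This archive will be publicly available later.</p>
</body>
</html>
```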

I could use the `<meta name="robots" content="noindex" />` tag; however, I am unsure what effects this may have in the long term, since good SEO will matter once the site goes public.
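
If I went the meta tag route, my plan would be to toggle it from a single config option so it can be switched off in one place at launch. A rough sketch, assuming Kirby 3’s option() helper and a custom “comingsoon” key in site/config/config.php (on Kirby 2 it would be c::get() instead):

```php
<?php
// site/snippets/header.php (excerpt)
// 'comingsoon' is a custom key I would define in site/config/config.php,
// e.g. return ['comingsoon' => true]; flip it to false at launch.
if (option('comingsoon', false)): ?>
  <meta name="robots" content="noindex, nofollow">
<?php endif ?>
```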

I understand this is more of a broad web dev question, but I thought I would post it here in case there are any Kirby-specific solutions, or in case someone else has dealt with something similar.

If you put the site behind basic auth, nothing will leak into search results. I wouldn’t rely on meta robots or robots.txt.
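
On Apache, that can be as little as a few lines of .htaccess (the .htpasswd path below is a placeholder; the file itself can be created with the htpasswd utility and should live outside the web root):

```apache
# Protect the whole site with HTTP basic auth while it is private.
AuthType Basic
AuthName "Archive (private preview)"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

The panel stays reachable at /panel; editors just enter the basic auth credentials first and then log in as usual, and the block can be removed entirely at launch.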

Thanks @texnixe, that sounds like a good way to go.