I was recently involved in a project using machine translation of a website (in a design role, so I don’t know much about the DeepL integration itself; not a Kirby project). From what I observed, this can get very complex really quickly.
Since translations often require a bit of cleanup (even if just of the markup), it is essential to have a system in place where changing one sentence on a page does not erase all the manual fixes made to the rest of the translated content. That means detecting changed sentences, sending only those to the translator (which also saves money; AFAIK DeepL charges per character), and then marking those sections as “not yet manually checked” in the translated version.
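To illustrate the change-detection part: one simple approach is to hash each source-language segment and only re-translate segments whose hash is new. A rough Python sketch (the sentence-splitting regex and the function names are my own assumptions, not anything DeepL- or Kirby-specific; a real pipeline would use a proper sentence segmenter):

```python
import hashlib
import re


def segment(text):
    """Naive sentence splitter -- purely illustrative; a real
    pipeline would use a proper segmentation library."""
    return [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]


def digest(sentence):
    return hashlib.sha256(sentence.encode("utf-8")).hexdigest()


def changed_segments(new_text, stored_hashes):
    """Return (segments needing translation, updated hash list).

    stored_hashes are the digests saved alongside the previous source
    text. Any segment whose hash is unknown must be (re-)sent to the
    translator and flagged 'not yet manually checked' in the target
    language version.
    """
    known = set(stored_hashes)
    new_segs = segment(new_text)
    to_translate = [s for s in new_segs if digest(s) not in known]
    return to_translate, [digest(s) for s in new_segs]
```

The stored hashes would live next to the content (e.g. in a hidden field or a sidecar file), so only edited sentences generate API traffic.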
Just thinking out loud:
If all you need is a fully automated translation of the final result (and cost is not an issue due to small scale and ideally rarely changing content and design), the easiest option would probably be to translate, cache and send out the final rendered HTML. One approach could be to handle the “translated” URLs via a route, which then returns pre-generated translated versions from such a cache. Not very economical, though, as every change to headers/footers etc. would require a complete re-translation of all pages; this is more an approach for a site that is barely ever updated (e.g. an e-book-type publication or the like).
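The route-plus-cache idea could be sketched like this (language-agnostic pseudo-route in Python; `render` and `translate` are hypothetical injected callables standing in for Kirby’s page rendering and a tag-aware DeepL call, and the in-memory dict stands in for a persistent cache):

```python
# Stands in for a persistent cache (file- or database-backed in practice).
CACHE = {}


def serve_translated(path, lang, render, translate):
    """Hypothetical route handler for 'translated' URLs.

    render(path)          -> primary-language HTML (assumption)
    translate(html, lang) -> translated HTML, e.g. via DeepL with
                             tag handling enabled (assumption)

    Only the first request per (path, lang) hits the API; subsequent
    requests are served from the cache. Note the downside mentioned
    above: a change to shared chrome (header/footer) invalidates
    nothing automatically, so you'd have to flush and re-translate
    everything.
    """
    key = (path, lang)
    if key not in CACHE:
        CACHE[key] = translate(render(path), lang)
    return CACHE[key]
```

Invalidation on content change (e.g. via a hook that deletes the affected cache keys) is where the real complexity would sit.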
The most elegant version would be to set up the target languages in Kirby and use page create/change hooks to update the other language versions from the primary language’s content on the fly (depending on the speed of the API, potentially with a backlog/cronjob setup instead of doing it synchronously). So you would actually fill the Kirby content files in the various languages with the auto-translated content, field by field. A significant challenge is that DeepL cannot deal with Markdown/Kirbytext, i.e. the content would have to be transformed to XML/HTML first and then converted back when the API’s response comes in.
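The Markdown-to-HTML round trip per field might look roughly like this. (In Kirby the hook itself would be PHP; this is a language-agnostic Python sketch. The converters here handle only `**bold**` for illustration; a real pipeline would use full Markdown/Kirbytext converters in both directions, and `translate_html` stands in for a tag-aware API call such as DeepL with tag handling enabled.)

```python
import re


def md_to_html(md):
    """Illustrative subset: converts only **bold** to <b>...</b>."""
    return re.sub(r"\*\*(.+?)\*\*", r"<b>\1</b>", md)


def html_to_md(html):
    """Inverse of the subset above: <b>...</b> back to **bold**."""
    return re.sub(r"<b>(.+?)</b>", r"**\1**", html)


def translate_field(md, lang, translate_html):
    """Translate one content field while preserving its markup.

    translate_html(html, lang) is an injected stand-in for the API
    call; because the translator only touches text nodes and leaves
    tags alone, the markup survives the round trip.
    """
    return html_to_md(translate_html(md_to_html(md), lang))
```

A create/change hook would run `translate_field` over each text field of the edited page and write the results into the corresponding language content files.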