SEO only index.php

Hi,
My issue is that I lunched a website, but Google is only indexing the subpages projects/__.php. Do you know a trick to get only index.php referenced? I did some research and found this code, what do you think of it?

<FilesMatch "\.php$">
    Order Allow,Deny
    Deny from all
</FilesMatch>
<FilesMatch "index[0-9]?\.php$">
    Order Allow,Deny
    Allow from all
</FilesMatch>
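While researching I also read that Apache 2.4 replaces the old Order/Allow/Deny directives with Require, so if I understand correctly the 2.4 equivalent would be something like this (untested):

<FilesMatch "\.php$">
    Require all denied
</FilesMatch>
<FilesMatch "index[0-9]?\.php$">
    Require all granted
</FilesMatch>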

Nice typo, hope it was delicious :joy:

Sorry, no answer to your question…


Yeah, that’s the tragedy of us French people… I should have said ‘launched’ or ‘initiated’.
OK, I’m waiting to see if anyone can give me an answer.

I don’t quite see why you want to index the index.php file; isn’t your site accessed with just the domain, https://example.com? If you don’t want the subpages to be indexed at all (why not?), use a robots.txt and/or reroute the subpages.

I’m using the robots.txt for Kirby 2.0.x. How do I disallow the subpages?

User-agent: *
Disallow: /content/*.txt$
Disallow: /kirby/
Disallow: /site/
Disallow: /panel/
Disallow: /*.md$
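If I understand correctly, I could also disallow the project subpages by adding a line like this (assuming they all live under a /projects/ folder, which may not match my actual structure):

Disallow: /projects/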

Maybe this article is interesting for you: https://yoast.com/prevent-site-being-indexed/

OK, thanks, I’ll try the X-Robots-Tag as recommended.
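If I read the article right, something like this in the .htaccess should send the header only for the subpages (just a sketch: it needs Apache 2.4 with mod_headers enabled, and assumes the subpages live under /projects/):

<If "%{REQUEST_URI} =~ m#^/projects/#">
    Header set X-Robots-Tag "noindex"
</If>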

While a robots.txt tells search engines not to crawl some of the pages, I don’t like to add critical URIs there.

For example, I would not add Disallow: /content/*.txt$. Even though search engines are asked not to crawl these pages, they are not actually blocked in any way. robots.txt is a file that is open to everyone. If a hacker visits it, you are telling them that there are probably text files in the /content/ folder and, for example, that the site runs on Kirby CMS.

Yes, Jens is right. Kirby already blocks direct access to all files in the content and site folders via the .htaccess file, so there is in fact no need to do that in the robots.txt. In addition, you should make sure that directory listings are disabled on your server, because this does not seem to be the default setting with all hosting providers.
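On Apache, directory listings can usually be disabled with a single line in the .htaccess (assuming your host allows overriding this option):

Options -Indexes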