I am looking for advice on how to make a Shiny app recognizable by search engines.

I understand that the app remains inactive until someone clicks on it, which instantiates a session that lasts for whatever timeout you have configured. I understand that Google is unable to crawl the app's content, but the app cannot even be found by the search engine at all (not even the title, let alone the content).

An alternative I've looked at is embedding the app in an iframe on my static website, but my app has multiple tabs, so I don't really think this is possible!

I am hosting on shinyapps.io with a Basic plan.

I would also be interested in an answer!

We recently had another discussion about the extent to which search engines are able to visit sites on shinyapps.io: App deployed to shinyappsio keeps waking up with no user connection

So far it seems nobody really knows?!
However, when I search for my username in combination with the app name (the parts before and after shinyapps.io), nothing is found, although the app has been online for about 9 months.

I think it would be awesome if it were possible to upload a file together with the app that is served before the app even starts, telling crawlers what they can find there, or, conversely, telling them there is nothing of interest and they should move on.

I mean something like robots.txt or a sitemap, just containing the name of the app, the name of the author, and maybe a short description or a few keywords.
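For reference, the standard formats already cover part of this. A robots.txt that invites crawlers and points them at a sitemap, plus a minimal sitemap.xml, would look roughly like the following (the username and app name are placeholders, and whether shinyapps.io would serve such files per app is exactly the open question here):

```
# robots.txt -- allow all crawlers and advertise a sitemap
User-agent: *
Allow: /
Sitemap: https://username.shinyapps.io/appname/sitemap.xml
```

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml -- lists the app URL for crawlers -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://username.shinyapps.io/appname/</loc>
  </url>
</urlset>
```

The author name, description, and keywords would belong in HTML meta tags rather than in these files, since robots.txt and sitemaps only control discovery, not what text appears in results.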

RStudio developers, is this possible?

I found a workaround, but it is not really a solution.

I created an empty index.html that contains only an iframe with the app URL and hosted it on GitHub Pages (the gh-pages branch). I assume the crawler reaches the static web page, crawls the URL, and makes the URL of the actual app (not the static page) visible in the search engine. However, it seems impossible to add any description, keywords, or anything else to the Google results. I tried meta tags on both the static page and the app, but nothing happened.
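For anyone who wants to try the same workaround, the static page looked roughly like this; the username, app name, and meta tag contents are placeholders, and as noted above the meta tags did not end up in the Google results:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>My Shiny App</title>
  <!-- meta tags the crawler can read without starting the Shiny session -->
  <meta name="description" content="A short description of the app">
  <meta name="keywords" content="shiny, r, dashboard">
</head>
<body style="margin:0">
  <!-- the iframe points at the actual shinyapps.io deployment -->
  <iframe src="https://username.shinyapps.io/appname/"
          style="border:0; width:100vw; height:100vh"></iframe>
</body>
</html>
```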

Again, this is just my assumption of how things worked. I am really looking forward to a more sophisticated solution.

Is there a more recent update on this?

Hi,

I was facing the same problem with my Shiny app, and I just discovered via my Google Search Console account that the app was serving a file that blocks Google indexing, namely a "robots.txt" that prevents robots from exploring the app.
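For anyone curious what such a file looks like: a robots.txt that blocks all crawlers from the whole site is just two lines (I have not inspected the exact file shinyapps.io serves, but this is the standard blocking form):

```
User-agent: *
Disallow: /
```

You can see what any deployment actually serves by opening `https://username.shinyapps.io/appname/robots.txt` (placeholders) in a browser.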

I assume this default setting is meant to save the active hours available to individual apps on shinyapps.io; otherwise robots would use them up, and that would be sad.

Nevertheless, I don't know how to remove this "robots.txt" file, or whether we even can. I am going to email shinyapps.io support. I think Google indexing could bring new users to an app and encourage the owners to upgrade their plan (Starter -> Basic -> Standard ...), so everyone would be glad :slight_smile:

Hi all,

Unfortunately, at this time, there isn't a way to modify the robots.txt for shinyapps.io applications. We currently block crawling deliberately because we found that the robot crawlers would cause applications to start but would not follow the SockJS connections to get actual content. Because of this, we were paying the cost of starting the application, waiting for it to become idle, and then stopping it, without getting any benefit from the crawl.

That said, we are looking into options to allow this in some fashion in the future, and we'll see what we can do going forward.