RStudio Cloud - Trouble opening Project & Connection Timeout

Hi,

For the past few days I have had problems opening my (private) project on Posit Cloud. Usually, the "delete -> restore" workaround works; unfortunately, today even that didn't help.

I tried relaunching the project a few times, and it opened once. After it opened, I went to the Global Settings, which caused a total crash and showed me a "Connection Timeout".

[screenshot: Connection Timeout error]

Now, right after the crash, the project won't open anymore.

Could you please have a look into this?

EDIT: Opening the project works again and I no longer get the Connection Timeout. I trashed and restored the project a few more times.

Thanks and cheers
Alex

Hi all,

I don't want to open a new topic, but the problem still comes back every day. Yesterday I was never able to access my project, and this morning it doesn't look any better. Now I can't access any project in my workspace, even ones that contain just one script with three lines of code.

Could you please look into this?

EDIT: Five hours later, I can access the project again, but running any script results in various error messages:

[screenshots: the error messages]
These errors all occur when trying to run the same function.
After getting the message, the workspace reloads and is empty again. Do you have any idea what could be happening here?

Many thanks!
Alex

I have never seen the errors from your first post (connection timed out) before. The IDE is connected through an iframe, and we have had cases where a particular browser version, a browser plugin, or an intervening firewall has blocked the connection. Are you able to provide your browser version and whether or not you are using any plugins?

The second set of messages is a bit more typical. They tend to indicate that the R session is either busy or has crashed. My guess is that the process you are running either has an infinite loop or is exceeding the memory allocated to the container on rstudio.cloud (which is 1 GB).
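If it helps, here's a rough way to get a sense of memory use from inside the session using only base R; this is just a sketch, and the example object is a placeholder:

```r
# Report memory currently used by the R session
gc()

# Check the in-memory size of a single object (placeholder data)
x <- matrix(rnorm(1e6), ncol = 100)
format(object.size(x), units = "MB")
```

If gc() shows usage creeping toward 1 GB while your function runs, the container limit is the likely culprit.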

Sean


Hi Sean,

Thanks for your answer. I'm using the newest version of Chrome without any plugins:
[screenshot: Chrome version]

Maybe it's my company's firewall, but it sometimes doesn't work from home either.

Regarding the second point:

My loop has at most 6 chunks, but I also use doParallel. Maybe that's the problem and it leads to high memory use. Otherwise, the data I handle is not even 6 MB and the function doesn't seem to be that big.

Alex

doParallel could definitely be part of the problem. I believe that package (and a couple of others) doesn't properly identify the number of cores available to the container on rstudio.cloud, which historically has caused a number of unusual errors.
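If you still want some parallelism, one workaround sketch (assuming you keep doParallel and foreach) is to register a small, explicit number of workers instead of letting the package detect the core count:

```r
library(doParallel)

# detectCores() can over-report inside a cloud container, so pin the
# worker count explicitly; 2 is an arbitrary, conservative choice here
cl <- makeCluster(2)
registerDoParallel(cl)

# toy loop standing in for the real function
res <- foreach(i = 1:6, .combine = c) %dopar% sqrt(i)

stopCluster(cl)
```

Keeping the worker count low also keeps the combined memory of the worker processes within the 1 GB container limit.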

Ahh OK, I'll try it without doParallel then. Thank you very much!

Is there anything else I can do besides the Trash -> Restore workaround if projects are not loading?

Until we get to the bottom of the issue(s), creating a post here (with a project URL if you've got it, and/or a screenshot if you don't, along with any errors you're seeing) is the best thing to do. Sorry for the trouble with this -- we know it's frustrating -- and thanks for your patience while we work to resolve it.

Mel


This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.