-
@shnoulle There _should_ already be a max time limit for remote calls in the HTTP library. Maybe it is disabled, or some setting is misinterpreted or misconfigured.
Thanks for looking stuff up; I'll be heading back home soon to look into it.
-
@shnoulle Well, there is a configurable timeout option (for the whole request, as opposed to the connection timeout, i.e. establishing the connection), but it doesn't seem to be enabled by default in the HTTP library: 'timeout' => 0,
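For reference, a sketch of where that option lives when constructing a request (assuming the PEAR HTTP_Request2 API; the URL and the 30-second value are illustrative, not what the library or GNU social ships):

```php
// Sketch: enabling a total timeout in HTTP_Request2 (values illustrative).
// 'connect_timeout' only limits establishing the connection;
// 'timeout' limits the entire request, and 0 (the default) disables it.
$request = new HTTP_Request2(
    'https://example.com/resource',   // illustrative URL
    HTTP_Request2::METHOD_GET,
    array(
        'connect_timeout' => 10,  // seconds to establish the connection
        'timeout'         => 30,  // seconds for the whole request
    )
);
```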
I know I've been fiddling with this at some point, knowing full well that there needs to be a timeout for the whole connection (a famous, now mostly mitigated, httpd DoS attack (Slowloris) is to just reeeeaaaalllyyy sloooowwwlllyyyy transfer a simple HTTP request byte by byte, which can easily cause a zillion open connections).
I'll have a look if there's something set somewhere that I've missed so far. (I enjoy having !GNUsocial debugging sessions in the !fediverse btw .])
-
...to follow up on my comment above: the fallback was always default_socket_timeout, which defaults to 60s in PHP. I've made this clearer for configuration anyway, in a not-yet-pushed commit.
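That fallback is a php.ini setting; a config fragment showing PHP's shipped default:

```ini
; php.ini: fallback cap on socket operations, used when no explicit
; stream/request timeout is set. PHP's shipped default is 60 seconds.
default_socket_timeout = 60
```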
-
@shnoulle HTTP_Request2 behaves as it should. Seems like PHP has a bug. The only lib I haven't checked is Net_Socket.
-
@shnoulle 60 is the original default. Nothing that remote users can trigger on a webserver (i.e. linking stuff that gets looked up by your !GNUsocial server) should ever take more than 60 seconds to complete. If it were up to me I'd set it to 30 seconds or something.
oEmbed etc., which in combination with StoreRemoteMedia performs remote downloads, has an upper limit anyway on how large a file to download (so 30 seconds should be reasonable unless your server is on a really slow connection).
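If you did want that stricter 30-second cap globally, a minimal sketch (same php.ini setting as above; the value is the suggestion from this thread, not a shipped default):

```ini
; Lowering the global fallback from PHP's default of 60 seconds
; to the 30 seconds suggested above:
default_socket_timeout = 30
```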