Spawning Multiple Processes in Node
jiito
PRO · OP

7 months ago

Context

We are working on a data pipeline that downloads videos and converts them using ffmpeg.

We are hoping to do this in parallel by spawning multiple processes (in node.js).

We attempted to process around 12 videos at the same time and ran into EAGAIN errors from the spawned processes. My understanding is that these are due to resource constraints, but based on our project's Metrics page, we didn't seem to be close to any limit (image below).

I'm not too familiar with spawning multiple processes in Node.js, so any help is greatly appreciated.

Solved

3 Replies

brody
EMPLOYEE

7 months ago

Hello,

It is possible you are running into PID limits.

Can you integrate some debug logging during heavy load that will print the contents of the /sys/fs/cgroup/pids.current file?

Once you have this data, we can go from there.

Best,
Brody


Status changed to Awaiting User Response Railway 7 months ago


jiito
PRO · OP

7 months ago

It does appear that we are running into process limits.

Here is some output from the file while the heavy load was occurring.

```
root@:/app# cat /sys/fs/cgroup/pids.current
24
...
root@:/app# cat /sys/fs/cgroup/pids.max
1000
...
root@:/app# cat /sys/fs/cgroup/pids.current
157
root@:/app# cat /sys/fs/cgroup/pids.current
157
root@:/app# cat /sys/fs/cgroup/pids.current
390
root@:/app# cat /sys/fs/cgroup/pids.current
390
root@:/app# cat /sys/fs/cgroup/pids.current
bash: fork: retry: Resource temporarily unavailable
bash: fork: retry: Resource temporarily unavailable
bash: fork: retry: Resource temporarily unavailable
bash: fork: retry: Resource temporarily unavailable
^Cbash: fork: Interrupted system call
root@:/app# cat /sys/fs/cgroup/pids.current
bash: fork: retry: Resource temporarily unavailable
bash: fork: retry: Resource temporarily unavailable
bash: fork: retry: Resource temporarily unavailable
895
root@:/app# cat /sys/fs/cgroup/pids.current
794
```

How would you suggest we remedy this? Is it possible to raise pids.max? Or should we limit concurrency to stay under this limit?


Status changed to Awaiting Railway Response Railway 7 months ago


brody
EMPLOYEE

7 months ago

Hello,

You would want to limit concurrency to stay within a safe margin of the 1000 PID max, as we can only increase the max on the Enterprise plan at this time.
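A minimal concurrency cap in plain Node might look like this (a sketch -- `mapWithLimit` is an illustrative helper; libraries such as p-limit provide the same idea):

```javascript
// Run `worker` over `items` with at most `limit` tasks in flight at once,
// so only `limit` ffmpeg processes exist at any moment.
async function mapWithLimit(items, limit, worker) {
  const results = new Array(items.length);
  let next = 0;
  // Each "lane" pulls the next item when its current task finishes.
  async function lane() {
    while (next < items.length) {
      const i = next++; // synchronous increment, so no two lanes share an index
      results[i] = await worker(items[i]);
    }
  }
  const lanes = Array.from({ length: Math.min(limit, items.length) }, lane);
  await Promise.all(lanes);
  return results;
}

// Usage (illustrative): at most 4 conversions at a time.
// await mapWithLimit(videos, 4, (v) => convertWithFfmpeg(v));
```

Keeping the cap well below the point where total PIDs (Node plus the ffmpeg children and their threads) approach 1000 leaves headroom for the rest of the container.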

Best,
Brody


Status changed to Awaiting User Response Railway 7 months ago


Railway
BOT

6 months ago

This thread has been marked as solved automatically due to a lack of recent activity. Please re-open this thread or create a new one if you require further assistance. Thank you!

Status changed to Solved Railway 6 months ago

