GPU Support

2 years ago

With our new runtime, it's now possible to support GPUs.

This thread is to gauge interest. Please upvote it and we will keep you updated.

31 Replies

2 years ago

GPUs would go so hard 🔥🔥🔥


2 years ago

Very much possible

https://x.com/justjake/status/1772005308052943005?s=46&t=00dq48d1c_GjD9m8fXEJ8w

However, GPUs have a liquidity problem (they're expensive for us to just keep idle).

So, if you're interested, please upvote. Once we hit enough demand, we can look into rolling this out broadly.


2 years ago

We need this!


imw
PRO

2 years ago

+1!


xander
PRO

2 years ago

would be able to try this out


admin-replies
PRO

2 years ago

+1


ve-jo
HOBBY

2 years ago

+1


2 years ago

+1


sts417
TRIAL

2 years ago

+1


tony-hunter
HOBBY

a year ago

+1


brentably
PRO

a year ago

+1


a year ago

+1


marcellov7
PRO

a year ago

+1


a year ago

Yesss +1


dzusooo
PRO

a year ago

+1


inngeniero
PRO

a year ago

+1


marcellov7
PRO

a year ago

Is there any news on this?

Using Ollama on Railway would be a perfect combo!


viktorlarsson
PRO

a year ago

I would really need this


thiagoscodelerae
PRO

a year ago

+1


paddasecurity
PRO

a year ago

+1


hypnobrew
PRO

10 months ago

+1000


iliefski
PRO

10 months ago

Hoping to move some GPU-heavy services to Railway ASAP.
+1


joshuanitschke
PRO

8 months ago

+1


jonahseguin
PRO

8 months ago

+1 need GPUs


scbeacham
HOBBY

8 months ago

I currently use Railway (just on the Hobby plan at the moment, but I'd definitely upgrade), and I'm now building AI image generation applications that need a GPU. I'd definitely prefer to stick with Railway because, of all the deployment platforms I've tried, you guys are the most streamlined and easy to use.

Do we have any update on GPU services becoming available on Railway? Is there enough interest/resources to make it worth it yet for you to "flip that switch"? (I'm aware the implementation is significantly harder than that; you guys just make everything look so easy. Thank you for that, by the way!)


geo-ops
PRO

4 months ago

+1


davidcoleman007
FREE

2 months ago

I would also use this, and I understand the liquidity issues. I'm looking to do high-fidelity image upscaling. I wonder if there's a secure shared model that could let us avoid needing per-user dedicated instances? That would allow us to scale as needed without incurring massive idle costs for you.


59023g
PRO

a month ago

Any updates here?


jailsonpaca
HOBBY

a month ago

+1


ptav
HOBBY

13 days ago

We're building an AI tool that uses a combination of local and public LLMs. The CPU-only workflow is very slow for some use cases, so we'll need to purchase GPU capability in the next 2-4 months. We'd greatly prefer to stay within Railway if possible.


douglas125
PRO

a day ago

I would have use cases for GPUs as well, though my workload wouldn't keep one busy for long.

