a year ago
Hi, I am importing a lot of data on a daily basis, but the process takes too long. Previously I solved this by uploading a file and then executing COPY FROM pointing at that file.
Is it possible to do something similar or do you know any other solutions?
3 Replies
a year ago
To speed up your data import process, you can use the COPY command in PostgreSQL, which is efficient for bulk data loading. However, Railway does not support direct file uploads for COPY FROM due to security and infrastructure constraints.
Here are a few alternatives you can consider:
Use a Remote File: If your data is accessible via a URL, you can use COPY FROM PROGRAM to fetch the data directly from a remote source.
Stream Data: Use a script to stream data into your PostgreSQL database with a tool like psql and its \copy command, which lets you copy data from a client-side file.
Optimize Your Database: Ensure your database is optimized for bulk operations. This includes disabling indexes and constraints during the import and re-enabling them afterward.
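The streaming option above can be sketched as a one-liner: \copy runs on the client, so no file ever needs to exist on the database server. This is a minimal sketch assuming psql is installed locally and DATABASE_URL holds your Railway connection string; the table and file URL are taken from this thread.

```shell
# URL of the remote CSV (from the thread) and the client-side copy command.
CSV_URL="https://file_url.com/file.csv"
COPY_SQL='\copy link_products FROM STDIN WITH (FORMAT csv, HEADER)'

# Only attempt the import when psql and a connection string are available;
# the CSV is piped straight from curl into psql, so nothing touches the server's disk.
if command -v psql >/dev/null 2>&1 && [ -n "${DATABASE_URL:-}" ]; then
  curl -s "$CSV_URL" | psql "$DATABASE_URL" -c "$COPY_SQL"
fi
```

Because \copy reads from the client's stdin rather than the server's filesystem, this works on managed hosts that don't allow server-side file access.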
For more details on deploying and managing PostgreSQL on Railway, check out our PostgreSQL guide.
If you have any specific constraints or need further assistance, feel free to ask!
Status changed to Awaiting User Response Railway • about 1 year ago
a year ago
Hi Angelo,
I tried using COPY FROM PROGRAM, but it looks like curl is not installed. This is the query I am using:
query = """COPY link_products
FROM PROGRAM '/usr/bin/curl -s "https://file_url.com/file.csv"'
WITH (FORMAT csv, HEADER)"""
I also tried just curl instead of /usr/bin/curl, but that didn't work either.
Regards,
Ben
Status changed to Awaiting Railway Response Railway • 12 months ago
a year ago
Please can you try using psql?
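For reference, the psql route avoids the curl-not-installed problem entirely, because the download and the \copy both happen on the client machine. A hedged sketch, assuming psql is available locally and DATABASE_URL holds the connection string; the table name and file URL come from this thread, and the local path is a hypothetical choice:

```shell
# Download the CSV to the client first, then load it with a client-side \copy.
CSV_URL="https://file_url.com/file.csv"
LOCAL_CSV="/tmp/file.csv"
COPY_SQL="\copy link_products FROM '$LOCAL_CSV' WITH (FORMAT csv, HEADER)"

# Guarded so it only runs when psql and a connection string are present.
if command -v psql >/dev/null 2>&1 && [ -n "${DATABASE_URL:-}" ]; then
  curl -s "$CSV_URL" -o "$LOCAL_CSV"
  psql "$DATABASE_URL" -c "$COPY_SQL"
fi
```

Unlike server-side COPY FROM PROGRAM, nothing here needs to be installed on the database host.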
Status changed to Awaiting User Response Railway • 12 months ago
6 months ago
This thread has been marked as solved automatically due to a lack of recent activity. Please re-open this thread or create a new one if you require further assistance. Thank you!
Status changed to Solved Railway • 6 months ago