5 months ago
Hi, I am importing a lot of data on a daily basis, but the process takes too long. Previously I solved this by uploading a file and then executing COPY FROM pointing at that file.
Is it possible to do something similar, or do you know of any other solutions?
3 Replies
5 months ago
To speed up your data import process, you can use the COPY command in PostgreSQL, which is efficient for bulk data loading. However, Railway does not support direct file uploads for COPY FROM due to security and infrastructure constraints.
Here are a few alternatives you can consider:
1. Use a remote file: If your data is accessible via a URL, you can use COPY FROM PROGRAM to fetch the data directly from a remote source.
2. Stream data: Use a script to stream data into your PostgreSQL database with a tool like psql and its \copy command, which copies data from a client-side file.
3. Optimize your database: Ensure your database is optimized for bulk operations. This includes disabling indexes and constraints during the import and re-enabling them afterward.
For more details on deploying and managing PostgreSQL on Railway, check out our PostgreSQL guide.
If you have any specific constraints or need further assistance, feel free to ask!
Status changed to Awaiting User Response railway[bot] • 5 months ago
5 months ago
Hi Angelo,
I tried using COPY FROM PROGRAM, but it looks like curl is not installed. This is the query I am using:
query = """COPY link_products
FROM PROGRAM '/usr/bin/curl -s "https://file_url.com/file.csv"'
WITH (FORMAT csv, HEADER)"""
I also tried just curl instead of /usr/bin/curl, and nothing worked.
Regards,
Ben
Status changed to Awaiting Railway Response railway[bot] • 5 months ago
4 months ago
Could you please try using psql with its client-side \copy command instead?
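For reference, a \copy invocation for the table from this thread could look like the following; this is only a sketch, and the connection URL and local CSV path are placeholders to adapt, not values from this deployment.

```python
# Build the psql command line that runs \copy for the link_products table.
# The database URL and CSV path are placeholders.
def psql_copy_command(database_url: str, csv_path: str) -> str:
    return (
        f"psql {database_url!r} "
        f"-c \"\\copy link_products FROM '{csv_path}' WITH (FORMAT csv, HEADER)\""
    )

print(psql_copy_command("postgresql://user:pass@host:5432/railway", "file.csv"))
```

Unlike COPY FROM PROGRAM, \copy reads the file on your machine and streams it over the connection, so nothing needs to be installed on the database server.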
Status changed to Awaiting User Response railway[bot] • 4 months ago