How can I do a bulk data load into my MySQL database?

repowoznicsPRO

9 months ago

That's close to 1 million insertions into a table in my database.

2 Replies

9 months ago

the same way you would for any other mysql database


repowoznicsPRO

9 months ago

N/A


repowoznicsPRO

9 months ago

The problem is that I'm using MySQL Workbench to connect to my database hosted on Railway, and I'm using the 'Table Data Import Wizard,' but it's taking too long. Is there a more efficient way to load the data into my table with better performance?



9 months ago

that's how you would want to do it, do you have slow internet?


repowoznicsPRO

9 months ago

I don't think my internet is bad, but it's not very fast either. I have a 50 Mbps connection


9 months ago

what's your upload speed?


repowoznicsPRO

9 months ago

12 Mbps


9 months ago

could take a while, just let it go and I'm sure it will finish


repowoznicsPRO

9 months ago

The problem is that I've made estimates and it will take 3 days to upload 252,000 records, and I have 1 million records, so this isn't an optimal path for me…

I've been researching and testing 'LOAD DATA INFILE,' and it has extraordinary performance, inserting 252,000 records into my table in less than a second. Is there a way to do this with a Railway database?
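
For reference, a minimal sketch of the kind of statement being described here, run from Python with mysql-connector-python against a local server. The file path, table name, and column names are placeholders, and the server's secure_file_priv setting decides which directories it may read from:

# Hedged sketch of a local LOAD DATA INFILE test; all names below are placeholders.
# The server reads the CSV from its own filesystem, so this only works where the
# MySQL process can see the file and secure_file_priv allows that directory.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="root", password="secret", database="mydb"
)
cur = conn.cursor()
cur.execute(
    """
    LOAD DATA INFILE '/var/lib/mysql-files/records.csv'
    INTO TABLE records
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
    (col_a, col_b, col_c)
    """
)
conn.commit()
cur.close()
conn.close()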


repowoznicsPRO

9 months ago

The problem is that I tested it locally, but when I try to do it with the Railway database, I get the following error: ErrorCode: 1290. The MySQL server is running with the --secure-file-priv option so it cannot execute this statement
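
One possible workaround, not confirmed for Railway's managed MySQL in this thread: the LOCAL variant of the statement streams the file from the client machine instead of reading it off the server's disk, so secure-file-priv does not apply to it. It does require local_infile to be enabled on the server and opted into on the client; the host, credentials, and table/column names below are placeholders.

# Hedged sketch of LOAD DATA LOCAL INFILE from a remote client (placeholder names).
# LOCAL sends the CSV over the connection, bypassing secure-file-priv, but the
# server-side local_infile variable must be ON, which a managed instance may or
# may not allow.
import mysql.connector

conn = mysql.connector.connect(
    host="your-railway-host",   # placeholder connection details
    port=3306,
    user="root",
    password="secret",
    database="mydb",
    allow_local_infile=True,    # client-side opt-in required for LOCAL
)
cur = conn.cursor()
cur.execute(
    """
    LOAD DATA LOCAL INFILE 'records.csv'
    INTO TABLE records
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
    (col_a, col_b, col_c)
    """
)
conn.commit()
cur.close()
conn.close()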


9 months ago

the bottleneck here is your internet speed


repowoznicsPRO

9 months ago

Would the performance improve with a better internet connection, so there wouldn't be a need to use 'LOAD DATA INFILE'?


9 months ago

correct, the internet speed is the issue


repowoznicsPRO

9 months ago

Perfect, I'll try it with a faster internet connection and get back with updates. Thank you very much.


9 months ago

no problem!


lucasgarcia98HOBBY

9 months ago

Try uploading the file into the same context as your database in Railway, and use the CLI to exec a command in the terminal; maybe that resolves your problem.


repowoznicsPRO

9 months ago

@Garcia How can I upload my CSV file to where my database is?


9 months ago

Instead of keeping your machine running all day to import the data, you can run a background worker to import it. That would be faster and make things easier for you.


repowoznicsPRO

9 months ago

@Brody Hi, I have some updates. I tried with a 1Gbps internet connection, and it did improve the performance, but it still takes around 30 hours (1 day and 6 hours) to load just 252,000 records, and I can't wait that long since I have 4 files…


repowoznicsPRO

9 months ago

@Yashu Sorry for my ignorance, but I don't know how to do that. Would it be too much trouble to provide a document or something similar to help me create it? I'm not very familiar with the term.


9 months ago

what is the actual size of the SQL file?


9 months ago

Definition
A background worker is a simple, independent thread in the application that runs in the background. Generally, they run periodically to perform some tasks.


9 months ago

You can store/upload your SQL/CSV file to Cloudflare R2 or AWS S3. Then write a program in whichever language you're comfortable with (JS, Ruby, Python, Go).

The program can read the file data from the R2/S3 bucket (you can further split the files and run multiple processes) and then insert the data into the db.
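
As a rough illustration of that approach (the bucket, endpoint, credentials, table, and column names are all placeholders, and this assumes an S3-compatible endpoint such as R2):

# Hedged sketch of a background import worker: pull a CSV from an S3-compatible
# bucket (e.g. Cloudflare R2) and insert rows in batches. All names, credentials,
# and the table schema are placeholders.
import csv
import io

import boto3
import mysql.connector

S3_ENDPOINT = "https://<account-id>.r2.cloudflarestorage.com"  # placeholder
BUCKET, KEY = "my-bucket", "records-part-1.csv"
BATCH_SIZE = 5000

s3 = boto3.client(
    "s3",
    endpoint_url=S3_ENDPOINT,
    aws_access_key_id="...",
    aws_secret_access_key="...",
)
body = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read().decode("utf-8")

conn = mysql.connector.connect(
    host="your-railway-host", user="root", password="secret", database="mydb"
)
cur = conn.cursor()

reader = csv.reader(io.StringIO(body))
next(reader)  # skip the header row (assumes the file has one)

insert_sql = "INSERT INTO records (col_a, col_b, col_c) VALUES (%s, %s, %s)"
batch = []
for row in reader:
    batch.append((row[0], row[1], row[2]))  # adjust to your column layout
    if len(batch) >= BATCH_SIZE:
        # Multi-row inserts with one commit per batch avoid the per-row
        # round trips that make a row-at-a-time import crawl.
        cur.executemany(insert_sql, batch)
        conn.commit()
        batch.clear()

if batch:
    cur.executemany(insert_sql, batch)
    conn.commit()

cur.close()
conn.close()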


repowoznicsPRO

9 months ago

I have 4 CSV files, each approximately 3 MB in size.


9 months ago

why would that take over a day, something is being done wrong


9 months ago

how are you importing them to the database


repowoznicsPRO

9 months ago

Literally, I'm doing it with the 'Table Data Import Wizard,' selecting the CSV file, matching the columns of my file with the columns of my table, and executing it.


9 months ago

please look for alternative ways to import CSV files


sdavid14HOBBY

3 months ago

LOAD DATA INFILE would be much faster than running SQL commands over a MySQL connection, especially if it can be run from the server's perspective, i.e. the MySQL host process has access to the .csv file. It would also let you copy a large zipped version of the SQL file to the server and unzip/load it from there. Is this possible?


3 months ago

It is not possible as we do not provide SSH access to the container.

