How can I do a bulk data load into my MySQL database?

repowoznics
PRO

a year ago

That's close to 1 million insertions into a table in my database.


a year ago

the same way you would for any other mysql database


repowoznics
PRO

a year ago

N/A


repowoznics
PRO

a year ago

The problem is that I'm using MySQL Workbench to connect to my database hosted on Railway, and I'm using the 'Table Data Import Wizard,' but it's taking too long. Is there a more efficient way to load the data into my table with better performance?



a year ago

that's how you would want to do it, do you have slow internet?


repowoznics
PRO

a year ago

I don't think my internet is bad, but it's not very fast either. I have a 50 Mbps connection.


a year ago

whats your upload?


repowoznics
PRO

a year ago

12 Mbps


a year ago

could take a while, just let it go and I'm sure it will finish


repowoznics
PRO

a year ago

The problem is that I've made estimates and it will take 3 days to upload 252,000 records, and I have 1 million records, so this isn't an optimal path for me…

I've been researching and testing 'LOAD DATA INFILE,' and it has extraordinary performance, inserting 252,000 records into my table in less than a second. Is there a way to do this with a Railway database?


repowoznics
PRO

a year ago

The problem is that I tested it locally, but when I try to do it with the Railway database, I get the following error: Error Code: 1290. The MySQL server is running with the --secure-file-priv option so it cannot execute this statement
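(Side note on this error: plain LOAD DATA INFILE reads a file from the *server's* filesystem, which --secure-file-priv blocks on managed hosts. The LOCAL variant streams the file from the client instead, so secure-file-priv does not apply, though local_infile must be enabled on the server and on the client connection. A minimal sketch of building such a statement; the table and column names are placeholders, not from this thread:)

```python
# Sketch: building a LOAD DATA LOCAL INFILE statement for a client-side CSV.
# Table and column names are assumptions -- adjust to your schema.
# LOCAL makes the client stream the file, so --secure-file-priv on the
# server does not block it (local_infile must still be enabled).

def build_load_statement(csv_path: str, table: str, columns: list[str]) -> str:
    cols = ", ".join(columns)
    return (
        f"LOAD DATA LOCAL INFILE '{csv_path}' "
        f"INTO TABLE {table} "
        "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' "
        "LINES TERMINATED BY '\\n' "
        "IGNORE 1 LINES "  # skip the CSV header row
        f"({cols})"
    )

stmt = build_load_statement("records.csv", "my_table", ["id", "name", "value"])
print(stmt)
# To execute it with e.g. mysql-connector-python, connect with
# allow_local_infile=True and run cursor.execute(stmt).
```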


a year ago

the bottleneck here is your internet speed


repowoznics
PRO

a year ago

Would the performance improve with a better internet connection, so there wouldn't be a need to use 'LOAD DATA INFILE'?


a year ago

correct, the internet speed is the issue


repowoznics
PRO

a year ago

Perfect, I'll try it with a faster internet connection and get back with updates. Thank you very much.


a year ago

no problem!


lucasgarcia98
HOBBY

a year ago

Try uploading the file into the same environment as your database on Railway, then use the CLI to exec the command in a terminal; maybe that resolves your problem.


repowoznics
PRO

a year ago

@Garcia How can I upload my CSV file to where my database is?


Instead of keeping your machine running all day to import the data, you can run a background worker to import it. That would be faster and make things easier for you.


repowoznics
PRO

a year ago

@Brody Hi, I have some updates. I tried with a 1Gbps internet connection, and it did improve the performance, but it still takes around 30 hours (1 day and 6 hours) to load just 252,000 records, and I can't wait that long since I have 4 files…


repowoznics
PRO

a year ago

@Yashu Sorry for my ignorance, but I don't know how to do that. Would it be too much trouble to provide a document or something similar to help me create it? I'm not very familiar with the term.


a year ago

what is the actual size of the SQL file


Definition
A background worker is a simple, independent thread in the application that runs in the background. Generally, workers run periodically to perform some tasks.


You can store/upload your SQL/CSV file to Cloudflare R2 or AWS S3. Then write the program in whichever language you're comfortable in (JS, Ruby, Python, Go).

The program can read the file data from the R2/S3 bucket (you can further split the files and run multiple processes) and then insert the data into the database.
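(A rough sketch of the batching step such a worker could use, assuming Python: read CSV rows and group them into chunks for multi-row inserts. The R2/S3 download and the database connection are only shown as comments; table and column names are placeholders.)

```python
import csv
import io

# Sketch of a worker's batching step: read CSV rows and group them into
# chunks so each database round trip inserts many rows at once. The
# S3/R2 download and the DB connection are omitted (see comments below);
# "my_table" and its columns are placeholders.

def chunked(rows, size):
    """Yield lists of at most `size` rows from an iterable."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

# In-memory CSV standing in for the downloaded file:
data = io.StringIO("id,name\n1,a\n2,b\n3,c\n")
reader = csv.reader(data)
next(reader)  # skip header row
batches = list(chunked(reader, 2))

# For each batch, the worker would run one parameterized multi-row insert:
#   cursor.executemany("INSERT INTO my_table (id, name) VALUES (%s, %s)", batch)
#   conn.commit()
```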


repowoznics
PRO

a year ago

I have 4 CSV files, each weighing approximately 3MB


a year ago

why would that take over a day, something is being done wrong


a year ago

how are you importing them to the database


repowoznics
PRO

a year ago

Literally, I'm doing it with the 'Table Data Import Wizard,' selecting the CSV file, matching the columns of my file with the columns of my table, and executing it.
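(This is likely why it's so slow: the wizard issues one INSERT per row, so every row pays a full network round trip to the remote server. A hedged sketch of combining rows into one multi-row statement instead; the table and column names are placeholders:)

```python
# Sketch: why one-row-at-a-time imports crawl over a remote connection.
# Each single-row INSERT pays a full network round trip; a multi-row
# VALUES list amortizes that cost. "my_table" and its columns are
# placeholders, not from the thread.

def multi_row_insert(table, columns, rows):
    placeholders = "(" + ", ".join(["%s"] * len(columns)) + ")"
    values = ", ".join([placeholders] * len(rows))
    sql = f"INSERT INTO {table} ({', '.join(columns)}) VALUES {values}"
    params = [v for row in rows for v in row]  # flattened for the driver
    return sql, params

sql, params = multi_row_insert("my_table", ["id", "name"],
                               [(1, "a"), (2, "b"), (3, "c")])
print(sql)
# INSERT INTO my_table (id, name) VALUES (%s, %s), (%s, %s), (%s, %s)

# Rough round-trip math: at ~100 ms per statement, 252,000 single-row
# inserts is ~7 hours of latency alone; 1,000-row batches cut that to
# ~25 seconds of round trips.
```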


a year ago

please look for alternative ways to import CSV files


sdavid14
HOBBY

5 months ago

load data infile would be much faster than running sql commands over a mysql connection, especially if it can be run from the server's perspective, i.e. the mysql host process has access to the .csv file. it'd also let you copy a large zipped version of the sql file to the server and unzip/load it from there. is this possible?


5 months ago

It is not possible as we do not provide SSH access to the container.