Insert thousands or millions of rows in bulk #343
I found this optimization with prepared statements:
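Something along these lines, as a minimal sketch (the `items(id, name)` schema and the `items` collection of `[id, name]` pairs are assumptions):

```ruby
require "sqlite3"

db = SQLite3::Database.new("data.db")
stmt = db.prepare("INSERT INTO items (id, name) VALUES (?, ?)")

db.transaction do
  # The statement is compiled once and only re-bound per row.
  items.each { |id, name| stmt.execute(id, name) }
end

stmt.close
```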
I can reach 250,000 inserts/sec on Ruby 3.0 / macOS Monterey, 2.2 GHz i7. I was hoping for something more, given the large volume of data I would like to process this way. Is there any better solution?
I don't think there's a better solution using this gem. I'll leave this open in case any of the more seasoned maintainers want to weigh in with advice.
- **Expand the statement to insert multiple rows at once.** This ensures you are not going back and forth a lot between SQLite and Ruby. Prepare two statements: the multi-row one above, and one with just a single-row insert to capture any leftovers (as shown in the sketch after this list).
- **Make sure your transactions fit in the page cache.** Don't just insert all the records in a single transaction; pick a decent number of inserts per transaction (it should be thousands) and make sure your page cache can fit all those rows.
- **Use WAL mode with `synchronous = 1`,** if you are not doing so already.
- **Try to insert ids in order.** If that is possible with your data set, make the ids ordered before feeding the records to SQLite.
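Putting those four points together, here is a hedged sketch, not anyone's actual code from this thread; the `items(id, name)` schema, the `data.db` path, both batch sizes, and the `items` enumerable of `[id, name]` pairs are all assumptions to tune:

```ruby
require "sqlite3"

ROWS_PER_STMT = 400     # 400 rows x 2 columns = 800 bound variables,
                        # safely under SQLite's historical 999-variable limit
ROWS_PER_TXN  = 10_000  # thousands of inserts per transaction, per the advice above

db = SQLite3::Database.new("data.db")
db.execute("PRAGMA journal_mode = WAL")
db.execute("PRAGMA synchronous = 1")

# One wide statement for full chunks, one single-row statement for leftovers.
multi_sql   = "INSERT INTO items (id, name) VALUES " +
              (["(?, ?)"] * ROWS_PER_STMT).join(", ")
multi_stmt  = db.prepare(multi_sql)
single_stmt = db.prepare("INSERT INTO items (id, name) VALUES (?, ?)")

items.each_slice(ROWS_PER_TXN) do |txn_rows|
  db.transaction do
    txn_rows.each_slice(ROWS_PER_STMT) do |rows|
      if rows.size == ROWS_PER_STMT
        multi_stmt.execute(*rows.flatten)                       # one round trip, many rows
      else
        rows.each { |id, name| single_stmt.execute(id, name) }  # leftover rows
      end
    end
  end
end

multi_stmt.close
single_stmt.close
db.close
```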
What is the fastest way to batch insert a large number of rows in SQLite using this gem?
Currently I use this:
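Presumably a per-row loop of roughly this shape (a minimal sketch; the `items(id, name)` schema is an assumption):

```ruby
items.each do |id, name|
  db.execute("INSERT INTO items (id, name) VALUES (?, ?)", [id, name])
end
```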
The items come from large files and need to be inserted into the SQLite database as fast as possible.
I also looked at `execute_batch`, however I can't find a way to sanitize / interpolate the statements in this case. Is there a method like `db.sanitize(sql, [id, name])`?
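For what it's worth, I don't believe the gem ships a `sanitize` helper; binding values through `?` placeholders is the supported way to avoid manual escaping when each statement runs on its own:

```ruby
# Binding sanitizes the values for this one statement; no string
# interpolation needed (hypothetical items(id, name) schema).
db.execute("INSERT INTO items (id, name) VALUES (?, ?)", [1, "O'Brien"])
```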