Can't insert more than 657 records into a table #45
Comments
Yes, I have inserted hundreds of thousands of records. You might just try batching them using the technique I wrote about here: speeding-up-bulk-loading-in-postgresql. Basically, use:
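As a rough sketch of that batching idea (the multi-row INSERT form; the (id, name) column list here is hypothetical):

```sql
-- Batch many rows into a single INSERT instead of issuing one statement per row.
-- The (id, name) column list is hypothetical; match it to the real employee table.
INSERT INTO employee (id, name) VALUES
  (1, 'employee-1'),
  (2, 'employee-2'),
  (3, 'employee-3');
-- ...continue with a few hundred rows per statement.
```

Packing many rows into each statement cuts per-statement parsing and execution overhead, which is presumably why the bulk approach worked where the 1,000 separate INSERTs did not.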
@burggraf Thank you for your comment. I have successfully inserted the records with a bulk insert. But is there any way that doesn't require changing the SQL itself?
There's no reason why large SQL inserts would not work other than resource limitations in your browser. We're running a 32-bit Linux VM in the browser and then loading PostgreSQL inside of that. You might try a larger memory setting for Postgres WASM and see if that helps a bit.
I have changed the memory setting from 128MB to 1024MB, but the problem wasn't solved. I tried 128MB, 256MB, 512MB, and 1024MB.
I have set […]. When the error occurred, the following messages were written to […].
What I want to do
I want to insert more than 1000 records into a table to see the performance of Postgres WASM.
What I did
I have a file to load data to a table like https://gist.github.com/satob/421f19ed438a9abe56b7139022df44d2 . I have uploaded the file with [Transfer Files] and run
\i /mnt/test.sql
on the psql console.
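The linked gist takes the one-statement-per-row form; an illustrative, shortened sketch (column names and values are placeholders, not the gist's actual contents):

```sql
-- Illustrative shape of /mnt/test.sql: one INSERT per row, repeated about 1,000 times.
-- Placeholder columns and values; see the linked gist for the real file.
INSERT INTO employee (id, name) VALUES (1, 'employee-1');
INSERT INTO employee (id, name) VALUES (2, 'employee-2');
INSERT INTO employee (id, name) VALUES (3, 'employee-3');
-- ... and so on up to row 1,000.
```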
What I expected
All INSERT INTO statements are executed and the employee table has 1,000 records.
What I got
psql returns an error with the following message:
[…]
The employee table only has 657 records. After the error, I couldn't insert any records into the table.
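A quick way to confirm how far the load got, presumably how the 657 figure was observed:

```sql
-- Count the rows that made it in before the error; the issue reports 657 here.
SELECT count(*) FROM employee;
```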
Question
How can I insert more than 1000 records into a table on Postgres WASM?