[Bug] Out of memory when extracting large CSV from DCAS #389

Closed
danangmassandy opened this issue Jan 24, 2025 · 0 comments · Fixed by #395
Assignees
Labels
bug Something isn't working

Comments

danangmassandy (Collaborator) commented Jan 24, 2025

While testing #380, I ran the pipeline and found that DuckDB fails to extract the large CSV file and runs out of memory.

As an alternative, we could split the output into smaller CSV files and ask Brian whether receiving multiple CSV files is acceptable. Once split, we could also merge them back into one large CSV file before sending it to the SFTP server.
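A minimal sketch of the split-and-merge idea, assuming a Python pipeline, a DuckDB database file, and placeholder names (`dcas.duckdb`, `dcas_output`, `farm_id`, the chunk size) that would need to be replaced with the actual DCAS extraction values:

```python
"""
Sketch: split a large DuckDB extract into smaller CSV parts, then
optionally concatenate them into a single file before the SFTP upload.
Table/file names and the chunk size below are assumptions for
illustration, not the actual pipeline values.
"""
import duckdb

DB_PATH = "dcas.duckdb"        # hypothetical DuckDB database file
TABLE = "dcas_output"          # hypothetical table being extracted
CHUNK_ROWS = 1_000_000         # rows per CSV part; tune to available memory


def split_to_csv_parts(con: duckdb.DuckDBPyConnection) -> list[str]:
    """Export the table in fixed-size row chunks, one CSV file per chunk."""
    total_rows = con.execute(f"SELECT COUNT(*) FROM {TABLE}").fetchone()[0]
    part_files = []
    offset = 0
    part = 0
    while offset < total_rows:
        path = f"dcas_output_part_{part:04d}.csv"
        # COPY ... TO streams each chunk to disk instead of materialising
        # the whole result set in memory.
        con.execute(
            f"""
            COPY (
                SELECT * FROM {TABLE}
                ORDER BY farm_id          -- hypothetical stable ordering column
                LIMIT {CHUNK_ROWS} OFFSET {offset}
            ) TO '{path}' (HEADER, DELIMITER ',')
            """
        )
        part_files.append(path)
        offset += CHUNK_ROWS
        part += 1
    return part_files


def merge_parts(part_files: list[str], merged_path: str) -> None:
    """Concatenate the parts into one CSV, keeping only the first header."""
    with open(merged_path, "wb") as out:
        for i, path in enumerate(part_files):
            with open(path, "rb") as src:
                if i > 0:
                    src.readline()  # skip the duplicate header line
                for line in src:
                    out.write(line)


if __name__ == "__main__":
    con = duckdb.connect(DB_PATH)
    parts = split_to_csv_parts(con)
    # Either upload the parts individually, or merge before the SFTP push:
    merge_parts(parts, "dcas_output_merged.csv")
```

If Brian is fine with receiving multiple files, the `merge_parts` step can be skipped and the parts uploaded as-is, which also avoids rebuilding the single large file on disk.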
