panic: Segmentation fault at address 0x325AE8FB8 #13885

Closed

ImBIOS opened this issue Sep 11, 2024 · 4 comments
Labels: crash (An issue that could cause a crash), linux (An issue that occurs on Linux runtime)

Comments

@ImBIOS (Contributor) commented Sep 11, 2024

How can we reproduce the crash?

As of filing this issue, I don't know what caused it; this is the first time it has happened. If it happens again, I may be able to narrow down how to reproduce it.

When this panic happened, I was scraping more than 1000 web pages with puppeteer-cluster for my research.

UPDATE 1: This never happened before. For the last two months I've been scraping fewer than a thousand pages per run; this is the first time I've scraped more than 1000 pages in a single run. Using Bun the whole time, I love ❤️ Bun.
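
For context, the workload looks roughly like this (a simplified sketch, not my actual scraper; the URL list, concurrency level, and `goto` options are placeholders):

```ts
import { Cluster } from "puppeteer-cluster";

// Placeholder list; the real run had more than 1000 URLs.
const urls: string[] = ["https://example.com/page-1" /* , ... */];

const cluster = await Cluster.launch({
  concurrency: Cluster.CONCURRENCY_CONTEXT, // one browser, isolated contexts per task
  maxConcurrency: 8,                        // placeholder value
});

await cluster.task(async ({ page, data: url }) => {
  await page.goto(url, { waitUntil: "domcontentloaded" });
  // ... extract and store data here
});

for (const url of urls) {
  cluster.queue(url);
}

await cluster.idle();
await cluster.close();
```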

Relevant log output

============================================================
Bun v1.1.27 (267afa29) Linux x64
Linux Kernel v6.8.0 | glibc v2.39
CPU: sse42 popcnt avx avx2 avx512
Args: "bun" "index.ts" "-b" "true" "-l" "trace"
Features: jsc Bun.stdin(2) Bun.stdout bunfig dotenv(536) fetch(68) spawn(360) transpiler_cache(15) tsconfig_paths tsconfig(7) WebSocket(46) workers_spawned(535) workers_terminated(526)
Builtins: "bun:main" "node:assert" "node:buffer" "node:child_process" "node:constants" "node:crypto" "node:dns" "node:events" "node:fs" "node:fs/promises" "node:http" "node:https" "node:module" "node:net" "node:os" "node:path" "node:perf_hooks" "node:process" "node:readline" "node:stream" "node:string_decoder" "node:tls" "node:tty" "node:url" "node:util" "node:util/types" "node:zlib" "node:worker_threads" "ws"
Elapsed: 602224ms | User: 162422ms | Sys: 36862ms
RSS: 0.02ZB | Peak: 1.74GB | Commit: 0.02ZB | Faults: 36625

panic: Segmentation fault at address 0x325AE8FB8
oh no: Bun has crashed. This indicates a bug in Bun, not your code.

To send a redacted crash report to Bun's team,
please file a GitHub issue using the link below:

 https://bun.report/1.1.27/la1267afa2G6qig0LuzlxlE+xpRi54g2Cmxkl8C4+x+1CwyptlDo/qlgD4zmlgD6z443Ds7yojFA2Gw7n61lB

Stack Trace (bun.report)

Bun v1.1.27 (267afa2) on linux x86_64 [AutoCommand]

Segmentation fault at address 0x325AE8FB8

@ImBIOS added the crash label Sep 11, 2024
@github-actions bot added the linux label Sep 11, 2024
@Jarred-Sumner (Collaborator) commented:

Does this crash occur if you don't use Worker?

@ImBIOS (Contributor, Author) commented Sep 11, 2024

In my codebase itself I'm not using Bun's Worker (I tried it, but it hit another roadblock, so I switched to puppeteer-cluster). Maybe Worker is used automatically by puppeteer-cluster?

Are you (@Jarred-Sumner) asking me to try running puppeteer without puppeteer-cluster?
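
If so, a single-browser run without puppeteer-cluster would look something like this (sketch only; `urls` is the same placeholder list as above):

```ts
import puppeteer from "puppeteer";

// Placeholder list standing in for the >1000 scraped pages.
const urls: string[] = ["https://example.com/page-1" /* , ... */];

const browser = await puppeteer.launch();
const page = await browser.newPage();

for (const url of urls) {
  await page.goto(url, { waitUntil: "domcontentloaded" });
  // ... extract and store data here
}

await browser.close();
```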

@ImBIOS (Contributor, Author) commented Sep 11, 2024

UPDATE 1: This never happened before. For the last two months I've been scraping fewer than a thousand pages per run; this is the first time I've scraped more than 1000 pages in a single run. Using Bun the whole time, I love ❤️ Bun.

@Jarred-Sumner (Collaborator) commented:

Tracking this in #15964
