When crawling, all domains appear to be DOWN #490
Comments
Were you able to fix this, @sunil3590?
@Terrtia
Hello, I have the same issue. Is there any update? Thanks.
Maybe I found the error in the screen logs (screen -r Crawlers_AIL):
@TheFausap @Terrtia did you find the fix for this? I also can't crawl any onion domain since they all appear to be down.
ISSUE
I tried to crawl a regular domain (not .onion) and the status of the domain comes up as DOWN. I've tried this with multiple domains, and even .onion domains, but the result is the same: all domains are DOWN.
SETUP
I have AIL, Tor, and Splash all installed and running on a single machine, with one Docker instance of Splash listening on port 8050 and Tor on port 9050.
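To narrow down where the failure is, here is a minimal diagnostic sketch (not part of AIL) that calls the Splash HTTP API directly, first without a proxy and then through Tor. The Splash endpoint, the Tor proxy address (172.17.0.1 is the usual Docker bridge gateway on Linux, but it may differ on your machine), and the test URL are assumptions for the setup described above; adjust them as needed.

```python
# Diagnostic sketch, not part of AIL: render a page via Splash with and
# without the Tor proxy to see which hop fails.
import requests

SPLASH_URL = "http://localhost:8050/render.html"  # assumed Splash endpoint
TOR_PROXY = "socks5://172.17.0.1:9050"            # assumed address of the host's Tor as seen from the Splash container

def render(url, proxy=None):
    params = {"url": url, "timeout": 30, "wait": 2}
    if proxy:
        # Splash's render endpoints accept a proxy URL (http:// or socks5://)
        params["proxy"] = proxy
    resp = requests.get(SPLASH_URL, params=params, timeout=60)
    print(f"{url} via {proxy or 'no proxy'} -> HTTP {resp.status_code}")
    return resp

# 1) Clearnet page, no proxy: checks that Splash itself can render pages.
render("http://example.com")

# 2) Same page through Tor: checks that the Splash container can reach the
#    Tor SOCKS port. If this fails while (1) works, Splash cannot see Tor
#    (e.g. "localhost:9050" inside the container is not the host's Tor).
render("http://example.com", proxy=TOR_PROXY)
```

If both calls return 200 but AIL still marks every domain as DOWN, the problem is more likely on the AIL side (crawler configuration or how the Splash response is interpreted) than in Splash or Tor itself.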
Logs from Splash Docker
The line of code in Splash that generates the error message above:
https://github.com/scrapinghub/splash/blob/9fda128b8485dd5f67eb103cd30df8f325a90bb0/splash/engines/webkit/browser_tab.py#L446