fix deprecated calls to scrapy.utils.request.request_fingerprint (#50)
base: master
@@ -5,7 +5,7 @@
 from scrapy.http import Request
 from scrapy.item import Item
-from scrapy.utils.request import request_fingerprint
+from scrapy.utils.request import fingerprint
Gallaecio: On Scrapy 2.7+, the right approach is using crawler.request_fingerprinter.

Reply: @Gallaecio wouldn't using this require storing an instance of (and instantiating) a crawler? From the messages in Scrapy 2.11.2 (https://github.com/scrapy/scrapy/blob/e8cb5a03b382b98f2c8945355076390f708b918d/scrapy/utils/request.py#L86-L136) it seems to suggest getting the crawler during instantiation with from_crawler, but what if the crawler is not available?

Reply: Most Scrapy components are instantiated with the from_crawler class method, so a crawler is available there. Instantiation without one does happen in tests, which use the middleware directly.
 from scrapy.utils.project import data_path
 from scrapy.utils.python import to_bytes
 from scrapy.exceptions import NotConfigured
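To make the approach discussed in the thread above concrete, here is a minimal sketch of a crawler-based version of the middleware, assuming Scrapy 2.7+. `crawler.request_fingerprinter` and `scrapy.utils.request.RequestFingerprinter` are the Scrapy 2.7+ API; the trimmed-down `DeltaFetch` constructor and the settings handling are only illustrative and do not reproduce this project's real implementation:

```python
from scrapy.utils.python import to_bytes
from scrapy.utils.request import RequestFingerprinter  # default fingerprinter class, Scrapy 2.7+


class DeltaFetch:
    """Trimmed-down sketch: key requests with the crawler's request fingerprinter."""

    def __init__(self, dir, reset=False, fingerprinter=None):
        self.dir = dir
        self.reset = reset
        # Fallback for code paths that build the middleware without a crawler
        # (e.g. tests that instantiate it directly); defaulting to
        # RequestFingerprinter() here is an assumption, not part of this PR.
        self.fingerprinter = fingerprinter or RequestFingerprinter()

    @classmethod
    def from_crawler(cls, crawler):
        settings = crawler.settings
        return cls(
            settings.get("DELTAFETCH_DIR", "deltafetch"),
            settings.getbool("DELTAFETCH_RESET"),
            # Scrapy 2.7+: reuse the fingerprinter configured on the crawler
            # instead of calling a deprecated module-level helper.
            fingerprinter=crawler.request_fingerprinter,
        )

    def _get_key(self, request):
        # The fingerprinter returns bytes, so to_bytes() only has to convert
        # an explicit string deltafetch_key taken from request.meta.
        key = request.meta.get("deltafetch_key") or self.fingerprinter.fingerprint(request)
        return to_bytes(key)
```

This is the cost raised in the reply above: the middleware has to hold a fingerprinter instance, and anything that bypasses from_crawler, tests included, needs its own fallback.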
@@ -79,7 +79,7 @@ def process_spider_output(self, response, result, spider):
             yield r

     def _get_key(self, request):
-        key = request.meta.get('deltafetch_key') or request_fingerprint(request)
+        key = request.meta.get('deltafetch_key') or fingerprint(request)
         return to_bytes(key)

     def _is_enabled_for_request(self, request):
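A side effect of this swap that the diff does not call out, based on my reading of the Scrapy API rather than anything stated in this PR: the deprecated `request_fingerprint()` returns a hexadecimal string and uses the pre-2.7 fingerprinting algorithm, while `fingerprint()` returns the raw SHA1 digest as bytes and uses the 2.7 algorithm, so the keys written to the DeltaFetch database change in both type and value even though `to_bytes()` accepts both. A quick comparison, on a Scrapy version where the deprecated helper is still importable:

```python
from scrapy.http import Request
from scrapy.utils.request import fingerprint, request_fingerprint

request = Request("https://example.com")

old_key = request_fingerprint(request)  # deprecated: hexadecimal str (emits a deprecation warning)
new_key = fingerprint(request)          # Scrapy 2.7+: raw SHA1 digest as bytes

assert isinstance(old_key, str)
assert isinstance(new_key, bytes)

# If the two helpers really do hash requests differently (2.6 vs 2.7
# fingerprint implementations), keys stored by the old code will not
# match keys computed by the new code for the same request.
print(old_key)
print(new_key.hex())
```

If that reading is right, databases populated before this change stop matching afterwards, which may be worth mentioning in the changelog.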
Comment: Python 3.5 and 3.6 don't seem to work anymore? At least in my fork.