BangumiCrawler

An asynchronous Python crawler for Bangumi (番组计划 / 班固米).

Description

This project is one of the author's final course projects. It is not kept up to date and is provided for reference only.

Usage

Step 1

pip install -r requirements.txt

Step 2

python crawler.py

The two CSV datasets will now be saved in the data folder.
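
The README does not name the generated files, so the snippet below (an illustrative check, not part of the project) simply globs the data folder with the standard library to confirm what was written:

import csv
from pathlib import Path

# List every CSV the crawler wrote and show its row count and header.
for csv_path in sorted(Path("data").glob("*.csv")):
    with open(csv_path, newline="", encoding="utf-8") as f:
        rows = list(csv.reader(f))
    header = rows[0] if rows else []
    print(f"{csv_path.name}: {max(len(rows) - 1, 0)} rows, columns: {header}")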

Modules

  • crawler.py: Main crawler script.
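
The script's internals are not documented here. As a rough illustration only, the sketch below shows the kind of aiohttp-based fetch-and-save pattern an async Bangumi crawler typically follows; the endpoint, field names, and output filename are assumptions, not the project's actual code.

import asyncio
import csv
import os

import aiohttp

# Illustrative sketch only -- the real crawler.py may use different endpoints,
# fields, and filenames. api.bgm.tv is Bangumi's public API host.
SUBJECT_URL = "https://api.bgm.tv/v0/subjects/{}"   # assumed endpoint
HEADERS = {"User-Agent": "BangumiCrawler-example/0.1"}

async def fetch_subject(session, subject_id):
    # Fetch one subject's JSON; skip anything that does not return 200.
    async with session.get(SUBJECT_URL.format(subject_id), headers=HEADERS) as resp:
        if resp.status != 200:
            return None
        return await resp.json()

async def crawl(subject_ids):
    # Issue all requests concurrently over a single session.
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(*(fetch_subject(session, i) for i in subject_ids))
    return [r for r in results if r]

def save_csv(rows, path):
    # Keep only a few common fields for the example.
    fields = ["id", "name", "date"]
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        writer.writeheader()
        for row in rows:
            writer.writerow({k: row.get(k, "") for k in fields})

if __name__ == "__main__":
    data = asyncio.run(crawl(range(1, 11)))
    save_csv(data, "data/example_subjects.csv")

A production crawler would also add rate limiting and retries, which this sketch omits.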

License

MIT License
