tumblr-crawler-cli

Please give me your valuable comments!

Usage

usage: tumblr-crawler.py [-h] [-p] [-v] [-d SAVE_DIR] [-x PROXY]
                         [-n THREAD_NUM] [--min MIN_SIZE] [--overwrite]
                         [--interval INTERVAL] [--retries RETRIES]
                         sites [sites ...]

Crawl Tumblr photos and videos.

positional arguments:
  sites                 tumblr sites

optional arguments:
  -h, --help            show this help message and exit
  -p, --photo           whether to download photos
  -v, --video           whether to download videos
  -d SAVE_DIR, --dir SAVE_DIR
                        directory in which to save downloaded files
  -x PROXY, --proxy PROXY
                        HTTP request proxy, supports http/socks
  -n THREAD_NUM, --thread THREAD_NUM
                        number of download threads, default is 5
  --min MIN_SIZE        minimum size of downloaded files, default is 0k
                        (unlimited)
  --overwrite           overwrite the file if it already exists
  --interval INTERVAL   HTTP request interval, default is 0.5 seconds
  --retries RETRIES     HTTP request retries, default is 3
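For example, a typical invocation might look like the following sketch; the blog names and the download directory are placeholders, and the options used are only those listed in the help output above:

    python tumblr-crawler.py -p -v -d ./downloads -n 10 examplesite1 examplesite2

This asks the crawler to download both photos (-p) and videos (-v) from the two blogs into ./downloads using 10 download threads.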