Set each timeout individually and use a session-timeout of two days
(we want to avoid crawls that hang forever, but we don't want to prevent the downloading of large files from very slow hosts).
parent 016a166f14
commit 659b25481e
@@ -1 +1 @@
-__version__ = '1.1.0'
+__version__ = '1.1.1'
@@ -192,7 +192,10 @@ which_wpull_command):
 "--no-check-certificate",
 "--no-robots",
 "--inet4-only",
-"--timeout", "20",
+"--dns-timeout", "20",
+"--connect-timeout", "20",
+"--read-timeout", "900",
+"--session-timeout", str(86400 * 2),
 "--tries", "3",
 "--waitretry", "5",
 "--max-redirect", "8",
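To illustrate the change, here is a minimal sketch of the per-phase timeout arguments this commit adds to the wpull command line. The helper function name is an assumption for illustration; the flags and values themselves come straight from the diff (`--session-timeout` of `86400 * 2` seconds, i.e. two days).

```python
def wpull_timeout_args():
    """Return the per-phase timeout flags introduced by this commit.

    The single blanket "--timeout", "20" is replaced by one flag per
    phase, so a slow-but-alive transfer is not killed early while a
    truly stuck crawl is still bounded by the session timeout.
    """
    return [
        "--dns-timeout", "20",                # DNS resolution
        "--connect-timeout", "20",            # TCP connect
        "--read-timeout", "900",              # stall between reads; generous for slow hosts
        "--session-timeout", str(86400 * 2),  # hard cap on a whole transfer: two days
    ]

print(wpull_timeout_args())
```

The design point is that `--read-timeout` only fires when no data arrives for 900 seconds, while `--session-timeout` bounds total transfer time, which together match the commit message's goal.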