Yes, I use urlopen to crawl the site, like this:
   from urllib.request import urlopen
   from bs4 import BeautifulSoup
   html = urlopen(page)
   bs0bj = BeautifulSoup(html, "html.parser")

The error happens in /usr/lib/python3.5/urllib/request.py, in do_open.

When I open that site, its loading time is longer than the others'. The other 
sites I crawl don't raise error 110, so I think this site's loading time is the 
source of the error; maybe the loading time exceeds the timeout deadline.

I want to lengthen the timeout deadline, something like timeout = None...
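As far as I can tell, urlopen itself accepts a timeout argument, so maybe 
something like the sketch below (assuming page is the URL string I already 
crawl; the 120 seconds is only an example value, not a recommendation):

   from urllib.request import urlopen
   from bs4 import BeautifulSoup

   # Give the slow site more time before urlopen gives up.
   # Note: timeout=None only removes the Python-level timeout; errno 110
   # comes from the OS, which may still time out the connection on its own.
   html = urlopen(page, timeout=120)
   bs0bj = BeautifulSoup(html, "html.parser")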


On Tuesday, January 9, 2018 at 1:49:56 AM UTC+9, suas...@gmail.com wrote:
>
>
> I use PythonAnywhere, and "urlopen error [Errno 110] Connection timed out" 
> occurs. So I set CACHES TIMEOUT to None in Django settings.py. However, 
> error 110 still occurs. Do I need to change the timeout value in 
> urllib/request.py? I would really appreciate your help in this matter.
>
