python urllib2.urlopen.read() time out?

Using Python 2.7 and urllib2, I wrote a script that fetches a web page. Roughly, it looks like this:

...
try:
    temp_url=urllib2.urlopen(url,timeout=2)
except:
    return
res=temp_url.read()
...

In a simulated network environment with 800 ms to 1200 ms latency and 30% packet loss, the script ran continuously for most of a day, and then read() raised a timeout exception:

...
  File "../scripts/_readurl.py", line 50, in text2mp3
    voice_data=temp_url.read()
  File "/usr/lib/python2.7/socket.py", line 355, in read
    data = self._sock.recv(rbufsize)
  File "/usr/lib/python2.7/httplib.py", line 612, in read
    s = self.fp.read(amt)
  File "/usr/lib/python2.7/socket.py", line 384, in read
    data = self._sock.recv(left)
timeout: timed out

I know urlopen() takes a timeout, but how can read() time out as well?
I always thought urlopen downloads the page and read() just retrieves it afterwards. Is that not the case?

1 Answer

Could it be changed to this?
...
try:
    temp_url=urllib2.urlopen(url,timeout=2)
    res=temp_url.read()
except:
    return
...
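
For context: urlopen() only completes the request and reads the response headers; the body is still streamed from the socket when read() is called, so read() is subject to the same socket-level timeout and, as the traceback shows, raises socket.timeout directly rather than urllib2.URLError. A minimal sketch along those lines (fetch_page is a hypothetical helper name), catching the two exceptions explicitly instead of using a bare except:

import socket
import urllib2

def fetch_page(url, timeout=2):
    # Hypothetical helper: returns the page body, or None if the
    # connection, the request, or the body read times out.
    try:
        temp_url = urllib2.urlopen(url, timeout=timeout)
        return temp_url.read()   # read() pulls the body over the socket
    except (urllib2.URLError, socket.timeout):
        return None              # request failed or the read timed out

Catching socket.timeout explicitly matters here because the timeout raised inside read() is not wrapped in URLError.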
