My crawler sends the post-login cookies but still can't access the target page, yet the same request works in Fiddler — please help
Relevant code:
import requests

# Search URL copied verbatim from the captured request (note the effecVal
# timestamp and the pw hash at the end of the query string)
s_url = 'http://www.ylike.com/g/getSearchMemberList.do?area=%u4E0A%u6D77&sex=%u5973&quanzi=0&havephoto=%u662F&PageNo=3&classid=1&city=%u95F8%u5317&age1=18&age2=35&Shengao1=-1&Shengao2=-1&MarryState=&Nianxin1=-1&Nianxin2=-1&havecar=&havevideo=&isonline=&action=SearchMember&ContentMark=SearchMember&effecVal=2018-9-20%2011:23:07&pw=bfadd2d50f20e09c62bcc877ba1341a6'

# Headers copied from the logged-in browser session captured in Fiddler,
# including the full post-login Cookie header
uheaders = {
    'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8',
    'Accept-Encoding': 'gzip, deflate',
    'Accept-Language': 'zh-CN,zh;q=0.9',
    'Connection': 'keep-alive',
    'Cookie': 'UM_distinctid=16565b61c095b7-0297c015021f1e-5701f32-1fa400-16565b61c0a487; save_ylike_login_name=13291413631; CNZZDATA1253398771=957043837-1537245476-http%253A%252F%252Fwww.ylike.com%252F%7C1537245476; Hm_lvt_05b65e4f73cb381dad867e4b7a93af1e=1537250600; ASPSESSIONIDCSDABRSC=IKFNOPFBGMBHJBFKDGDGBODH; CNZZDATA2294316=cnzz_eid%3D1189496189-1535005502-null%26ntime%3D1537408156; SkyMark=User%5Flog%5Ftmark=2018%2D9%2D20+10%3A47%3A50&User%5Flog%5Fcmark=%C9%CF%BA%A3%40%40%40%D5%A2%B1%B1&KUser%5FMark=0&User%5Fregt%5FMark=&User%5FOut%5FMark=2018%2D9%2D18+14%3A04%3A29&AdsMyIdFromURLto%5FCook=998; Sky=User%5FActivi%5FLastTime=2018%2D9%2D20+11%3A12%3A37&Froms=%C9%CF%BA%A3%40%40%40%D5%A2%B1%B1&User%5FPassword=72d9932b6bb7682defada5fe85483be8&Sex=%C4%D0&User%5FClassID=0&User%5FStates=1%240%242%24998&VipCid=%2D1&User%5FName=13291413631&User%5FID=25386196',
    'Host': 'www.ylike.com',
    'Referer': 'http://www.ylike.com/',
    'Upgrade-Insecure-Requests': '1',
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/69.0.3497.12 Safari/537.36',
}

res1 = requests.get(url=s_url, headers=uheaders)
print(res1.status_code)
print(res1.content.decode('gb2312'))  # the site serves gb2312-encoded HTML
Console output:
200
<script language="javascript">alert("请在登录后使用!");window.location.href='/UserLogin.do?a=unlogin_getSearchMemberList';</script>
The alert says "please log in before using this". But when I replay the request in Fiddler with the exact same headers, the data comes back fine.
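
To confirm that requests really sends the same bytes as the browser, one option (only a debugging sketch, assuming Fiddler is still listening on its default 127.0.0.1:8888) is to route the script through Fiddler as a proxy and compare the two captures side by side:

import requests

# Route the script's traffic through Fiddler so its request can be compared
# byte-for-byte with the browser's request in the capture window.
# 127.0.0.1:8888 is Fiddler's default listening address; change it if yours differs.
proxies = {
    'http': 'http://127.0.0.1:8888',
    'https': 'http://127.0.0.1:8888',
}

# s_url and uheaders are the same variables defined in the snippet above
res_dbg = requests.get(url=s_url, headers=uheaders, proxies=proxies)
print(res_dbg.status_code)

Any difference in the Cookie header (or in headers requests adds on its own) should show up immediately in the capture.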


Very likely one of the values in that cookie is single-use: the moment you replay it, it has already been invalidated. The server normally hands that value to you on an earlier page, so go back through your capture and try to find where it is issued.
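
Building on that point: instead of hard-coding a Cookie header captured at one moment in time, it is usually more reliable to let a requests.Session manage the cookies, seed it once with the values you already have, and visit the page that normally precedes the search request so the server gets a chance to hand out (and rotate) any single-use value. A rough sketch, assuming the search is normally reached from http://www.ylike.com/ (the Referer in your headers) and that raw_cookie holds the full Cookie string from your snippet:

import requests

session = requests.Session()
session.headers.update({
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 '
                  '(KHTML, like Gecko) Chrome/69.0.3497.12 Safari/537.36',
    'Referer': 'http://www.ylike.com/',
})

# Seed the session's cookie jar with the captured login cookies once;
# any Set-Cookie the server sends later will update the jar automatically.
raw_cookie = '...'  # paste the full Cookie header value from the snippet above
for pair in raw_cookie.split('; '):
    name, _, value = pair.partition('=')
    session.cookies.set(name, value, domain='www.ylike.com')

# Visit the page that normally comes before the search request, so the
# server can issue a fresh value if one of the cookies is single-use.
session.get('http://www.ylike.com/')

# Now request the search URL (s_url as defined in the question) with the
# cookies the session currently holds.
res = session.get(s_url)
print(res.status_code)
print(res.content.decode('gb2312', errors='replace'))

If the response still bounces to /UserLogin.do, the rotating value is probably issued on some other page, so go back to the capture and look for where it first appears.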