Some data cannot be read directly off a page and has to be collected by scraping. When people hear the word "index" they often think it sounds complicated, something you only meet with stocks, where analysing rises and falls is already a tricky problem. In fact, an index is very useful for data analysis, so this post walks through how to scrape one, the Baidu Index, with a Python crawler.
I happened to need this crawler again recently and found that the Baidu Index request had changed a little, so I updated the script:
import requests
import sys
import time

word_url = 'http://index.baidu.com/api/SearchApi/thumbnail?area=0&word={}'

COOKIES = ''  # paste the Cookie string from a logged-in index.baidu.com session here

headers = {
    'Accept': 'application/json, text/plain, */*',
    'Accept-Encoding': 'gzip, deflate',
    'Accept-Language': 'zh-CN,zh;q=0.9',
    'Cache-Control': 'no-cache',
    'Cookie': COOKIES,
    'DNT': '1',
    'Host': 'index.baidu.com',
    'Pragma': 'no-cache',
    'Proxy-Connection': 'keep-alive',
    'Referer': 'http://index.baidu.com/v2/main/index.html',
    'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/75.0.3770.90 Safari/537.36',
    'X-Requested-With': 'XMLHttpRequest',
}


def decrypt(t, e):
    # t is the ptbk key string, e is the encrypted data string.
    # The first half of t lists the cipher characters, the second half the
    # characters they stand for; each character of e is looked up in that map.
    n = list(t)
    a = {}
    result = []
    ln = int(len(n) / 2)
    start = n[ln:]
    end = n[:ln]
    for j, k in zip(start, end):
        a.update({k: j})
    for j in e:
        result.append(a.get(j))
    return ''.join(result)


def get_ptbk(uniqid):
    # Fetch the decryption key (ptbk) matching the uniqid of an index response.
    url = 'http://index.baidu.com/Interface/ptbk?uniqid={}'
    resp = requests.get(url.format(uniqid), headers=headers)
    if resp.status_code != 200:
        print('Failed to get ptbk')
        sys.exit(1)
    return resp.json().get('data')


def get_index_data(keyword, start='2011-01-03', end='2019-08-05'):
    # The API expects the keyword list as JSON, so switch to double quotes.
    keyword = str(keyword).replace("'", '"')
    url = f'http://index.baidu.com/api/SearchApi/index?area=0&word={keyword}&startDate={start}&endDate={end}'
    resp = requests.get(url, headers=headers)
    if resp.status_code != 200:
        print('Failed to get index data')
        sys.exit(1)
    content = resp.json()
    data = content.get('data')
    user_indexes = data.get('userIndexes')[0]
    uniqid = data.get('uniqid')
    ptbk = get_ptbk(uniqid)
    while ptbk is None or ptbk == '':
        ptbk = get_ptbk(uniqid)
    all_data = user_indexes.get('all').get('data')
    result = decrypt(ptbk, all_data)
    result = result.split(',')
    print(result)


if __name__ == '__main__':
    words = [[{"name": "酷安", "wordType": 1}]]
    get_index_data(words)
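To make the decrypt helper a bit less magical: the ptbk string is a substitution key whose first half lists the cipher characters and whose second half lists the characters they stand for, and every character of the encrypted data is replaced through that mapping. A quick toy check; the key and data below are made up for illustration only, the real strings Baidu returns are much longer:

# Hypothetical key: the first half 'abcd' are cipher characters,
# the second half '123,' are the plain characters they map to.
ptbk = 'abcd123,'
encrypted = 'abdbc'              # made-up encrypted daily values
print(decrypt(ptbk, encrypted))  # prints '12,23'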
Output:
Run the code and you get the index values we were after. The same approach also works for stock figures and other data you might want to track, and a Python crawler is a good fit for this kind of job, so feel free to try it yourself; a small sketch of turning the decrypted values into dated points follows below.
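If you want to analyse or plot the numbers rather than just print them, one option is to pair each value with its date. This is a minimal sketch, assuming get_index_data is changed to return result instead of printing it; to_series is a hypothetical helper, not part of the original script, and it assumes the list holds one value per day:

from datetime import date, timedelta

def to_series(values, start='2019-01-01'):
    # Pair each decrypted value with a calendar day, assuming one value
    # per day beginning at `start` (how the "all" data is laid out).
    day = date.fromisoformat(start)
    series = []
    for v in values:
        series.append((day.isoformat(), int(v) if v else 0))
        day += timedelta(days=1)
    return series

# e.g. values = get_index_data(words, start='2019-01-01', end='2019-08-05')
#      print(to_series(values)[:5])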