
How to batch-scrape IP proxies with a Python crawler

This article walks through how to batch-scrape free IP proxies with a Python crawler. The explanation is short and example-driven: we fetch a proxy listing page, parse it, test each candidate proxy, and save the live ones to a local file.


To that end, we write a Python program that fetches IP proxies and saves them locally.

Python version: 3.6.3
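For context, `requests` routes traffic through a proxy when you pass it a `proxies` dict mapping a URL scheme to a proxy URL. This is exactly the format the script below builds and tests (the addresses here are illustrative placeholders, not live proxies):

```python
# The proxies dict maps each URL scheme to the proxy that should handle it.
# These addresses are made-up examples for illustration only.
proxies = {
    'http': 'http://10.10.1.10:3128',
    'https': 'https://10.10.1.10:1080',
}

# A request is then routed through the matching entry, e.g.:
# requests.get('http://example.com', proxies=proxies, timeout=3)
print(sorted(proxies))
```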

# grab ip proxies from xicidaili
import sys, time, requests
from multiprocessing.dummy import Pool as ThreadPool
from lxml import etree

IP_POOL = 'ip_pool.py'
URL = 'http://www.xicidaili.com/nn/'   # high-anonymity proxies
#URL = 'http://www.xicidaili.com/wt/'  # http proxies
RUN_TIME = time.strftime("%Y-%m-%d %H:%M", time.localtime())  # run timestamp

# live proxies, keyed by scheme
alive_ip = {'http': [], 'https': []}
# thread pool for concurrent liveness checks
pool = ThreadPool(20)

# fetch a page and return its html text
def get_html(url):
    headers = {
        "User-Agent": "Mozilla/5.0 (Windows NT 10.0; WOW64; rv:55.0) Gecko/20100101 Firefox/55.0",
        "Accept": "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8",
        "Accept-Language": "zh-CN,zh;q=0.8,en-US;q=0.5,en;q=0.3",
        "Accept-Encoding": "gzip, deflate",
        "Referer": "https://www.xicidaili.com/",
        "Connection": "keep-alive",
        "Upgrade-Insecure-Requests": "1"
    }
    r = requests.get(url, headers=headers)
    r.encoding = 'utf-8'
    return r.text

# test whether an ip proxy is alive
def test_alive(proxy):
    global alive_ip
    # use the proxy for its own scheme, so https proxies are actually exercised
    scheme = 'https' if proxy.startswith('https') else 'http'
    proxies = {scheme: proxy}
    try:
        r = requests.get('https://www.baidu.com', proxies=proxies, timeout=3)
        if r.status_code == 200:
            alive_ip[scheme].append(proxy)
    except requests.RequestException:
        print("%s is dead!" % proxy)

# parse the html and collect candidate ip proxies
def get_alive_ip_address():
    iplist = []
    html = get_html(URL)
    selector = etree.HTML(html)
    table = selector.xpath('//table[@id="ip_list"]')[0]
    lines = table.xpath('./tr')[1:]
    for line in lines:
        speed, connect_time = line.xpath('.//div/@title')
        data = line.xpath('./td')
        ip = data[1].xpath('./text()')[0]
        port = data[2].xpath('./text()')[0]
        anonymous = data[4].xpath('./text()')[0]
        ip_type = data[5].xpath('./text()')[0]
        # skip slow (over 1 second) or non-high-anonymity proxies
        if float(speed[:-1]) > 1 or float(connect_time[:-1]) > 1 or anonymous != '高匿':
            continue
        iplist.append(ip_type.lower() + '://' + ip + ':' + port)
    pool.map(test_alive, iplist)

# write the live ip proxies to a local file
def write_txt(output_file):
    with open(output_file, 'w') as f:
        f.write('#create time: %s\n\n' % RUN_TIME)
        f.write('http_ip_pool = \\\n')
        f.write(str(alive_ip['http']).replace(',', ',\n'))
        f.write('\n\n')
        f.write('https_ip_pool = \\\n')
        f.write(str(alive_ip['https']).replace(',', ',\n'))
    print('write successful: %s' % output_file)

def main():
    get_alive_ip_address()
    write_txt(output_file)

if __name__ == '__main__':
    try:
        output_file = sys.argv[1]  # first argument is the output filename
    except IndexError:
        output_file = IP_POOL
    main()
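A note on the concurrency above: `multiprocessing.dummy` provides the `multiprocessing` API backed by threads rather than processes, which suits this script because the liveness checks are I/O-bound. A quick self-contained illustration of the `Pool.map` pattern the script relies on:

```python
from multiprocessing.dummy import Pool as ThreadPool

def square(x):
    return x * x

# map() distributes the work across 4 worker threads and
# returns the results in input order.
pool = ThreadPool(4)
results = pool.map(square, [1, 2, 3, 4])
pool.close()
pool.join()
print(results)  # → [1, 4, 9, 16]
```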

Run the program:

root@c:test$ python get_ip_proxies.py

write successful: ip_pool.py

View the file:

root@c:test$ vim ip_pool.py

#create time: 2019-03-14 19:53

http_ip_pool = \
['http://183.148.152.1:9999',
 'http://112.85.165.234:9999',
 'http://112.87.69.162:9999',
 'http://111.77.197.10:9999',
 'http://113.64.94.80:8118',
 'http://61.184.109.33:61320',
 'http://125.126.204.82:9999',
 'http://125.126.218.8:9999',
 'http://36.26.224.56:9999',
 'http://123.162.168.192:40274',
 'http://116.209.54.125:9999',
 'http://183.148.148.211:9999',
 'http://111.177.161.111:9999',
 'http://116.209.58.245:9999',
 'http://183.148.143.38:9999',
 'http://116.209.55.218:9999',
 'http://114.239.250.15:9999',
 'http://116.209.54.109:9999',
 'http://125.123.143.98:9999',
 'http://183.6.130.6:8118',
 'http://183.148.143.166:9999',
 'http://125.126.203.228:9999',
 'http://111.79.198.74:9999',
 'http://116.209.53.215:9999',
 'http://112.87.69.124:9999',
 'http://112.80.198.13:8123',
 'http://182.88.160.16:8123',
 'http://116.209.56.24:9999',
 'http://112.85.131.25:9999',
 'http://116.209.52.234:9999',
 'http://175.165.128.223:1133',
 'http://122.4.47.199:8010',
 'http://112.85.170.204:9999',
 'http://49.86.178.206:9999',
 'http://125.126.215.187:9999']

https_ip_pool = \
['https://183.148.156.98:9999',
 'https://111.79.199.167:808',
 'https://61.142.72.150:39894',
 'https://119.254.94.71:42788',
 'https://221.218.102.146:33323',
 'https://122.193.246.29:9999',
 'https://183.148.139.173:9999',
 'https://60.184.194.157:3128',
 'https://118.89.138.129:52699',
 'https://112.87.71.67:9999',
 'https://58.56.108.226:43296',
 'https://182.207.232.135:50465',
 'https://111.177.186.32:9999',
 'https://58.210.133.98:32741',
 'https://115.221.116.71:9999',
 'https://183.148.140.191:9999',
 'https://183.148.130.143:9999',
 'https://116.209.54.84:9999',
 'https://125.126.219.125:9999',
 'https://112.85.167.158:9999',
 'https://112.85.173.76:9999',
 'https://60.173.244.133:41306',
 'https://183.148.147.223:9999',
 'https://116.209.53.68:9999',
 'https://111.79.198.102:9999',
 'https://123.188.5.11:1133',
 'https://60.190.66.131:56882',
 'https://112.85.168.140:9999',
 'https://110.250.65.108:8118',
 'https://221.208.39.160:8118',
 'https://116.209.53.77:9999',
 'https://116.209.58.29:9999',
 'https://183.148.141.129:9999',
 'https://124.89.33.59:53281',
 'https://116.209.57.149:9999',
 'https://58.62.238.150:32431',
 'https://218.76.253.201:61408']

After that, the pools can be imported directly:

from ip_pool import http_ip_pool, https_ip_pool
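A typical way to consume the pool is to pick a proxy at random per request, rotating across requests so no single proxy is hammered. A minimal sketch (using an inline sample list in place of the real `ip_pool` import, so it runs standalone):

```python
import random

# Stand-in for `from ip_pool import http_ip_pool`,
# so this sketch is self-contained.
http_ip_pool = ['http://183.148.152.1:9999',
                'http://112.85.165.234:9999',
                'http://112.87.69.162:9999']

proxy = random.choice(http_ip_pool)  # rotate proxies across requests
proxies = {'http': proxy}            # the format requests expects
print(proxies)

# The actual request would then be:
# requests.get('http://example.com', proxies=proxies, timeout=3)
```

Free proxies die quickly, so in practice you would wrap the request in a try/except and fall back to another proxy on failure.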

That concludes "how to batch-scrape IP proxies with a Python crawler". Free proxy lists change constantly, so verify the scraped addresses in your own environment before relying on them.
