Diagnosing proxy problems with Python

Votes: 0 · Answers: 2

So I'm trying to do various things with Python 2.7 that need to pull data from the Internet. I haven't had much success, and I'm looking for help diagnosing what I'm doing wrong.

First, I got pip working by specifying the proxy: pip install --proxy=http://username:[email protected]:8080 numpy. So Python must be able to get through it!
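One thing worth noting: pip's --proxy flag is equivalent to setting the standard http_proxy / https_proxy environment variables, which urllib also reads when building its default opener. A minimal sketch (Python 3 syntax; host and credentials are placeholders):

```python
import os
import urllib.request  # urllib2's successor; in 2.7, urllib.getproxies()

# Placeholder proxy URL -- substitute your real credentials and host.
proxy_url = "http://username:[email protected]:8080"

# Setting these before any request makes most HTTP tooling
# (pip, urllib, requests) pick up the proxy automatically.
os.environ["http_proxy"] = proxy_url
os.environ["https_proxy"] = proxy_url

# urllib reads the environment when deciding how to route requests.
print(urllib.request.getproxies()["http"])
```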

However, when I actually write a .py script that should do the same thing, I have no success. I first tried the following code with urllib2:

import urllib2

uri = "http://www.python.org"
http_proxy_server = "someproxyserver.com"
http_proxy_port = "8080"
http_proxy_realm = http_proxy_server
http_proxy_user = "username"
http_proxy_passwd = "password"

# Next line = "http://username:[email protected]:8080"
http_proxy_full_auth_string = "http://%s:%s@%s:%s" % (http_proxy_user,
                                                      http_proxy_passwd,
                                                      http_proxy_server,
                                                      http_proxy_port)

def open_url_no_proxy():
    urllib2.urlopen(uri)

    print "Apparent success without proxy server!"    

def open_url_installed_opener():
    proxy_handler = urllib2.ProxyHandler({"http": http_proxy_full_auth_string})

    opener = urllib2.build_opener(proxy_handler)
    urllib2.install_opener(opener)
    urllib2.urlopen(uri)

    print "Apparent success through proxy server!"

if __name__ == "__main__":
    open_url_no_proxy()
    open_url_installed_opener()

But I just got this error:

URLError: <urlopen error [Errno 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond>
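Errno 10060 is a plain connection timeout, which usually means the request never reached the proxy at all (e.g. wrong host/port, or the opener wasn't used), rather than the proxy rejecting the login. One variant worth trying is urllib2's dedicated ProxyBasicAuthHandler instead of embedding the credentials in the proxy URL; a sketch, shown here in Python 3 urllib.request syntax (the handler names are the same in urllib2), with placeholder credentials:

```python
import urllib.request  # urllib2 in Python 2.7

proxy_host = "someproxyserver.com:8080"  # placeholder proxy

# Register credentials with a password manager instead of
# embedding them in the proxy URL string.
password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
password_mgr.add_password(None, proxy_host, "username", "password")

# ProxyHandler routes traffic through the proxy; ProxyBasicAuthHandler
# answers the proxy's 407 challenge using the password manager.
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"http": "http://" + proxy_host}),
    urllib.request.ProxyBasicAuthHandler(password_mgr),
)

# The password manager returns the stored credentials for the proxy host.
print(password_mgr.find_user_password(None, proxy_host))
```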

I then tried urllib3, since that is the module pip uses to handle proxies:

from urllib3 import ProxyManager, make_headers

# Establish the Authentication Settings
default_headers = make_headers(basic_auth='username:password')
http = ProxyManager("https://www.proxy.com:8080/", headers=default_headers)

# Now you can use `http` as you would a normal PoolManager
r = http.request('GET', 'https://www.python.org/')

# Check data is from destination
print(r.data)

I got this error:

raise MaxRetryError(_pool, url, error or ResponseError(cause)) MaxRetryError: HTTPSConnectionPool(host='www.python.org', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', error('Tunnel connection failed: 407 Proxy Authorization Required',)))
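Unlike the earlier timeout, a "407 Proxy Authorization Required" shows the request did reach the proxy, but the credentials never arrived with the CONNECT tunnel request: passing headers= attaches them to the request sent to the target site, not to the proxy handshake. What the proxy expects is a Proxy-Authorization header whose value is "Basic " plus the base64 of "user:password"; a stdlib sketch of that value (credentials are placeholders):

```python
import base64

# Placeholder credentials -- whatever your proxy account uses.
credentials = "username:password"

# HTTP Basic auth: "Basic " + base64("user:password").
token = base64.b64encode(credentials.encode("ascii")).decode("ascii")
proxy_auth_header = {"Proxy-Authorization": "Basic " + token}

print(proxy_auth_header)
```

A 407 means a header like this never reached the proxy's CONNECT handshake.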

I would greatly appreciate any help diagnosing this problem.

python-2.7 proxy urllib2 urllib3
2 Answers

1 vote

The solution to my problem was to use the requests module; see this thread: Proxies with Python 'Requests' module

mtt2p there listed this code, which worked for me.

import requests
import time

class BaseCheck():
    def __init__(self, url):
        # Same authenticated proxy for every scheme (placeholder credentials).
        self.http_proxy  = "http://user:pw@proxy:8080"
        self.https_proxy = "http://user:pw@proxy:8080"
        self.ftp_proxy   = "http://user:pw@proxy:8080"
        self.proxyDict = {
            "http"  : self.http_proxy,
            "https" : self.https_proxy,
            "ftp"   : self.ftp_proxy
        }
        self.url = url

        # Simple global timing table: one start/end pair per step.
        def makearr(tsteps):
            global stemps
            global steps
            stemps = {}
            for step in tsteps:
                stemps[step] = {'start': 0, 'end': 0}
            steps = tsteps
        makearr(['init', 'check'])

        def starttime(typ=""):
            for stemp in stemps:
                if typ == "":
                    stemps[stemp]['start'] = time.time()
                else:
                    stemps[stemp][typ] = time.time()
        starttime()

    def __str__(self):
        return str(self.url)

    def getrequests(self):
        # The proxies= dict routes the request through the proxy.
        g = requests.get(self.url, proxies=self.proxyDict)
        print g.status_code
        print g.content
        print self.url
        stemps['init']['end'] = time.time()
        # Elapsed time for the request.
        x = stemps['init']['end'] - stemps['init']['start']
        print x


test = BaseCheck(url='http://google.com')
test.getrequests()
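One gotcha with the user:pw@host proxy strings used above: if the password contains reserved URL characters such as @ or :, the URL parses wrongly and the proxy sees garbage credentials. Percent-encoding the credentials first avoids this; a sketch with made-up credentials (Python 3 syntax; in 2.7 it's urllib.quote):

```python
import urllib.parse

user = "username"
password = "p@ss:w0rd"  # made-up password containing reserved characters

# Percent-encode so '@' and ':' can't be mistaken for URL delimiters.
proxy = "http://%s:%s@proxy:8080" % (
    urllib.parse.quote(user, safe=""),
    urllib.parse.quote(password, safe=""),
)

# The dict you would pass to requests' proxies= argument.
proxies = {"http": proxy, "https": proxy}
print(proxy)
```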

0 votes

I think you need to set proxy_headers in ProxyManager, not headers:

default_headers = urllib3.util.make_headers(proxy_basic_auth='user:passwd')
http = urllib3.ProxyManager(proxyUrl, proxy_headers=default_headers, ca_certs=certifi.where())
http.request('GET', url)

See urllib3.poolmanager.ProxyManager in the docs: https://urllib3.readthedocs.io/en/latest/reference/
