
Apologies for the very long post, but I'm really trying to be thorough...

I have a dedicated web site that serves as a bridge for exchanging data between various environmental models operated from remote servers and running on different types of OSes (Linux, macOS and Windows). Basically, each server can upload/download data files to the web site, and the files are then used for further processing with a different model on another server.

The web site has some basic protection (IP filtering, password and SSL using Let's Encrypt certificates). All the remote servers can access the site and upload/download data through a simple web interface that we have created.

Now we are trying to automate some of the exchange with a simple Python (2.7) daemon (based on the requests module). The daemon monitors certain folders and uploads their content to the web site, roughly as in the sketch below.
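For reference, the upload call the daemon makes looks roughly like this (the endpoint and credentials below are placeholders, not our real configuration):

import requests

# Hypothetical endpoint and credentials, for illustration only.
UPLOAD_URL = 'https://example.org/upload'

with open('output.dat', 'rb') as f:
    response = requests.post(
        UPLOAD_URL,
        files={'file': f},          # multipart/form-data upload
        auth=('user', 'password'),  # HTTP basic auth, hypothetical credentials
    )
response.raise_for_status()         # raise on 4xx/5xx responses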

The daemon works fine on all of the remote servers, except for one running Windows 7 Enterprise 64-bit. This server has Python 2.7.13 installed and the following packages: DateTime (4.1.1), psutil (5.2.0), pytz (2016.10), requests (2.13.0), zope.interface (4.3.3).

From this server the SSL connection works fine through a web browser, but the daemon always returns:

raise SSLError(e, request=request)
requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:661)

Here is what we tried so far (sketched in code after the list):

  • setting verify=False. This works fine, but we cannot use it in our final production environment.
  • copying the certificate from another server where the daemon works, and setting verify=(name of the certificate file) (no success).
  • setting the 'User-Agent' header to the exact same string that we get from the Windows machine on the web site when the connection is made with a web browser (no success).
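In code, the attempts looked roughly like this (simplified; the URL is a placeholder for our site):

import requests

url = 'https://example.org/upload'  # placeholder for our site's URL

requests.get(url, verify=False)          # 1. works, but not acceptable in production
requests.get(url, verify='cacert.pem')   # 2. bundle copied from a working server; still fails
requests.get(url,
             headers={'User-Agent': 'Mozilla/5.0 ...'})  # 3. browser's UA string; still fails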

What other settings should we be looking at on the Windows server to try to solve the problem? Could it be a firewall setting that somehow allows the browser's SSL connection through but blocks the Python daemon?

UPDATE
The organization that is running the Windows remote server that was producing the error substitutes all SSL certificates at the proxy level.
Their IT people solved our problem by adding the URL of our web site to the list of "pass through" sites in their proxy settings.

This works and it's fine for now. However, I'm wondering if we could have handled the certificate substitution directly in Python, perhaps along the lines of the sketch below...
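For example, I imagine we could have exported the proxy's root CA certificate (the one it uses to sign the substituted certificates) from the Windows certificate store and pointed verify= at it. A rough sketch, untested on our side; the path and URL are placeholders:

import requests

# Hypothetical path: the proxy's root CA, exported from the Windows
# certificate store (certmgr.msc -> export as Base-64 encoded X.509,
# saved with a .pem extension).
PROXY_CA = r'C:\certs\proxy-root-ca.pem'

# verify= accepts a path to a CA bundle, so requests would then trust
# the certificates the proxy substitutes.
response = requests.get('https://example.org/data', verify=PROXY_CA)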

2 Answers


It is possible to get the Requests library to use Python's built-in ssl module to make the SSL portion of the HTTP connection. This works because the urllib3 utilities that Requests uses allow a Python SSLContext to be passed into them.

However, note that this may depend on the necessary certificates already having been loaded into the trust store by a previous Windows access (see this comment).

Some sample code follows (this needs a recent version of Requests; it works with 2.18.4):

import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.util.ssl_ import create_urllib3_context

class SSLContextAdapter(HTTPAdapter):
    def init_poolmanager(self, *args, **kwargs):
        context = create_urllib3_context()
        kwargs['ssl_context'] = context
        context.load_default_certs()  # this loads the OS defaults on Windows
        return super(SSLContextAdapter, self).init_poolmanager(*args, **kwargs)

s = requests.Session()
adapter = SSLContextAdapter()
s.mount('https://myinternalsite', adapter)  # use this adapter for all URLs under this prefix
response = s.get('https://myinternalsite')

11 Comments

Doesn't work for me. I'm using Requests v2.19.1, and it gives me this error: 'PyOpenSSLContext' object has no attribute 'load_default_certs'
For Python 3.6.5 and requests 2.19.1, I had to replace the create_urllib3_context import with import ssl and then change the context's assignment to be context = ssl.create_default_context() (see the sketch after these comments).
Thanks for the feedback. On Windows 10, using Python 2.7.15 and requests 2.21.0, the create_urllib3_context import and code above still works correctly for me.
I had it working with Python 3 and only tried it against requests 2.12.4 and 2.19.1. I removed my comment as @jakob.j has the better solution.
@Josh, Every call to requests.get(), etc. always uses a session behind the scenes. So, there's no way to use requests without a session.
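For reference, a minimal sketch of the variant described in the comments above, assuming Python 3 (the mounted URL is the same placeholder used in the answer):

import ssl
import requests
from requests.adapters import HTTPAdapter

class SSLContextAdapter(HTTPAdapter):
    def init_poolmanager(self, *args, **kwargs):
        # create_default_context() loads the system CA certificates
        # (including the Windows certificate store on Python 3).
        kwargs['ssl_context'] = ssl.create_default_context()
        return super().init_poolmanager(*args, **kwargs)

s = requests.Session()
s.mount('https://myinternalsite', SSLContextAdapter())
response = s.get('https://myinternalsite')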

Requests doesn't use your Windows root CA store like your browser does.

From the docs: By default, Requests bundles a set of root CAs that it trusts, sourced from the Mozilla trust store. However, these are only updated once for each Requests version.

This list of trusted CAs can also be specified through the REQUESTS_CA_BUNDLE environment variable.
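For example (a minimal sketch; the bundle path and URL are placeholders):

import os
import requests

# Hypothetical path to a CA bundle; requests consults REQUESTS_CA_BUNDLE
# at request time (when trust_env is enabled, which is the default).
os.environ['REQUESTS_CA_BUNDLE'] = r'C:\certs\cacert.pem'

r = requests.get('https://example.org')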

You can literally do this:

cafile = 'cacert.pem' # http://curl.haxx.se/ca/cacert.pem
r = requests.get(url, verify=cafile)

Or you can use certifi if your CA cert is signed by a public entity.
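For instance (a sketch; the URL is a placeholder):

import certifi
import requests

# certifi.where() returns the path to certifi's Mozilla-derived CA bundle,
# which works when the server's certificate chains to a public CA.
r = requests.get('https://example.org', verify=certifi.where())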

9 Comments

Thanks for your reply. We already tried this approach (second bullet point), but without success. I edited my question to make that clear.
Which certificate did you copy? You need the issuing certificate authority cert, not the webserver cert.
We copied the cacert.pem file from another Windows server where the daemon works, from C:\Python27\lib\site-packages\requests\cacert.pem.
If it is working on another machine, with the same code, then there's something up with the file, like getting mangled on copy. Otherwise I'm not sure.
The file is fine... we tried the same file on other Windows machines to double check. That's why we are questioning other possible settings at the OS level...
