utils

This module provides utility functions that are used within Requests and are also useful for external consumption.

Module Contents

Functions

dict_to_sequence(d) Returns an internal sequence dictionary update.
super_len(o)
get_netrc_auth(url,raise_errors=False) Returns the Requests tuple auth for a given url from netrc.
guess_filename(obj) Tries to guess the filename of the given object.
from_key_val_list(value) Take an object and test to see if it can be represented as a dictionary; if so, return an OrderedDict.
to_key_val_list(value) Take an object and test to see if it can be represented as a dictionary; if so, return a list of tuples.
parse_list_header(value) Parse lists as described by RFC 2068 Section 2.
parse_dict_header(value) Parse lists of key, value pairs as described by RFC 2068 Section 2 and convert them into a python dict.
unquote_header_value(value,is_filename=False) Unquotes a header value. (Reversal of quote_header_value()).
dict_from_cookiejar(cj) Returns a key/value dictionary from a CookieJar.
add_dict_to_cookiejar(cj,cookie_dict) Returns a CookieJar from a key/value dictionary.
get_encodings_from_content(content) Returns encodings from given content string.
get_encoding_from_headers(headers) Returns encodings from given HTTP Header Dict.
stream_decode_response_unicode(iterator,r) Stream decodes an iterator.
iter_slices(string,slice_length) Iterate over slices of a string.
get_unicode_from_response(r) Returns the requested content back in unicode.
unquote_unreserved(uri) Un-escape any percent-escape sequences in a URI that are unreserved characters.
requote_uri(uri) Re-quote the given URI.
address_in_network(ip,net) Check whether an IP address belongs to a network subnet.
dotted_netmask(mask) Converts a mask from /xx format to xxx.xxx.xxx.xxx
is_ipv4_address(string_ip)
is_valid_cidr(string_network) Very simple check of the CIDR format in the no_proxy variable.
should_bypass_proxies(url) Returns whether we should bypass proxies or not.
get_environ_proxies(url) Return a dict of environment proxies.
select_proxy(url,proxies) Select a proxy for the url, if applicable.
default_user_agent(name="python-requests") Return a string representing the default user agent.
default_headers()
parse_header_links(value) Return a list of parsed link headers.
guess_json_utf(data)
prepend_scheme_if_needed(url,new_scheme) Given a URL that may or may not have a scheme, prepend the given scheme.
get_auth_from_url(url) Given a url with authentication components, extract them into a tuple of username, password.
to_native_string(string,encoding="ascii") Given a string object, regardless of type, returns a representation of that string in the native string type.
urldefragauth(url) Given a url, remove the fragment and the authentication part.
dict_to_sequence(d)

Returns an internal sequence dictionary update.

super_len(o)
get_netrc_auth(url, raise_errors=False)

Returns the Requests tuple auth for a given url from netrc.

guess_filename(obj)

Tries to guess the filename of the given object.

from_key_val_list(value)

Take an object and test to see if it can be represented as a dictionary. If it can be, return an OrderedDict, e.g.,

>>> from_key_val_list([('key', 'val')])
OrderedDict([('key', 'val')])
>>> from_key_val_list('string')
ValueError: need more than 1 value to unpack
>>> from_key_val_list({'key': 'val'})
OrderedDict([('key', 'val')])
to_key_val_list(value)

Take an object and test to see if it can be represented as a dictionary. If it can be, return a list of tuples, e.g.,

>>> to_key_val_list([('key', 'val')])
[('key', 'val')]
>>> to_key_val_list({'key': 'val'})
[('key', 'val')]
>>> to_key_val_list('string')
ValueError: cannot encode objects that are not 2-tuples.
parse_list_header(value)

Parse lists as described by RFC 2068 Section 2.

In particular, parse comma-separated lists where the elements of the list may include quoted-strings. A quoted-string could contain a comma. A non-quoted string could have quotes in the middle. Quotes are removed automatically after parsing.

It works much like parse_set_header(), except that items may appear multiple times and case sensitivity is preserved.

The return value is a standard list:

>>> parse_list_header('token, "quoted value"')
['token', 'quoted value']

To create a header from the list again, use the dump_header() function.

Parameters: value – a string with a list header.
Returns: list
parse_dict_header(value)

Parse lists of key, value pairs as described by RFC 2068 Section 2 and convert them into a python dict:

>>> d = parse_dict_header('foo="is a fish", bar="as well"')
>>> type(d) is dict
True
>>> sorted(d.items())
[('bar', 'as well'), ('foo', 'is a fish')]

If there is no value for a key it will be None:

>>> parse_dict_header('key_without_value')
{'key_without_value': None}

To create a header from the dict again, use the dump_header() function.

Parameters: value – a string with a dict header.
Returns: dict
unquote_header_value(value, is_filename=False)

Unquotes a header value. (Reversal of quote_header_value().) This does not use real unquoting, but mimics what browsers actually do when quoting.

Parameters: value – the header value to unquote.
dict_from_cookiejar(cj)

Returns a key/value dictionary from a CookieJar.

Parameters: cj – CookieJar object to extract cookies from.
add_dict_to_cookiejar(cj, cookie_dict)

Returns a CookieJar from a key/value dictionary.

Parameters:
  • cj – CookieJar to insert cookies into.
  • cookie_dict – Dict of key/values to insert into CookieJar.
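The round trip between a plain dict and a CookieJar can be sketched with only the standard library; the `make_cookie` helper and its default fields below are illustrative assumptions, not part of Requests:

```python
from http.cookiejar import Cookie, CookieJar

def make_cookie(name, value):
    # Illustrative helper: build a Cookie with mostly-default fields.
    return Cookie(
        version=0, name=name, value=value, port=None, port_specified=False,
        domain="example.com", domain_specified=True, domain_initial_dot=False,
        path="/", path_specified=True, secure=False, expires=None,
        discard=True, comment=None, comment_url=None, rest={},
    )

cj = CookieJar()
cj.set_cookie(make_cookie("session", "abc123"))

# dict_from_cookiejar essentially flattens the jar to name/value pairs:
cookie_dict = {cookie.name: cookie.value for cookie in cj}
print(cookie_dict)  # {'session': 'abc123'}
```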
get_encodings_from_content(content)

Returns encodings from given content string.

Parameters: content – bytestring to extract encodings from.
get_encoding_from_headers(headers)

Returns encodings from given HTTP Header Dict.

Parameters: headers – dictionary to extract encoding from.
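The idea can be sketched as a small helper that pulls `charset=` out of a Content-Type value; the function name and fallback choices here are assumptions for illustration, not the Requests implementation:

```python
def encoding_from_headers(headers):
    # Hypothetical helper: extract the charset parameter from Content-Type.
    content_type = headers.get("content-type", "")
    for part in content_type.split(";")[1:]:
        key, _, val = part.strip().partition("=")
        if key.lower() == "charset":
            return val.strip("\"'")
    if content_type.startswith("text/"):
        return "ISO-8859-1"  # historical HTTP default for text/* content
    return None

print(encoding_from_headers({"content-type": "text/html; charset=utf-8"}))  # utf-8
```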
stream_decode_response_unicode(iterator, r)

Stream decodes an iterator.

iter_slices(string, slice_length)

Iterate over slices of a string.
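A minimal sketch of fixed-size slicing as a generator (the body below is an assumption about the behavior, written from the one-line description):

```python
def iter_slices(string, slice_length):
    # Yield successive fixed-size chunks; the last chunk may be shorter.
    pos = 0
    while pos < len(string):
        yield string[pos:pos + slice_length]
        pos += slice_length

print(list(iter_slices("abcdefg", 3)))  # ['abc', 'def', 'g']
```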

get_unicode_from_response(r)

Returns the requested content back in unicode.

Parameters: r – Response object to get unicode content from.

Tried:

  1. charset from content-type
  2. fall back and replace all unicode characters
unquote_unreserved(uri)

Un-escape any percent-escape sequences in a URI that are unreserved characters. This leaves all reserved, illegal and non-ASCII bytes encoded.
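A sketch of this selective decoding, assuming the unreserved set from RFC 3986 (letters, digits, `-._~`); reserved and non-ASCII escapes pass through untouched:

```python
import re

UNRESERVED = set(
    "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789-._~"
)

def unquote_unreserved(uri):
    # Decode %XX only when the decoded character is unreserved.
    def replace(match):
        char = chr(int(match.group(1), 16))
        return char if char in UNRESERVED else match.group(0)
    return re.sub(r"%([0-9A-Fa-f]{2})", replace, uri)

# %7E ('~') is unreserved and gets decoded; %2F ('/') is reserved and stays.
print(unquote_unreserved("http://example.com/%7Euser%2Fpath"))
```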

requote_uri(uri)

Re-quote the given URI.

This function passes the given URI through an unquote/quote cycle to ensure that it is fully and consistently quoted.

address_in_network(ip, net)

Check whether an IP address belongs to a network subnet.

Example: returns True if ip = 192.168.1.1 and net = 192.168.1.0/24; returns False if ip = 192.168.1.1 and net = 192.168.100.0/24.
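The same check can be expressed with the standard-library ipaddress module, which handles the mask arithmetic (a modern equivalent, not the Requests implementation):

```python
import ipaddress

def address_in_network(ip, net):
    # Membership test: is the address inside the given network?
    return ipaddress.ip_address(ip) in ipaddress.ip_network(net, strict=False)

print(address_in_network("192.168.1.1", "192.168.1.0/24"))    # True
print(address_in_network("192.168.1.1", "192.168.100.0/24"))  # False
```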
dotted_netmask(mask)

Converts a mask from /xx format to xxx.xxx.xxx.xxx. Example: if mask is 24, the function returns 255.255.255.0.
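The conversion boils down to setting the top `mask` bits of a 32-bit integer and rendering it in dotted form; a sketch of that idea:

```python
import socket
import struct

def dotted_netmask(mask):
    # /24 -> top 24 bits set -> 0xFFFFFF00 -> "255.255.255.0"
    bits = 0xFFFFFFFF ^ ((1 << (32 - mask)) - 1)
    return socket.inet_ntoa(struct.pack(">I", bits))

print(dotted_netmask(24))  # 255.255.255.0
```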

is_ipv4_address(string_ip)
is_valid_cidr(string_network)

Very simple check of the CIDR format in the no_proxy variable.

should_bypass_proxies(url)

Returns whether we should bypass proxies or not.

get_environ_proxies(url)

Return a dict of environment proxies.

select_proxy(url, proxies)

Select a proxy for the url, if applicable.

Parameters:
  • url – The URL of the request
  • proxies – A dictionary of schemes or schemes and hosts to proxy URLs
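The matching can be sketched as a most-specific-first key lookup; the exact ordering below (`scheme://host`, then `scheme`, then `all://host`, then `all`) is a plausible reading of the behavior, not a guaranteed contract:

```python
from urllib.parse import urlparse

def select_proxy(url, proxies):
    # Try the most specific proxy key first, falling back to broader ones.
    parsed = urlparse(url)
    if parsed.hostname is None:
        return proxies.get(parsed.scheme, proxies.get("all"))
    candidates = (
        parsed.scheme + "://" + parsed.hostname,
        parsed.scheme,
        "all://" + parsed.hostname,
        "all",
    )
    for key in candidates:
        if key in proxies:
            return proxies[key]
    return None

proxies = {
    "http": "http://proxy:3128",
    "https://secure.example.com": "http://proxy2:3128",
}
print(select_proxy("https://secure.example.com/x", proxies))  # http://proxy2:3128
print(select_proxy("http://example.org/", proxies))           # http://proxy:3128
```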
default_user_agent(name="python-requests")

Return a string representing the default user agent.

default_headers()

parse_header_links(value)

Return a list of parsed link headers.

i.e. Link: <http:/…/front.jpeg>; rel=front; type="image/jpeg",<http://…/back.jpeg>; rel=back;type="image/jpeg"
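Parsing a Link header of that shape can be sketched as splitting on commas that precede a `<`, then peeling off `;`-separated parameters (a simplified illustration that ignores edge cases like commas inside quoted parameter values):

```python
import re

def parse_header_links(value):
    # Split '<url>; rel=next; type=...' segments into dicts (simplified).
    links = []
    for segment in re.split(r",\s*(?=<)", value.strip()):
        url, _, params = segment.partition(";")
        link = {"url": url.strip("<> ")}
        for param in params.split(";"):
            key, _, val = param.strip().partition("=")
            if key:
                link[key] = val.strip("\"'")
        links.append(link)
    return links

links = parse_header_links(
    '<http://example.com/p2>; rel="next", <http://example.com/p9>; rel="last"'
)
print(links)
```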

guess_json_utf(data)
prepend_scheme_if_needed(url, new_scheme)

Given a URL that may or may not have a scheme, prepend the given scheme. Does not replace a present scheme with the one provided as an argument.

get_auth_from_url(url)

Given a url with authentication components, extract them into a tuple of username, password.

to_native_string(string, encoding="ascii")

Given a string object, regardless of type, returns a representation of that string in the native string type, encoding and decoding where necessary. This assumes ASCII unless told otherwise.
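On Python 3 the native string type is str, so the behavior amounts to decoding bytes and passing str through; a minimal sketch under that assumption:

```python
def to_native_string(string, encoding="ascii"):
    # str is already native; bytes get decoded with the given encoding.
    if isinstance(string, str):
        return string
    return string.decode(encoding)

print(to_native_string(b"hello"))   # bytes in, str out
print(to_native_string("already"))  # str passes through unchanged
```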

urldefragauth(url)

Given a url, remove the fragment and the authentication part.
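A sketch of the transformation using urllib.parse: split the URL, drop any `user:pass@` prefix from the netloc, and reassemble without the fragment (a simplified take that assumes the URL already has a scheme):

```python
from urllib.parse import urlsplit, urlunsplit

def urldefragauth(url):
    # Remove the fragment and any user:pass@ credentials from the netloc.
    scheme, netloc, path, query, _fragment = urlsplit(url)
    netloc = netloc.rsplit("@", 1)[-1]
    return urlunsplit((scheme, netloc, path, query, ""))

print(urldefragauth("http://user:pass@example.com/path#section"))
# http://example.com/path
```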